Transferring files from an old dedicated server to a new one

Using Classic ASP (stop tutting), I need to build an application that transfers high-resolution photos from one server to another, around 360,000 files including the thumbnails. The application will be called via a Windows schedule and will run as a background process.

What is the best way to achieve this, keeping performance in mind? The last time I built a monster script like this it was transferring and converting database tables with over one million rows; it started really fast, but after 25,000 records it slowed to a crawl, and I want to avoid a repeat of that.

Obviously it will be a cross-domain transfer, so I was thinking about using an ASP/FTP component to grab a file, send it, and record its success in a DB table, one by one, so the script knows what it has done so far.

Is it best to process one file at a time and then refresh, so the script doesn't abuse the server's resources, or should I process 1,000 or more at a time? I want it to be as quick as possible without clogging up the server.

Any help/suggestions would be gratefully received.

3 Answers

BEST ANSWER

I think it is best to do one file at a time, because if the connection goes down for a brief period you don't lose track of the files you have already sent.

Even though you are using Classic ASP, you can take advantage of .NET for uploading the files, using its FTP client classes, and avoid purchasing/installing a third-party component. .NET is surely already installed on the server.

My process would look like this:

  1. Upload one file using FTP (better performance)
  2. If successful, call an ASP page that records the action in the remote DB
  3. If the upload fails, wait a second and retry, up to 3 times
  4. Proceed to the next file

If the process is clogging the server, you can put a brief pause between each upload, as in the sketch below.
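A minimal sketch of that loop as a scheduled .vbs script. The FTP_BASE and LOG_URL values, the log-upload.asp page, and the D:\photos folder are all hypothetical placeholders; the upload shells out to PowerShell and .NET's System.Net.WebClient, so no third-party FTP component is needed:

    Option Explicit

    ' Placeholders - substitute your own server, credentials and paths.
    Const FTP_BASE  = "ftp://user:pass@newserver.example.com/photos/"
    Const LOG_URL   = "http://newserver.example.com/log-upload.asp?file="
    Const MAX_TRIES = 3

    Dim fso, sh, f, attempt, ok
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set sh  = CreateObject("WScript.Shell")

    ' Upload one file via .NET's WebClient, driven through PowerShell.
    Function UploadFile(localPath, remoteUrl)
        Dim rc
        rc = sh.Run("powershell -NoProfile -Command ""(New-Object " & _
             "System.Net.WebClient).UploadFile('" & remoteUrl & "','" & _
             localPath & "')""", 0, True)   ' hidden window, wait for exit
        UploadFile = (rc = 0)               ' non-zero exit means the upload failed
    End Function

    ' Call an ASP page on the remote server that inserts the DB row.
    ' (A real script should URL-encode the file name first.)
    Sub RecordSuccess(fileName)
        Dim http
        Set http = CreateObject("MSXML2.ServerXMLHTTP.6.0")
        http.Open "GET", LOG_URL & fileName, False
        http.Send
    End Sub

    For Each f In fso.GetFolder("D:\photos").Files
        ok = False
        For attempt = 1 To MAX_TRIES
            If UploadFile(f.Path, FTP_BASE & f.Name) Then
                ok = True
                Exit For
            End If
            WScript.Sleep 1000    ' wait a second before retrying
        Next
        If ok Then RecordSuccess f.Name
        WScript.Sleep 200         ' brief pause between uploads
    Next

Running this under cscript.exe from the scheduled task keeps it off the IIS worker process entirely, so a long run can't tie up the web server.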

ANSWER 2
  • Upload all the files via FTP
  • Create a CSV file with all your data
  • Pull it into the DB in one go

The network handshaking overhead of 360,000 individual transactions would be the bottleneck.
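A sketch of the final step, assuming SQL Server and a hypothetical dbo.PhotoTransfers table and CSV path. BULK INSERT loads the whole log in one statement instead of 360,000 individual INSERTs; note the file path is resolved on the SQL Server machine, so it must be visible from there:

    ' Connection string, table name and file path are hypothetical.
    Dim conn
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=dbserver;" & _
              "Initial Catalog=Photos;Integrated Security=SSPI;"
    conn.Execute "BULK INSERT dbo.PhotoTransfers " & _
                 "FROM 'D:\transfer\uploaded.csv' " & _
                 "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')"
    conn.Close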

ANSWER 3

I have something like that running in Classic ASP; it handles tens of thousands of images without problems. On the server that houses the images I run a VBScript that, for each image:

  1. Makes a text file with the metadata
  2. Makes a thumbnail and a mid-sized copy of the image on the second (web) server

The script runs continuously, checking per folder and file whether the files are present on the webserver and creating them if not, so there is no need for a DB. Between every check it sleeps a second; that way the load on the server is only about 2%. I use iPhoto in command-line mode to extract the metadata and images, but you could use a library for that. These three files are stored on the webserver in a copy of the folder structure from the first server, but without the full-sized images.
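A sketch of that presence-check loop, with hypothetical paths, and ImageMagick's convert standing in as the thumbnail tool (the answer itself uses iPhoto; the metadata step is omitted here):

    Option Explicit

    ' Placeholder paths; swap in your own folders and image tool.
    Dim fso, sh, img, thumbPath
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set sh  = CreateObject("WScript.Shell")

    Do  ' runs continuously, as described above
        For Each img In fso.GetFolder("D:\images\full").Files
            thumbPath = "\\webserver\images\thumbs\" & img.Name
            If Not fso.FileExists(thumbPath) Then
                ' Create the missing thumbnail on the webserver share.
                sh.Run "convert """ & img.Path & """ -thumbnail 150x150 """ & _
                       thumbPath & """", 0, True
            End If
            WScript.Sleep 1000   ' one-second pause keeps the load low
        Next
    Loop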

On the webserver you only need to be able to browse the thumbnails and display the metadata and mid-sized images. If users need the full-size image, they click the mid-sized one, whose URL points to the file on the first server.