SFTP Speed Limit (being a good neighbor on shared server)

@Ogunnowo487

Posted in: #Ftp #Transfer

I've been developing my website on a local server and I am ready to upload to production.

The initial upload will be in the 4 GB range. I don't want to slam the server, and I think a FileZilla speed-limit setting will help prevent that.

The default speed limits are Download: 100 KiB/s and Upload: 20 KiB/s. Are those workable? Would 100 KiB/s for one upload event be OK?
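For a rough sense of what those limits mean in practice, transfer time is just size divided by rate. A quick sketch, assuming the 4 GiB figure above:

```python
# Rough transfer-time estimate: time = size / rate.
# The 4 GiB size is taken from the question; adjust to your actual site.

def transfer_hours(size_gib: float, rate_kib_per_s: float) -> float:
    size_kib = size_gib * 1024 * 1024   # GiB -> KiB
    return size_kib / rate_kib_per_s / 3600  # seconds -> hours

print(f"{transfer_hours(4, 20):.1f} h at the 20 KiB/s default")  # 58.3 h
print(f"{transfer_hours(4, 100):.1f} h at 100 KiB/s")            # 11.7 h
```

So the 20 KiB/s default would take well over two days, while 100 KiB/s makes it an overnight job.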

Also... should I:

1. Zip the website as a whole and transfer it as one monolithic zip file (unzipping on the server)? or...
2. Just SFTP the local directory using the compression in FileZilla?


Thank you.

Post Mortem: For the initial upload, I ended up using the file manager in the hosting company's cPanel to select and upload a monolithic zip file of the site. I figure: how much trouble can I get into using their tools and their defaults? It uploaded quickly and unzipped with maybe just a momentary spike in CPU use. This would not be a good option for daily use, but for a one-time event, I guess it was OK.

For long-term synchronization between my local development site and the production site on the server, I'll try FileZilla and report back.


3 Comments

@Barnes591

Zipping files and directories can be very helpful, but make sure you compare the results when you do so.


If the contents are mostly binaries (images, video, and other already-compressed formats), the ZIP file can sometimes exceed the size of the original files. Mostly text-like files (HTML, JavaScript, etc.) can shrink by quite a lot, so there it is well worth taking that step.
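That comparison is easy to script. A minimal sketch using Python's stdlib `zipfile`, with synthetic data standing in for real site files:

```python
# Compare how well text-like vs. binary-like data compresses in a ZIP.
# Synthetic stand-ins: repetitive HTML-like text vs. random bytes
# (random data behaves like already-compressed JPEG/PNG payloads).
import io
import os
import zipfile

def zipped_size(name: str, data: bytes) -> int:
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name, data)
    return len(buf.getvalue())

text = b"<p>Hello, world!</p>\n" * 10_000  # 210,000 bytes, very repetitive
binary = os.urandom(210_000)               # 210,000 incompressible bytes

print(zipped_size("page.html", text) / len(text))     # well under 1.0
print(zipped_size("image.bin", binary) / len(binary))  # at or slightly above 1.0
```

The text shrinks dramatically, while the "binary" ends up the same size or slightly larger once ZIP headers are added.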




FileZilla has the speed limiting capability, available under the Transfer menu.

However, there are issues with it. Most have to do with buffering.

FileZilla may indeed respect those settings, but if the network speed fluctuates, you can easily end up in a situation where your speed-limited FileZilla has filled a transfer buffer while the network or server interface is busy.

Then, when the network opens up more bandwidth, there is no speed limit on how fast that data actually moves through the subnet: the stack just blasts data to unload the buffers as fast as it can. This is crucial to understand.
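The burst effect is easy to see in a toy model. This is not FileZilla's actual algorithm, just an illustration of a naive limiter that only tracks the average rate: after a stall, it has "credit" saved up and releases it all at once.

```python
# Toy model of a naive average-rate limiter (illustration only).
# It permits sending whenever total_sent <= rate * elapsed, so a
# stall builds up credit that is later released as a burst.

class AverageRateLimiter:
    def __init__(self, rate_bytes_per_s: float, start: float):
        self.rate = rate_bytes_per_s
        self.start = start
        self.sent = 0

    def allowance(self, now: float) -> int:
        # Bytes we may still send while staying at the average rate.
        return max(0, int(self.rate * (now - self.start)) - self.sent)

    def send(self, n: int) -> None:
        self.sent += n

lim = AverageRateLimiter(rate_bytes_per_s=100_000, start=0.0)
lim.send(lim.allowance(1.0))  # one steady second: 100 kB sent
# Network stalls from t=1 to t=6: nothing moves, credit accumulates.
burst = lim.allowance(6.0)
print(burst)  # 500000 -> released in one burst, 5x the intended rate
```

A limiter that caps the burst size (a token bucket with a small bucket) avoids this, but only if the buffers downstream of it are also small.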

In other words, you are doing everything you can to be a good neighbor, within your capabilities. Trust me, your network administrator respects that.

Sometimes, when this is commonly happening in-house, the administrator's best approach is to make sure both systems are on a network switch with high backbone capacity, so it can handle the port-to-port transfers without affecting the rest of the network. That is nice and tidy.

However, if the transfers go through the internet gateway, or through the same server interface that others are using, you may find there is no simple way to avoid excessive contention: you can max out an ISP's transfer cap or saturate the interface on the server.

Lately, the ISP I have been using simply enforces the cap all by itself. So you are not causing them any headache.

But the problem is your neighbors on the rest of the network who need to share that gateway, and anybody using the production server on the same interface you are pushing large updates through while it is online.

Usually, I suggest that scheduling is the best solution: 4 a.m. transfers via sftp triggered by cron, for instance.
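If you would rather drive the schedule from a script than from crontab, the core of it is just computing the next 4 a.m. slot. A sketch (the hour is the only input; the sftp invocation itself is left out):

```python
# Compute the next 4 a.m. run time for a scheduled transfer.
# Equivalent in spirit to the crontab schedule "0 4 * * *".
from datetime import datetime, timedelta

def next_run(now: datetime, hour: int = 4) -> datetime:
    run = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if run <= now:
        run += timedelta(days=1)  # today's slot already passed
    return run

print(next_run(datetime(2023, 5, 1, 10, 30)))  # 2023-05-02 04:00:00
print(next_run(datetime(2023, 5, 1, 2, 15)))   # 2023-05-01 04:00:00
```

With cron itself, the line would be something like `0 4 * * * sftp -b batch.txt user@example.com`, where `batch.txt` lists the put commands and the host is of course a placeholder.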

In all cases, make sure you work closely with your administrator. Yes, they are busy, but the last thing they want is unexplained stoppages that trigger help calls. Trust me on this one. Short supervised tests and careful evaluation of the approach you follow will make for a happier experience.


@Alves908

As for the zip file, I think it depends on several factors. I know that at one point (not that long ago) WinZip had a practical limit not written into the code: it would fail when I tried to zip roughly 287,000 small files in 4 directories. It may be that a newer version works better. If you are using gzip, that may work best. I prefer to transfer larger files, but then again, if you zip up the whole install you still may have trouble. Here is why.

It depends upon the FTP client. Some FTP clients can resume a file transfer if it fails: the client knows how much has been transferred and resumes from where it failed. Some cannot, even when they claim they can. As well, one FTP client would never transfer the roughly 287,000 files (without zipping) no matter what I did. It failed somehow every time, and it was a well-known, high-quality FTP client. Very frustrating.

I like the idea of the speed limit and that you want to be nice to your neighbors. You are to be commended for that! I would suggest it if you do not mind the slower transfer rate.

My personal preference is to use a single zip file, or several. Zipping and unzipping all of the files may take time; I do not fully know your situation. Transfers of larger zip files actually go faster because there is less per-file work for the client and the data is compressed. I would try to zip the files and transfer them as a single file. But do not be surprised if there are problems: it may be that transferring individual files, though slower, works better.

Again, whether you have issues depends upon the zip software and the FTP client. It may be that they will work just fine. I would say try it. I may have had issues, but you may not. I was surprised that I had issues at all, since I was using well-known clients.

Also keep in mind that unzipping a large file may pound the CPU and I/O subsystem for a period, though these days that is usually a much faster process.

BTW- I was looking for a new FTP client. Please let us know if this works well for you.


@Murray432

Most shared servers I've worked with will temporarily throttle your upload when you exceed their allowed upload threshold. Option #1 (the monolithic zip) is the best because it requires fewer network connections to process your files. A server spends a lot of resources opening and closing connections, and lots of network connections typically bog down a cheap server.

Using #1 allows one connection to handle all the files. The server's processor can then unzip the file without heavily impacting the server's ability to serve end users.
