
How can I tar.gz a huge directory on a shared host? The process keeps getting killed

@Bryan171

Posted in: #Gzip #Migration #WebHosting

I'm working on moving a site to a new host for someone and I'm having trouble with their media.

They have a directory with thousands and thousands of images and videos. I attempted to zip this directory using the following:

tar -zcvf media.tar.gz path/to/directory

But it gets about 3/4 of the way through and then stops with a message that just says


killed


How can I compress this directory? Should I just compress it in pieces? This site is hosted on Hostgator.





4 Comments


 

@Chiappetta492

A better solution might be scp (secure copy), run directly from the old server to the new one:

scp -pr usernameOldServer@servername.com:/home/location/to/document_root/* /home/location/to/place/files





Or, if you want to stick with tar:

The other answers explain why the process gets killed, so I'll skip that part.

You are running into trouble because you run out of time, memory, or disk space. Right now you archive all the images and movies in one go; if you split the job into smaller groups, each step needs far fewer resources. Worst case, you end up with a handful of separate steps.

Create a tar archive and append files to it group by group:

$ tar rvf archive_name.tar newfile


Combine this with a sleep between groups to take the load off the server, and it might just do the trick without getting killed.
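
A minimal sketch of that append-in-groups idea (the batch size, sleep length, and paths are my own assumptions, so adjust them for your host):

find path/to/directory -type f -print0 |
  xargs -0 -n 500 sh -c 'tar rvf media.tar "$@" && sleep 5' _

# compress the finished archive as a separate, cheaper step
# (you cannot append to an already-compressed .tar.gz)
gzip media.tar

The first run of tar rvf creates media.tar if it does not exist yet; every later batch just appends to it.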



 

@Annie201

You can group the data and gzip it in smaller chunks: divide the whole set into 20 or 30 sections, and use a server-side import rather than zipping everything at once.
You could also prepare a script that lists the files and folders on the old and new servers (for example with curl) and imports them all; that tends to be faster and less error-prone. Keep a log file for verification.
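
A rough sketch of that chunking idea, assuming GNU tar; the chunk size and the names filelist.txt, chunk_* and media_part_N.tar.gz are made up for illustration:

find path/to/directory -type f > filelist.txt
split -l 2000 filelist.txt chunk_              # gives a few dozen sections, depending on file count

n=0
for list in chunk_*; do
  n=$((n + 1))
  tar -czf "media_part_$n.tar.gz" -T "$list"   # one small archive per section
  sleep 5                                      # go easy on the shared host
done

Each archive stays small, so no single tar run holds on to resources long enough to get killed.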



 

@Harper822

If the process is killed partway through, then you either ran out of memory or ran out of disk space. I'm going to assume disk space. When compressing all of your data, expect to need roughly as much free disk space as the total size of the data being compressed.

Your best bet if you want compression is to compress in pieces. But if you're transferring from server to server and have enough bandwidth available (so you're not billed extra), do a direct copy between the two servers instead; that puts less strain on both systems, so your old, still-running server won't slow down as much.

Compression will slow things down somewhat, because the server has to read and process the data in order to compress it, and all those calculations take time. The advantage, though, is that less data is sent over the wire, which may mean lower bandwidth costs.
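
If both sides give you ssh access, another trick (the host name and paths below are just placeholders) is to stream the tar output straight to the new server, so no large archive file is ever written on the old host and the data is still compressed on the wire:

tar -czf - path/to/directory | ssh user@newserver.example 'tar -xzf - -C /home/newsite/media'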



 

@BetL925

It seems to me that the archiving procedure lets the tarball grow, and once the tarball (together with all your other data) becomes bigger than the space included in your hosting plan, the server kills the process. In that case you would have to download all of the data first and archive it locally.

Could that be it?
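
That theory is easy to check before retrying; these are standard commands and the path is just an example:

du -sh path/to/directory   # how big the media directory is
df -h .                    # how much space is left in your hosting account

If the directory size is larger than the free space shown, the tarball can never fit next to the originals.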


