Hi team,
Actually, I used the command tar -zcvf XXX.tar.gz <source_path>. The directory size is 10 GB, but the backup took more than 8 hours on Linux EL5. Could you please share a faster command, with output and a log for verification?
I guess the question is not clear. It appears that you are looking for a backup tool that can quickly back up 10 GB. I can list a few tools below. Since you have already tried tar, I am not listing it.
1) dump
2) cpio
3) rsync
One important feature to consider is reliability over speed. The time it takes to perform a backup within the same DC will differ from performing the same backup to a remote DC. If it is just a simple backup of 10 GB, then I would prefer rsync, as it is easy to back up and restore.
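As a rough sketch (the source, destination, and log paths below are just placeholders, and I am assuming a local-to-local copy), something like this copies the tree and writes a log you can check afterwards:

    rsync -a --progress --log-file=/var/log/rsync-backup.log /path/to/source/ /path/to/destination/

-a preserves permissions, ownership, and timestamps, and on repeat runs rsync only transfers files that changed, which is where most of the speed gain comes from. If your rsync build is older and lacks --log-file, redirecting the output to a file works as well.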
Hi @Jayadev ,
I am taking the backup on the same box/host, and free space is available, but the tar backup takes more than 8 hours and does not complete.
I am not sure my question was understood.
Actually, I have not installed any extra tools.
Kindly provide a command for a fast backup that I can monitor, with logs.
If anything is still unclear, please share your concerns.
Thanks for trying to provide a solution.
Is the data you are backing up static, that is, it doesn't change much, or does it change frequently? How often do you need to make backups? Can you connect an external drive to the host you are backing up? Backing up to the same host will not protect you from loss: if something happens to that host, both your data and your backup will be gone. Rsync should solve the speed issue.
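For instance, assuming the external drive were mounted at /mnt/backupdrive and the data lives under /data (both paths are made up for illustration):

    rsync -av /data/ /mnt/backupdrive/data/ > /root/rsync-backup.log 2>&1

The trailing slash on the source copies the contents of /data rather than the directory itself, and the redirect keeps the file list as a log you can review.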
If you care more about speed than size, maybe stop telling tar to compress the archive?
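For example, something like this (the archive and log paths are only placeholders) writes a plain uncompressed archive and keeps the file list as a log:

    time tar -cvf /backup/XXX.tar <source_path> > /root/tar-backup.log 2>&1

Dropping -z avoids the single-threaded gzip step that is usually the bottleneck; the archive will be roughly the size of the source, and you can compress it later if needed. The time keyword prints how long the run took.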