Re: too many files

From: Aaron Morris (aaronmorris@MINDSPRING.COM)
Date: Tue Jul 29 2003 - 21:34:04 EDT


If your network link is sufficiently fast, try not compressing
everything (or tell the compressor to work less aggressively --
classic compress takes a bit limit via -b, e.g. compress -b 12, and
gzip takes a level, e.g. gzip -1). Depending on the link speed and the
CPU power, the majority of your time might be spent on compression.
Also, the find command can be a bottleneck, since the backup command
must rely on find for its list of files. Using something like tar or
pax directly could speed up the process.

e.g.:
tar cf - /indirectory | rsh remotehost "tar xf -"

or if you still need compression:
tar cf - /indirectory | compress -c | rsh remotehost "uncompress -c | tar xf -"
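The same tar pipeline can be exercised locally, with no rsh involved, by swapping the remote shell for a subshell -- a sketch using made-up /tmp paths:

```shell
# Stream an archive out of one directory and unpack it into another
# without writing an intermediate archive file.
mkdir -p /tmp/tarsrc /tmp/tardst
echo "payload" > /tmp/tarsrc/file1
(cd /tmp/tarsrc && tar cf - .) | (cd /tmp/tardst && tar xf -)
# the archive stream recreates file1 under /tmp/tardst
ls /tmp/tardst
```

Piping tar-to-tar like this avoids the per-file overhead of find and keeps the data moving as one continuous stream.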

Also, if you have a reliable link, take a look at rsync
http://rsync.samba.org

Nguyen, Joseph wrote:
> I have a filesystem that contains 900,000+ files in one directory. I ran
> the following command to copy files to another host; it ran for a couple
> of days and then stopped. Even just running the find command takes a long time.
>
> find /indirectory -print | backup -iqf- | compress -c | rsh remotehost
> "(uncompress -c | restore -xf- )"
>
> Do you know of any other command that can speed up the copy? We tried
> backing up to tape and restoring, and that also takes days.
>
> Joseph
>

--
Aaron W Morris <aaronmorris@mindspring.com> (decep)


This archive was generated by hypermail 2.1.7 : Wed Apr 09 2008 - 22:17:04 EDT