Re: rsync or rdist

Herta Van den Eynde wrote:
On 10/03/2008, Rodrick Brown <rbrown@xxxxxxxxxxxxxxx> wrote:
tar cvfp - . | ssh -c blowfish remote '(cd /storage/archive; tar xvf - )'

-----Original Message-----
From: redhat-list-bounces@xxxxxxxxxx [mailto:
redhat-list-bounces@xxxxxxxxxx] On Behalf Of Mad Unix
Sent: Monday, March 10, 2008 9:29 AM
To: General Red Hat Linux discussion list
Subject: Re: rsync or rdist

Anyone have a script to do the remote transfer ...

On Mon, Mar 10, 2008 at 3:17 PM, Herta Van den Eynde <
herta.vandeneynde@xxxxxxxxx> wrote:

On 10/03/2008, Mad Unix <madunix@xxxxxxxxx> wrote:
I need a script to transfer archive log files from Production site
Server1 to DR site Server2 on the same subnet.
I want to sync the files between /arc and /storage/archive on both
servers ....

AFAIK, rdist copies entire files. rsync only copies the blocks that have changed.

Note also that you can run rsync through ssh for a more secure transfer.

Kind regards,


"Life on Earth may be expensive,
but it comes with a free ride around the Sun."
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe


Looks like a complicated way to do what a simple 'scp -pr source target'
will accomplish. Or am I missing something?

Rodrick does have a point, though: if you simply want to copy new files from
server A to server B, a simple copy will be faster than rsync, as you don't
need the comparison phase. But scp will be faster than the tar - transfer -
untar pipeline.

Kind regards,


Well, if scp inherits the same limitation as rcp -r, then it won't take links with it.
tar picks up all links, but does not follow them.

I would always use a variation of the tar command given above for complete directory copies from one system to another, but I would add the "B" modifier to the example given above to ensure that tar re-blocks for pipes/networks.
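A sketch of what that looks like, assuming the paths from earlier in the thread ("remote" stands in for the DR host; the local pipe below just demonstrates the same mechanics without a network):

```shell
# Remote copy with the B modifier so tar re-blocks when reading from the pipe:
#   tar cvfpB - . | ssh -c blowfish remote '(cd /storage/archive; tar xvfpB -)'
# Same pipeline demonstrated locally with hypothetical demo directories:
mkdir -p /tmp/arc_demo /tmp/archive_demo
echo "log entry" > /tmp/arc_demo/arch_001.log
ln -sf arch_001.log /tmp/arc_demo/latest.log   # tar copies the symlink itself, it does not follow it
(cd /tmp/arc_demo && tar cfpB - .) | (cd /tmp/archive_demo && tar xfpB -)
```

Note the p on the extracting side as well, so permissions survive the copy.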

However, rsync would be a much better option if, say, a DR host needs to be kept in sync with production: rsync can be configured to do incremental updates, i.e. only copy changes, and to delete files at the destination when they are deleted at the source, maintaining a complete mirror of the two directories across a network.
It could be cron'd to run every few minutes.

regards peter

