At 14:13 -0400 2/6/11, Robert wrote:
Thanks all for the above advice. After experimenting for a few days,
I’ve learned a bunch about cron and found partial success. The
following command does work, but has some limitations:
cp -r /server/local/home/source/* /server/local/home/target
If I drop something new into “source”, the cron job above copies it
over to “target”, as it should. Great. However, if something is
deleted from “source”, I’ve (once) received a message from the cron
daemon stating
cp: omitting directory `/server/local/home/source/TRIAL'
where “TRIAL” was a folder deleted in “source”. This cron job
didn’t delete “TRIAL” from “target”, hence the message (I
guess). If a single file (i.e. not a folder) is deleted from
“source”, I don’t receive a cron daemon message (and the file, too,
continues to appear in the target folder). Actually the daemon seems
inconsistent, though that’s an issue I’m not really worried about.
What I’m looking for is a command line that creates an identical
copy of the contents of the first folder into the second. From what
I’ve read, one post says “YOU ALMOST NEVER WANT -R or -r!” because
it doesn’t preserve attributes. Rather, one should use the -a
(archive) option to preserve file attributes. I replaced -r with -a
in the command line above and it appears to operate in the same
manner, and -a does indeed preserve the file attributes.
I doubt many of the “source” files will ever be deleted, I just
don’t want a bunch of junk to be reloaded if a restoration is
required.
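For reference, here is a minimal sketch of such a crontab entry with -a in place of -r, plus a small self-contained demonstration using temporary directories. The schedule and the use of “source/.” rather than “source/*” are illustrative assumptions, not the poster’s actual setup:

```shell
# Illustrative crontab line (the 02:00 nightly schedule is an assumption):
#   0 2 * * * cp -a /server/local/home/source/. /server/local/home/target/
# "source/." also picks up dot-files, which "source/*" would miss, and -a
# preserves permissions, timestamps, and symlinks. Like -r, however, -a
# never removes files already present in the target.

# Self-contained demonstration with temporary directories:
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/file.txt"
mkdir "$src/TRIAL"
cp -a "$src/." "$dst/"
ls "$dst"
```

Deleting "$src/file.txt" and re-running the cp would, as described above, leave the copy in "$dst" untouched.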
Seeing as you don’t need the backup to be a usable hierarchy of
folders, you could use the ‘tar’ command. This creates a single file
containing an entire tree of folders and contents. tar is a very old
Unix command. Its name stands for Tape ARchiver, from the days when
the single file was just a stream of bits written to a mag tape.
$command = 'foldername="path of the folder to backup"; ';
$command .= 'cd "$foldername" && tar cf '.$_SERVER['DOCUMENT_ROOT'].'/…/backup.tar .;';
suitably executed will:
- set a shell variable called ‘foldername’ to whatever value you give
- change directory to that place, and, if the change succeeds,
- create a tar archive, putting it into a file called ‘backup.tar’
situated one level above the folder that the webserver serves out.
The contents of the archive will be whatever is currently in ‘.’
(dot) - the folder it just changed to. No problem with files that
have been deleted.
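Unwrapped from the PHP string, the commands amount to the following. The two temporary directories are placeholders standing in for the folder to back up and the directory above DOCUMENT_ROOT:

```shell
# Placeholder directories stand in for the real paths:
foldername=$(mktemp -d)     # the folder to back up
backupdir=$(mktemp -d)      # where backup.tar will live
echo "some data" > "$foldername/notes.txt"

# The command the PHP string assembles: cd in, then archive '.' (dot):
cd "$foldername" && tar cf "$backupdir/backup.tar" .

# 'tar tf' lists the archive's contents without extracting:
tar tf "$backupdir/backup.tar"
```

Because the archive is rebuilt from scratch each run, files deleted from the source simply stop appearing in it, which is the behaviour asked for above.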
To restore just use the same command again, but with ‘tar cf’
replaced by ‘tar xf’ - c = create archive, x = extract from archive.
To restore individual files it’s a bit more complicated, as you need
a field to be submitted to tell it which file.
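Once that file name is known, the extraction itself is straightforward: tar extracts only the members named on its command line. A sketch, with invented directory and file names:

```shell
# Build a small archive containing two files (names are illustrative):
work=$(mktemp -d)
restore=$(mktemp -d)
mkdir "$work/data"
echo one > "$work/data/one.txt"
echo two > "$work/data/two.txt"
( cd "$work" && tar cf backup.tar data )

# Extract just data/one.txt; data/two.txt stays out of the restore tree:
( cd "$restore" && tar xf "$work/backup.tar" data/one.txt )
ls "$restore/data"
```

The member name must match how it was stored in the archive (‘tar tf’ shows the stored names).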
The exact details are untested, but if you are interested in this
approach I can test it. I did have it working, but as I mentioned in
an earlier post the ISP removed it.
David
--
David Ledger - Freelance Unix Sysadmin in the UK.
HP-UX specialist of hpUG technical user group (www.hpug.org.uk)
email@hidden
www.ivdcs.co.uk
offtopic mailing list
email@hidden
Update your subscriptions at:
http://freewaytalk.net/person/options