Server-Side Copying

Is there a way to make a copy of a file or folder on a hosted server? That is, I have a large folder and would like to make a backup copy on the server itself as I have limited download and upload speeds.

I’ll download a backup every week or so to have an off-server copy. Having a duplicate that I can slide over to the errant folder would be handy. Are FTP file transfer programs such as Fetch or Transmit capable of such a thing?

Thanks,
Robert


offtopic mailing list
email@hidden
Update your subscriptions at:
http://freewaytalk.net/person/options

If you want to copy a file on the server, you need to ask the server
to do it through the command shell. If you use an application like
Transmit or Fetch to do this, the file will be downloaded to your
computer, then uploaded again into a different folder on the server,
making for a round trip and double-the-file-size bandwidth charges.
Depending on your hosting provider, and how they have configured their
server, using the shell can range from trivially easy to impossible.

Here’s the trivially easy way, as done in one app I’m working on right
now:

// PHP: FILES_BASE is a constant I defined in my application;
// it points to the folder on the server where my application saves things.
shell_exec('cp -R "' . FILES_BASE . '/' . $folder_name . '" "' . FILES_BASE . '/backup/' . $folder_name . '"');

Note the double-quotes around both paths – since I can’t guarantee
that there won’t be spaces in the $folder_name variable, I don’t like
to take any chances with confusing the shell.
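
A scratch-folder demonstration of that risk (every path here is invented for the demo):

```shell
# A folder name containing a space: without the double-quotes the shell
# would split it into two arguments and cp would copy the wrong thing.
rm -rf /tmp/quotetest
mkdir -p "/tmp/quotetest/my folder"
echo x > "/tmp/quotetest/my folder/f.txt"
cp -R "/tmp/quotetest/my folder" "/tmp/quotetest/backup folder"
ls "/tmp/quotetest/backup folder"
```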

On all the servers I have used, the shell functions require that you
use absolute paths for everything, so I define FILES_BASE as an
absolute path, like /home/username/folder/ or whatever the real path
is to your files folder. A quick shortcut to this is to use
dirname(__FILE__) to have PHP work out the real path for you. If you
need to jump upward in the filesystem, multiple layers of dirname will
do that, so dirname(dirname(__FILE__)) would tell you the
grandparent folder of the current script file. So you could jump out
of the Web server’s root like this: define('FILES_BASE', dirname(dirname(__FILE__)) . '/files');
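
For anyone doing this from a cron shell script instead of PHP, the shell's dirname peels one path level per call in the same way; a small sketch with an invented path:

```shell
# dirname strips one path component per call, like PHP's dirname():
parent=$(dirname /home/username/site/scripts/backup.php)   # /home/username/site/scripts
grandparent=$(dirname "$parent")                           # /home/username/site
echo "$grandparent/files"                                  # a files folder beside the site
```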

Walter

On May 31, 2011, at 2:51 PM, Robert wrote:




Walter,

Thanks — er, sort-of. I’m not a programmer and while I know what PHP is, I’m looking at your code going “huh?”

Since my post, I’ve discovered the world of Cron Tabs/Jobs/Commands. While the timing is easy to figure out, I’m struggling with the UNIX executable code. Any suggestions on resource sites that could step me through this past the “% cp” to get a folder found in one directory path to be copied into another folder in another path?

Robert



At 14:51 -0400 31/5/11, Robert wrote:


It’s possible to make backup copies on the server with a PHP script,
subject to there being sufficient space. However, the hosting company
may not like it. My main one doesn’t seem to.

Last week I used such a script to back up a site and move the new
version’s files over the existing ones in a very short time. The
script then oddly disappeared. The hosting company denied knowledge
of it. Nothing against the rules was done. Other scripts within the
same folder were untouched. The only problem for the hosting company
that I can see is a brief spike in disc I/O that does not fit normal
web server patterns.

Maybe some of the ISP participants here could expand on whether they
would object to their customers doing such things.

David


David Ledger - Freelance Unix Sysadmin in the UK.
HP-UX specialist of hpUG technical user group (www.hpug.org.uk)
email@hidden
www.ivdcs.co.uk



If you look at the innermost line of my code, that was a simple cp
command (cp is Unix for copy) that included some interpolated values
from PHP. If you are always copying the same folder, you could hard-
code the entire thing.

So in your cron, you could add a command something like this:

cp -R "/path/to/folder/one" "/path/to/folder/two"

…and you should end up with a complete copy of the contents of
folder one inside folder two. You’ll have to experiment with this on
your server. What I recommend you do is create a folder, populate it
with a few small files and subfolders, then set up your cron job to
run once an hour, and launch it. If it works as you expect, you can
stop that job, modify it to manage a much larger folder, and set it to
run at a much more reasonable interval. Just be sure that you are
using the right paths to your files/folders. While it’s extremely
difficult to screw up someone else’s files on a Unix server, it is
trivially easy (and entirely un-recoverable) to screw up your own.
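
Here is a disposable version of that experiment, with /tmp paths standing in for the real ones:

```shell
# Build a tiny source tree, then run the same cp the cron job would run.
# Note: because /tmp/cptest/two does not exist yet, cp -R creates it as
# a copy of "one"; if "two" already existed, "one" would land inside it.
rm -rf /tmp/cptest
mkdir -p /tmp/cptest/one/sub
echo hello > /tmp/cptest/one/sub/file.txt
cp -R /tmp/cptest/one /tmp/cptest/two
ls /tmp/cptest/two/sub
```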

Walter

On May 31, 2011, at 4:58 PM, Robert wrote:






I believe ForkLift from http://binarynights.com/ does what you ask.
The front page suggests the servers need to be ‘FXP-enabled’; I don’t
know whether that is common or not, but you can always download a demo
to test. Hope it helps.

JG



Thanks all for the above advice. After experimenting for a few days, I’ve learned a bunch about CRON and found partial success. The following command does work, but has some limitations:

 cp -r /server/local/home/source/* /server/local/home/target

If I drop something new into “source” the Cron Job above copies it over to “target” as it should. Great. However, if something is deleted from “source” I’ve (once) received a message from the Cron Daemon stating

   cp: omitting directory `/server/local/home/source/TRIAL'

… where “TRIAL” was a folder deleted in “source”. This Cron Job command didn’t delete “TRIAL” from “target”, hence the message (I guess). If a single file (i.e. not a folder) is deleted from “source”, I don’t receive a Cron Daemon message (and it too continues to appear in the target folder). Actually the Daemon seems inconsistent, though that’s an issue I’m not really worried about.

What I’m looking for is a command line that creates an identical copy of the contents of the first folder into the second. From what I’ve read, one post says “YOU ALMOST NEVER WANT -R or -r!” because those flags don’t preserve attributes; one should use the -a (archive) flag instead. I replaced -r with -a in the line above and it appears to operate in the same manner, though -a does preserve the file attributes.
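
A quick throwaway-folder check of the -a behaviour (all paths invented for the demo):

```shell
# Give the source file an old timestamp, copy with -a, and the copy
# keeps the same modification time.
rm -rf /tmp/attrtest
mkdir -p /tmp/attrtest/src
touch -t 202001010101 /tmp/attrtest/src/old.txt
cp -a /tmp/attrtest/src /tmp/attrtest/dst
ls -l /tmp/attrtest/dst
```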

I doubt many of the “source” files will ever be deleted, I just don’t want a bunch of junk to be reloaded if a restoration is required.

Any thoughts?

Robert


offtopic mailing list
email@hidden
Update your subscriptions at:
http://freewaytalk.net/person/options

Take a look at rsync; it can do what you’re after, and it has the
added benefit that it only sends the “difference” after the first
copy. It can be very fast on a second, third, whateverth copy, because
only the things that have changed will be sent down the wire.
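
A sketch of what that looks like, assuming rsync is installed on the host and using made-up /tmp paths. The --delete flag handles the removed-files case from earlier in the thread, and the trailing slash on the source means “contents of”:

```shell
# Mirror source/ into target/: -a preserves attributes, --delete removes
# anything in target that no longer exists in source.
rm -rf /tmp/rsynctest
mkdir -p /tmp/rsynctest/source/keep
echo data > /tmp/rsynctest/source/keep/a.txt
rsync -a --delete /tmp/rsynctest/source/ /tmp/rsynctest/target/
ls /tmp/rsynctest/target/keep
```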

Walter

On Jun 2, 2011, at 2:13 PM, Robert wrote:


At 14:13 -0400 2/6/11, Robert wrote:


Since you don’t need the backup to be a usable hierarchy of
folders, you could use the ‘tar’ command. This creates a single file
containing an entire tree of folders and contents. tar is a very old
Unix command. Its name stands for Tape ARchiver, from the days when
the single file was just a stream of bits written to a mag tape.

$command = 'foldername="path of the folder to backup"; ';
$command .= 'cd "$foldername" && tar cf ' . $_SERVER['DOCUMENT_ROOT'] . '/../backup.tar .;';

suitably executed will:
set a Shell variable called ‘foldername’ to whatever value you give
change directory to that place, and, if the change succeeds, perform
a tar archive putting the archive into a file called ‘backup.tar’
situated one level above the folder that the webserver serves out.
The contents of the archive will be whatever is currently in ‘.’
(dot) - the folder it just changed to. No problem with files that
have been deleted.

To restore just use the same command again, but with ‘tar cf’
replaced by ‘tar xf’ - c = create archive, x = extract from archive.
To restore individual files it’s a bit more complicated, as you need
a field to be submitted to tell it which file.

The exact details are untested, but if you are interested in this
approach I can test it. I did have it working, but as I mentioned in
an earlier post the ISP removed it.
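
The round trip itself can be checked without any PHP, in a scratch folder (every path below is made up):

```shell
# Create a tiny tree, archive "." from inside it, then extract the
# archive into a fresh folder: c = create, x = extract, f = file.
rm -rf /tmp/tartest
mkdir -p /tmp/tartest/site/sub
echo content > /tmp/tartest/site/sub/page.html
( cd /tmp/tartest/site && tar cf /tmp/tartest/backup.tar . )
mkdir -p /tmp/tartest/restore
( cd /tmp/tartest/restore && tar xf /tmp/tartest/backup.tar )
ls /tmp/tartest/restore/sub
```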

David





Again, my thanks to all for the assistance here.

Have a great weekend!

Robert

