robocopy...

Megadeth4168 Member Posts: 2,157
I need a better solution, or maybe I just need to refine my robocopy script. We have 125 files, each 10 GB in size. Every day we need to check whether any of these files have changed and, if so, copy them across the network to another NAS. It's pretty simple, except that robocopy takes about 3-4 hours to copy one of these files, whereas a regular copy/paste takes around 20 minutes.

I've gone through and "tweaked" my robocopy settings, but something must still be off; I just can't imagine it taking that long.

Any ideas? Thanks.
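
For context, the script is a straightforward mirror job run once a day; roughly this shape (paths hypothetical, retry/logging flags illustrative):

# Hypothetical paths; a nightly one-folder mirror job of this general shape
# /MIR mirrors the tree, /R:1 /W:1 caps retries on locked files, /NP keeps the log readable
robocopy C:\local\files \\nas2\files /MIR /R:1 /W:1 /NP /LOG:C:\logs\robocopy.log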

Comments

  • mrmcmint Member Posts: 492
    Could you not add /PURGE to your script? This adds new files and deletes ones that have been deleted on the live side of things, i.e. it maintains an exact copy of whatever you are copying, kind of incrementally.

    We do this for one of our servers. It holds about 30 GB of Word/Excel files, and the first robocopy run took several hours, but now that /PURGE has been added it maintains an exact copy of the live data in approx 20 mins. It also ignores the 256-character path limit, which is bl**dy annoying!!
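
    The job looks something like this (paths hypothetical):

    # /E copies subfolders, /PURGE deletes destination files removed from the source
    robocopy \\live\data \\backup\data /E /PURGE /LOG:C:\logs\sync.log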

    HTH... I've had a few beers so I will read this again tomorrow.
  • tiersten Member Posts: 4,505
    mrmcmint wrote: »
    It also ignores the 256-character path limit, which is bl**dy annoying!!
    /256 turns that off.
  • Megadeth4168 Member Posts: 2,157
    We are currently using /MIR, which I understand implies /PURGE. The idea is that these 125 files are always there with the same name and size; we are just copying the ones that have been modified.
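
    For reference, /MIR is documented as shorthand for /E plus /PURGE, and files whose timestamp and size match are classed as unchanged and skipped either way (paths hypothetical):

    # These two commands are equivalent
    robocopy C:\local\files \\nas\share /MIR
    robocopy C:\local\files \\nas\share /E /PURGE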
  • HeroPsycho Inactive Imported Users Posts: 1,940
    Megadeth4168 wrote: »
    We have 125 files, each 10 GB in size. Every day we need to check whether any of these files have changed and, if so, copy them across the network to another NAS. It's pretty simple, except that robocopy takes about 3-4 hours to copy one of these files, whereas a regular copy/paste takes around 20 minutes.

    Robocopy pwns for copying large numbers of small files. It is not geared for copying large data files.

    Try eseutil.

    Ask the Performance Team : Slow Large File Copy Issues
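
    Per that article, eseutil's copy-file mode looks like this (paths hypothetical):

    # eseutil /y copies a single file with large sequential I/O; /d names the destination
    eseutil /y \\nas1\share\file001.dat /d \\nas2\share\file001.dat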
    Good luck to all!
  • Megadeth4168 Member Posts: 2,157
    HeroPsycho wrote: »
    Robocopy pwns for copying large numbers of small files. It is not geared for copying large data files.

    Try eseutil.

    Ask the Performance Team : Slow Large File Copy Issues

    I actually did see that article. The problem is, we don't use Exchange, so I'm not sure how else to get that utility.
  • Megadeth4168 Member Posts: 2,157
    tuscani wrote: »

    I tried TeraCopy on one file and it took 17½ minutes to complete, which is good. The problem is that this program doesn't appear to have an option like robocopy where it can be set up as a script to copy just the modified files.

    Maybe I'm wrong.
  • HeroPsycho Inactive Imported Users Posts: 1,940
    Megadeth4168 wrote: »
    I tried TeraCopy on one file and it took 17½ minutes to complete, which is good. The problem is that this program doesn't appear to have an option like robocopy where it can be set up as a script to copy just the modified files.

    Maybe I'm wrong.

    If it does have a command-line interface (never used it before), you could get that portion done using PowerShell: compare the file dates, filter to just the ones modified in the last day, then pipe those into a loop that uses TeraCopy to copy them, as in the sketch below.
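
    A rough sketch, assuming TeraCopy exposes a CLI along the lines of teracopy.exe Copy <source> <target> (check its docs; paths hypothetical):

    # Hypothetical paths; the TeraCopy CLI syntax is an assumption - verify against its docs
    $cutoff = (Get-Date).AddDays(-1)
    Get-ChildItem C:\local\files |
        Where-Object { $_.LastWriteTime -gt $cutoff } |
        ForEach-Object {
            # Hand each changed file to TeraCopy for the actual transfer
            & 'C:\Program Files\TeraCopy\teracopy.exe' Copy $_.FullName '\\server\share\remote'
        }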
    Good luck to all!
  • Claymoore Member Posts: 1,637
    If you are going to all the trouble of writing a few lines of PowerShell to filter the files by date, why not just pipe that to a Copy-Item command and skip TeraCopy? Then you could run the PowerShell script as a scheduled task each night. It looks like TeraCopy would only save a few minutes anyway.

    What would be great is if you had access to some kind of NAS replication software that would take care of this automatically. A Windows DFS implementation would work as well.

    Edit - Try this PowerShell script
    get-childitem c:\local\files |? {$_.CreationTime -gt (get-date).AddDays(-1)} | copy-item -dest \\server\share\remote

    Edit 2 - Sorry, these aren't newly created files but modified files. You will need to use the LastWriteTime attribute instead
    get-childitem c:\local\files |? {$_.LastWriteTime -gt (get-date).AddDays(-1)} | copy-item -dest \\server\share\remote

    Edit 3 - You're going to want to run this more than once, so you will need a couple more options at the end: -force, and -recurse for folders (if any)
    get-childitem c:\local\files |? {$_.LastWriteTime -gt (get-date).AddDays(-1)} | copy-item -dest \\server\share\remote -force -recurse
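
    Registering it as the nightly task could look something like this (task name and script path hypothetical):

    # Create a daily 1:00 AM task that runs the sync script (path hypothetical)
    schtasks /create /tn "Nightly NAS Sync" /sc daily /st 01:00 /tr "powershell.exe -NoProfile -File C:\scripts\nas-sync.ps1"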
  • HeroPsycho Inactive Imported Users Posts: 1,940
    Claymoore wrote: »
    If you are going to all the trouble of writing a few lines of PowerShell to filter the files by date, why not just pipe that to a Copy-Item command and skip TeraCopy? ... It looks like TeraCopy would only save a few minutes anyway.

    He's got 125 files at 10 GB each. Saving 10 minutes a file could add up.

    And no reason you couldn't do the PowerShell script and scheduled task using TeraCopy.
    Good luck to all!
  • sasilik Member Posts: 1
    Megadeth4168 wrote: »
    We are currently using /MIR, which I understand implies /PURGE. The idea is that these 125 files are always there with the same name and size; we are just copying the ones that have been modified.

    Doesn't logging the job results reveal anything? /V /TS /FP /ETA /LOG:logfile.log
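
    e.g. something like (paths hypothetical):

    # /V verbose output, /TS source timestamps, /FP full paths, /ETA per-file progress
    robocopy C:\local\files \\nas\share /MIR /V /TS /FP /ETA /LOG:C:\logs\sync.log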