...
guc is globus-url-copy - an acronym invented by Levente.
srm is a software layer on top of it - it makes sure transfers complete.
There is a -r recursive copy option (sketched below, after the loop);
Or else you can do a loop in shell:
set daq = (1380001.daq 1380003.daq 1380004.daq)
foreach f ($daq)
    globus-url-copy -p 25 file:/star/bla/bla/$f \
        gsiftp://pdsfgrid.nersc.gov/bla/bla/bla/$f
end
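For comparison, the recursive form mentioned above would look roughly like this (a sketch using the same placeholder paths; with -r both URLs are given as directories ending in "/", and subdirectories are copied too):

    # recursive copy of an entire directory tree in one command
    globus-url-copy -r -p 8 file:///star/bla/bla/ gsiftp://pdsfgrid.nersc.gov/bla/bla/bla/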
If you specify a source ending in "XXX/", it will treat XXX as a
directory and transfer all the files in that directory.
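For example (placeholder paths, pulling in the other direction), a trailing "/" on the source grabs every file in that directory:

    # copies all files in the remote directory (not recursive without -r)
    globus-url-copy gsiftp://pdsfgrid.nersc.gov/bla/bla/bla/ file:///star/bla/bla/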
Use the -p 25 option for more parallel streams, but isn't 25 overdoing it? You get one file
cut into 25 buffer chunks, and all of this has to be re-assembled on arrival.
I'd try +-2 around 8.
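Something like this is a reasonable starting point (a sketch with the same placeholder paths; -vb just prints bytes transferred and throughput so you can judge whether more streams actually help):

    # 8 parallel streams, with throughput printed so the stream count can be tuned
    globus-url-copy -vb -p 8 file:///star/bla/bla/1380001.daq \
        gsiftp://pdsfgrid.nersc.gov/bla/bla/bla/1380001.daq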
Use a different end-point machine, dtn01.nersc.gov, to avoid STAR/ATLAS conflicts.
There is also the carvergrid.nersc.gov gatekeeper, which I have access to.
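The same transfer aimed at dtn01 instead of pdsfgrid would look like this (only the host changes; the destination path is still a placeholder):

    # same command, different gsiftp end point
    globus-url-copy -p 8 file:///star/bla/bla/1380001.daq \
        gsiftp://dtn01.nersc.gov/bla/bla/bla/1380001.daq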
To pull from Carver, you need to load the osg module, and then the globus commands are available.
globus-job-run works like a remote ssh command.
Look at this page:
http://rcsg-gsir.imsb-dsgi.nrc-cnrc.gc.ca/globus_tutorial/#running
for globus-job-run, globus-job-submit and globus-job-get-output
(a job here is a command or script, not necessarily something that goes into
the batch system)
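A minimal sketch of those three commands against the carvergrid gatekeeper, using /bin/date as a stand-in job (assumes the osg module from above is loaded):

    module load osg
    # run a command remotely, ssh-style, and get the output back right away
    globus-job-run carvergrid.nersc.gov /bin/date
    # or submit it and fetch the output later with the job URL that globus-job-submit prints
    globus-job-submit carvergrid.nersc.gov /bin/date
    globus-job-get-output <job-URL-printed-by-globus-job-submit>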
dtn01 is a data transfer node. It has a gsiftp server but no gatekeeper.
carvergrid has both, but dtn01 has a 10 Gb Ethernet link while carvergrid has only 1 Gb.
Use dtn01 for globus-url-copy and carvergrid for the globus-job... type commands.