Help to transfer data from one cluster (Baobab) to another (Bamboo)?

Due to insufficient RAM to run my jobs, I was advised to switch from Baobab to Bamboo (conversation here), which has more RAM available. However, I don’t know how to do such a transfer, and I would like to avoid retransferring the data locally via nasac: it would take very long given the size of the data, and I may not have enough space on nasac to store all the temporary files needed to run my analysis.

Can you guide me through the best way to do such a transfer?
I just tried to use rsync following hpc:best_practices [eResearch Doc], but it doesn’t seem to work (maybe because I am trying to transfer a folder from scratch?). Here is the command that I launched:

(baobab)-[clairis@login1 scratch]$ DST=$/home/users/c/clairis/scratch
(baobab)-[clairis@login1 scratch]$ DIR=fMRI_analysis
(baobab)-[clairis@login1 scratch]$ BAMBOO=login1.bamboo
(baobab)-[clairis@login1 scratch]$ rsync -aviuzPrg ${DIR} ${BAMBOO}:${DST}
(clairis@login1.bamboo) Password:
sending incremental file list
rsync: [Receiver] mkdir "/home/users/c/clairis/$/home/users/c/clairis/scratch" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(783) [Receiver=3.2.3]

Thanks in advance for any help!

There is a typo in your DST definition:

rsync: [Receiver] mkdir "/home/users/c/clairis/$/home/users/c/clairis/scratch" failed: No such file or directory (2)
# ---------------------------------------------^ Here

(baobab)-[clairis@login1 scratch]$ DST=$/home/users/c/clairis/scratch
# -------------------------------------^ Remove the '$'
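For reference, the corrected sequence might look like this (a sketch assuming the same paths and variables as in your original command; only the stray '$' in DST is removed):

(baobab)-[clairis@login1 scratch]$ DST=/home/users/c/clairis/scratch
(baobab)-[clairis@login1 scratch]$ DIR=fMRI_analysis
(baobab)-[clairis@login1 scratch]$ BAMBOO=login1.bamboo
(baobab)-[clairis@login1 scratch]$ rsync -aviuzPrg ${DIR} ${BAMBOO}:${DST}

With DST being an absolute path, rsync should create ${DST}/fMRI_analysis on Bamboo instead of trying to create a directory under your home whose name starts with '$'.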

Let me know if it’s working :slight_smile:


Yes it seems to work now!!! Thanks a lot :folded_hands:
