Transferring a large number of files

Hi all,

I need to transfer 400 folders of around 5 GB each from the Baobab scratch folder to an external cluster. I would use Globus, as I do for other HPC clusters, but unfortunately I cannot in this case.

I am using rsync to achieve this. I would normally open a terminal through ssh and let it run overnight, but for some reason, since yesterday, I cannot log in to Baobab (I tried Bamboo and can log in without issue, using my UNIGE ssh key).

Are there solutions for large-scale file transfers? I need to share my files with some colleagues and store them on this other cluster, so that I can delete them from the Baobab scratch.

Thank you in advance.

Hello,

Personally, I would tar czf the 400 folders into one archive and scp the tarball to wherever I need it to be.
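As a rough sketch (the scratch path, hostname, and destination directory below are placeholders, not your actual paths):

```bash
# Bundle all the folders into one compressed archive
# (assumes they sit under a common parent directory on scratch)
tar czf transfer.tar.gz -C /path/to/scratch myfolders

# Copy the tarball to the external cluster in a single transfer
scp transfer.tar.gz user@external-cluster:/path/to/destination/
```

One archive means a single large sequential transfer instead of 400 directories of small files, which is usually friendlier to the filesystem and the network.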

Best regards

Hi @Omar.Darwish,

It should work again.

As I’m off today, I won’t investigate any further.

The HPC team has already talked about Globus. We'll be exploring the subject and analyzing the feasibility of this project.

For now, you can use rsync:

https://doc.eresearch.unige.ch/hpc/best_practices?s[]=rsync#rsync
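For reference, a typical invocation might look like the following (the paths and hostname are placeholders to adapt to your setup; see the linked page for the recommended options):

```bash
# Copy the scratch folders to the external cluster over ssh,
# preserving permissions and timestamps (-a) and showing progress,
# with resumable partial transfers (-P)
rsync -avP /path/to/scratch/myfolders/ user@external-cluster:/path/to/destination/
```

If the connection drops, re-running the same command resumes where it left off, which is one advantage of rsync over a single scp of a large tarball.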

Or @Raphael.Rubino's idea.

Best Regards