@alexlicohen
alexlicohen / ERISOne.sh
Created December 19, 2020 22:23
[ERISOne] How to get into BWH computing #cohenlab #BWH #ERISOne
# web-based (must be within VPN):
https://jupyterhub.partners.org/
#ssh (must be within VPN; lowercase username):
ssh -XYC xxx@erisone.partners.org
# transfer node:
rsync -avz path/to/source/files <username>@erisonexf.partners.org:path/to/destination/folder
@alexlicohen
alexlicohen / GSPdownload.sh
Last active November 12, 2020 13:37
[GSP Dataverse Download] This is the command to download the GSP1000 from the Harvard Dataverse #cohenlab #GSP #shell #dataverse
# The API token is obtainable from your profile page after you log in to Harvard Dataverse
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://dataverse.harvard.edu
export PERSISTENT_ID=doi:10.7910/DVN/ILXIKS
export VERSION=1.0
curl -O -J -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/access/dataset/:persistentId/versions/$VERSION?persistentId=$PERSISTENT_ID"
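Because the URL contains a `?` query string, it is safer to assemble it in a variable and quote it when passing it to curl, so the shell cannot glob or split it. A minimal sketch using the values above (no network call is made here):

```shell
# Build the download URL in a variable; the quotes prevent shell globbing on '?'
SERVER_URL=https://dataverse.harvard.edu
PERSISTENT_ID=doi:10.7910/DVN/ILXIKS
VERSION=1.0
url="$SERVER_URL/api/access/dataset/:persistentId/versions/$VERSION?persistentId=$PERSISTENT_ID"
echo "$url"
# → https://dataverse.harvard.edu/api/access/dataset/:persistentId/versions/1.0?persistentId=doi:10.7910/DVN/ILXIKS
```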
alexlicohen / GSP1000toDataverse.sh
Created November 12, 2020 13:34
[GSP1000 to Dataverse] This is the command used to upload the GSP to Harvard Dataverse #dataverse #GSP #cohenlab #shell
# key is your API key from Harvard Dataverse (obtainable from the profile settings after logging in)
java -jar DVUploader-v1.0.9.jar -server=https://dataverse.harvard.edu -did=doi:10.7910/DVN/ILXIKS -key=xxxx-xxx-xxxx-xxxx -ex=^.DS_Store -recurse GSP1000
alexlicohen / rclone_e2_to_GDrive.sh
Last active December 2, 2020 00:15
[rclone E2 to GDrive] set of commands to do a one-way sync from E2 to the Google Drive partition #cohenlab #rclone #E2 #GDrive
# Back up most of E2 to GDrive
rclone sync /lab-share/Neuro-Cohen-e2/Public/collections/yeo1000 drive_cohenlab:/collections/yeo1000 --progress --copy-links --exclude '.*{/**,}' --delete-excluded --delete-during --drive-stop-on-upload-limit && \
rclone sync /lab-share/Neuro-Cohen-e2/Public/collections/GSP drive_cohenlab:/collections/GSP --progress --copy-links --exclude '.*{/**,}' --delete-excluded --delete-during --drive-stop-on-upload-limit && \
rclone sync /lab-share/Neuro-Cohen-e2/Public/connectomes drive_cohenlab_Connectomes:/connectomes --progress --copy-links --exclude '.*{/**,}' --delete-excluded --delete-during --drive-stop-on-upload-limit && \
rclone sync /lab-share/Neuro-Cohen-e2/Public/containers drive_cohenlab:/containers --progress --copy-links --exclude '.*{/**,}' --delete-excluded --delete-during --drive-stop-on-upload-limit && \
rclone sync /lab-share/Neuro-Cohen-e2/Public/environment drive_cohenlab:/environment --progress --copy-links --exclude '.*{/**,}' --delete-excluded --delete-during --drive-stop-on-upload-limit
alexlicohen / array_sbatch.sh
Last active October 27, 2020 16:18
[Manual array of sbatch jobs] Quick one-liner for running a script with a list of numbers as an input #cohenlab #slurm #E2
for i in $(seq 97); do
  sbatch /lab-share/Neuro-Cohen-e2/Public/repos/e2scripts/single_LOOCV_PALM.sh \
    /lab-share/Neuro-Cohen-e2/Public/notebooks/acohen8/PALM_Damage_Score_LOOCV/4_PALM_LOOCV_memV.py "$i"
  sleep 10
done
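Slurm can also express this as a single array job, which is gentler on the scheduler than 97 separate submissions. A hedged sketch: `--array` and `$SLURM_ARRAY_TASK_ID` are standard Slurm, but `wrapper.sh` is a hypothetical script, not one from this gist. The `seq` lines at the end just illustrate the index range and run anywhere:

```shell
# Native Slurm alternative to the manual loop (wrapper.sh is hypothetical):
#   sbatch --array=1-97 wrapper.sh
# Inside wrapper.sh, use "$SLURM_ARRAY_TASK_ID" wherever the loop used "$i".

# The manual loop walks the same index range seq generates:
seq 97 | head -3   # prints 1, 2, 3 on separate lines
seq 97 | wc -l     # prints 97
```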
alexlicohen / sbatch_E2_FSv7.1.sh
Last active August 9, 2020 21:07
[E2 Slurm Freesurfer v7.1.0] Command to run FSv7.1 on E2 using Slurm #cohenlab #E2 #slurm #freesurfer #ABCD
/lab-share/Neuro-Cohen-e2/Public/repos/e2scripts/sbatch_script_bids_target_sublist.sh \
/lab-share/Neuro-Cohen-e2/Public/repos/e2scripts/single_freesurfer_v7.1.sh \
/reference_databases/RestrictedAccessDatabase/abcdbids/s3 \
/lab-share/Neuro-Cohen-e2/Public/projects/ABCD/freesurfer_v7.1.0 \
/lab-share/Neuro-Cohen-e2/Public/projects/ABCD/ABCD6.txt
alexlicohen / github_credentials_setup.sh
Created June 28, 2020 19:35
[Store GitHub Credentials] How to store your credentials on the server to stop having to type your password every time #cohenlab #github #shell
# Allow git to store your password for 24 hours at a time
git config --global credential.helper 'cache --timeout=86400'
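To confirm the helper took effect, read the value back. The sketch below uses a throwaway HOME so it never touches your real `~/.gitconfig`; the `store` helper mentioned in the comment is git's standard disk-backed alternative to `cache`:

```shell
# Throwaway HOME so the real global config is untouched
export HOME="$(mktemp -d)"

# Cache credentials in memory for 24 hours (86400 seconds)
git config --global credential.helper 'cache --timeout=86400'

# Read it back to confirm the setting
git config --global credential.helper   # prints: cache --timeout=86400

# (Alternative: 'git config --global credential.helper store' persists
# credentials to disk indefinitely instead of caching in memory.)
```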
alexlicohen / connectome_quick.sh
Created June 26, 2020 19:10
[connectome_quick examples] How to run connectome_quick.py manually instead of through the Preprocessing notebook #cohenlab
#MGH_w_GSP1000_v45
python3 /usr/local/software_env/python_modules/nimlab/nimlab/scripts/connectome_quick.py \
-cs /data1/connectome_npy/GSP1000_CBIG_Legacy_v45/ \
-r /jupyter_mount/acohen8/Yeo_to_GSP/MGH_w_GSP1000_CBIG_Legacy_v45/lesion_list.csv \
-o /jupyter_mount/acohen8/Yeo_to_GSP/MGH_w_GSP1000_CBIG_Legacy_v45/Functional_Connectivity
#MGH_w_yeo1000
python3 /usr/local/software_env/python_modules/nimlab/nimlab/scripts/connectome_quick.py \
-cs /data1/connectome_npy/yeo1000_dil/ \
alexlicohen / add_lab_member.sh
Created June 22, 2020 21:23
[sudo on corrin/caladan] How to add new lab members #cohenlab #shell
# Ask Sean Clancy in CRL to add user to crlfs9 and then, on each machine, run:
sudo perl /fileserver/sysadmin/pw/merge-system-files.pl
sudo nano /etc/sudoers.d/lab_sudoers
# add the following line:
chxxxxxx ALL=(ALL) ALL # First Last name
alexlicohen / gdrive_E2_sync.sh
Last active June 22, 2020 11:41
[rclone GDrive to E2] One-way, with deleting of destination files, sync from GDrive to E2 for selected dirs #cohenlab #rclone #E2 #GDrive
rclone sync drive_cohenlab:/connectomes /lab-share/Neuro-Cohen-e2/Public/connectomes --progress && \
rclone sync drive_cohenlab:/lesions /lab-share/Neuro-Cohen-e2/Public/lesions --progress && \
rclone sync drive_cohenlab:/LNM_archive /lab-share/Neuro-Cohen-e2/Public/LNM_archive --progress && \
rclone sync drive_cohenlab:/notebooks /lab-share/Neuro-Cohen-e2/Public/notebooks --progress && \
rclone sync drive_cohenlab:/xnat_archive /lab-share/Neuro-Cohen-e2/Public/xnat_archive --progress && \
rclone sync drive_cohenlab:/collections/yeo1000_nii /lab-share/Neuro-Cohen-e2/Public/collections/yeo1000 --progress