This document walks you through preparing a wget-compatible link for a file located in your Google Drive.
Motivation: When working in deep learning, we often use Google Colab, Kaggle Kernels, or cloud instances to train our models on GPUs. The catch is that we usually have to upload all the files needed to get things up and running. This is particularly problematic with large datasets, which often cannot be uploaded or gathered directly (sometimes scp does not work either). Instead, we may have the dataset stored in Google Drive. In situations like that, we generally create a wget-compatible link for the file (typically the dataset) located in our Google Drive (this document only deals with Google Drive).
- Right click on the file (located in Google Drive) and click on "Share".
- In the "Link sharing on" section, change the permissions of your file to "Anyone with the link can view" and copy the link.
- Now, the link should resemble https://drive.google.com/file/d/FILEID/view?usp=sharing (note: FILEID here stands in for the file's actual ID, which I have obfuscated for security reasons). Copy the FILEID.
- Now, run the following from a terminal (in Colab, a Kaggle Kernel, or your environment of choice), substituting FILEID and DESIRED_FILENAME:

```
wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O DESIRED_FILENAME && rm -rf /tmp/cookies.txt
```
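If you prefer to pull the FILEID out of the share link programmatically instead of copying it by hand, a minimal shell sketch (the share URL below is a made-up example, not a real file):

```shell
# Extract FILEID from a share link of the form
# https://drive.google.com/file/d/FILEID/view?usp=sharing
# (the URL below is a made-up example)
share_url="https://drive.google.com/file/d/1aBcD2eFgHiJ/view?usp=sharing"

# The FILEID is the path segment between /d/ and the next slash
fileid=$(echo "$share_url" | sed -rn 's#.*/d/([^/]+)/.*#\1#p')
echo "$fileid"   # prints 1aBcD2eFgHiJ
```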
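For readability, the one-liner can also be sketched as a small helper function; this is the same command split into steps, with FILEID and the output name passed as arguments (gdrive_download is a hypothetical name I made up, not an existing tool):

```shell
# Hypothetical helper wrapping the one-liner above; not an existing tool.
# Usage (not run here): gdrive_download FILEID DESIRED_FILENAME
gdrive_download() {
  local fileid="$1" out="$2"

  # Step 1: request the file once, saving Google's session cookies.
  # For large files, Google returns a virus-scan warning page that
  # carries a one-time "confirm" token instead of the file itself.
  local page
  page=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies \
      --no-check-certificate \
      "https://docs.google.com/uc?export=download&id=${fileid}" -O-)

  # Step 2: pull the confirm token out of the warning page.
  local confirm
  confirm=$(echo "$page" | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')

  # Step 3: repeat the request with the cookies and the token;
  # this time Google serves the actual file.
  wget --load-cookies /tmp/cookies.txt \
      "https://docs.google.com/uc?export=download&confirm=${confirm}&id=${fileid}" \
      -O "$out"

  # Clean up the cookie jar.
  rm -f /tmp/cookies.txt
}
```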
Let me know your thoughts in the comments section.