@urigoren
Last active January 29, 2023 13:04
drive2youtube.ipynb

{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "drive2youtube.ipynb",
"provenance": [],
"collapsed_sections": []
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
}
},
"cells": [
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "9uZKzRN5imQt",
"outputId": "a7217868-3a8f-40c4-85cc-83e903a150cb"
},
"source": [
"!pip install google-api-python-client youtube-video-upload tqdm"
],
"execution_count": 50,
"outputs": [
{
"output_type": "stream",
"text": [
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "Z0xlUV_FhT78",
"outputId": "936df033-af20-42f6-c5fa-6492fe8c6bd3"
},
"source": [
"import json, sys, os, collections, itertools\r\n",
"from pathlib import Path\r\n",
"from time import sleep\r\n",
"from tqdm.notebook import tqdm\r\n",
"from youtube_video_upload.upload_video import upload_video\r\n",
"from youtube_video_upload.get_credentials import get_credentials\r\n",
"from google.colab import drive\r\n",
"ls = lambda p: print(\"\\n\".join(map(str, p.iterdir())))\r\n",
"drive.mount('/content/drive/')"
],
"execution_count": 62,
"outputs": [
{
"output_type": "stream",
"text": [
"Drive already mounted at /content/drive/; to attempt to forcibly remount, call drive.mount(\"/content/drive/\", force_remount=True).\n"
],
"name": "stdout"
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "jaJqqF59hbiC"
},
"source": [
"video_dir = Path('/content/drive/My Drive/argmax/courses/RecSys')\r\n",
"video_urls_file = \"/content/drive/My Drive/Colab Notebooks/drive2youtube/videos.json\"\r\n",
"youtube_secret = \"/content/drive/My Drive/Colab Notebooks/drive2youtube/client_id.json\"\r\n",
"with open(youtube_secret, 'r') as f:\r\n",
" yt_credentials = get_credentials(json.load(f))"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "6J6fNMgKoKkJ"
},
"source": [
"videos = itertools.chain(video_dir.rglob(\"*.mkv\"), video_dir.rglob(\"*.mp4\"))\r\n",
"videos = list(videos)"
],
"execution_count": 63,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "nD6uv_lDhrUG"
},
"source": [
"youtube_urls = dict()\r\n",
"for video in tqdm(videos):\r\n",
" filename = str(video)\r\n",
" basename = video.name.rsplit('.',1)[0]\r\n",
" try:\r\n",
" url = upload_video(yt_credentials,filename,title=basename,description=basename,privacy=\"unlisted\",category=\"27\")\r\n",
" youtube_urls[basename]=url\r\n",
" except:\r\n",
" youtube_urls[basename]=\"-ERROR-\"\r\n",
" print (\"Error uploading \" + basename)\r\n",
" sleep(35)\r\n",
"\r\n",
"with open(video_urls_file, 'w') as f:\r\n",
" json.dump(youtube_urls,f,indent=4)"
],
"execution_count": null,
"outputs": []
}
]
}
@mnabil77

Hi, how do I get the videos.json file?

@urigoren (Author)

@mnabil77 videos.json is the output file: the last cell writes it, mapping each video's base filename to the uploaded YouTube URL (or "-ERROR-" if an upload failed).
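
To illustrate (the titles here are hypothetical, and each value is whatever upload_video returned for that file), it ends up looking roughly like:

```json
{
    "Lecture 01 - Introduction": "<url or id returned by upload_video>",
    "Lecture 02 - Matrix Factorization": "-ERROR-"
}
```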

@fuhrmanator

Hello - Thanks for this gist! Before anyone else tries, I think this method isn't so great, for a few reasons:

  1. OAuth changed, and you now need to redirect back to a server running on localhost, which is not easy to get running in Colab. I followed this solution for running a local server and used flow.run_local_server() (see the sketch below this list), but had to hack the last step because I couldn't figure out how to redirect to the Colab web server.
  2. YouTube uploads through the API are now subject to a quota of 10,000 "points" per day, and each upload costs roughly 1,600 points, which works out to about six videos per day. So, despite the beauty of finding all your videos in Google Drive, only a handful will upload successfully (although it is super fast, because you don't use your local bandwidth).
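
For what it's worth, here is a minimal sketch of the local-server OAuth flow mentioned in point 1, assuming google-auth-oauthlib is installed and that client_id.json is the OAuth client secret file used in the notebook (the scope and port=0 are my assumptions; this is not the gist's get_credentials helper):

```python
# Minimal sketch: obtain YouTube upload credentials via the local-server OAuth flow.
from google_auth_oauthlib.flow import InstalledAppFlow

# Scope needed to upload videos with the YouTube Data API.
SCOPES = ["https://www.googleapis.com/auth/youtube.upload"]

flow = InstalledAppFlow.from_client_secrets_file("client_id.json", scopes=SCOPES)

# run_local_server() spins up a temporary web server on localhost to catch the
# OAuth redirect; this is the step that is awkward to reach from inside Colab.
credentials = flow.run_local_server(port=0)

print("Access token acquired:", bool(credentials.token))
```

This works fine on a local machine; inside Colab the browser redirect cannot easily reach that temporary localhost server, which is the hack point 1 refers to.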
