@utkuboduroglu
Last active September 29, 2021 15:04
A sample Colab notebook for training YOLOv4-tiny with Darknet.
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "training-yolov4.ipynb",
"provenance": [],
"collapsed_sections": []
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "D7v0VgCs5sUE"
},
"source": [
"# Notes\n",
"* For GPU support during training, make sure to enable GPU in Colab. To do this, go to\n",
"```\n",
"Runtime > Change runtime type > Hardware Accelerator > GPU\n",
"```\n",
"* Make sure to change the environment variable `MODEL_PATH` to your own path. Otherwise, you will write over someone else's data.\n",
"* Make sure to change the Roboflow URL so that you pull your own dataset.\n",
"* Modify the `yolov4-tiny.cfg` you copied. Otherwise, you will not train correctly!"
]
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "bLZ7HXdkSFXS",
"outputId": "2cfa988b-22c2-46dc-f9a5-4da35fccf9df"
},
"source": [
"# mount your google drive\n",
"from google.colab import drive\n",
"\n",
"drive.mount('/content.gdrive', force_remount=True)\n",
"# saving the file at Driverless' Perception subfolder\n",
"import os\n",
"os.environ['MODEL_PATH'] = \"/content.gdrive/MyDrive/Driverless/Perception/general_vehicles\"\n",
"!mkdir -p $MODEL_PATH"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Mounted at /content.gdrive\n"
]
}
]
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "UvR1o0d3SM51",
"outputId": "6f364c68-7fcc-40ae-ac4f-4a34967b15da"
},
"source": [
"# check whether our GPU is loaded\n",
"!nvidia-smi"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Tue Sep 21 08:25:04 2021 \n",
"+-----------------------------------------------------------------------------+\n",
"| NVIDIA-SMI 470.63.01 Driver Version: 460.32.03 CUDA Version: 11.2 |\n",
"|-------------------------------+----------------------+----------------------+\n",
"| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |\n",
"| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |\n",
"| | | MIG M. |\n",
"|===============================+======================+======================|\n",
"| 0 Tesla K80 Off | 00000000:00:04.0 Off | 0 |\n",
"| N/A 47C P8 31W / 149W | 0MiB / 11441MiB | 0% Default |\n",
"| | | N/A |\n",
"+-------------------------------+----------------------+----------------------+\n",
" \n",
"+-----------------------------------------------------------------------------+\n",
"| Processes: |\n",
"| GPU GI CI PID Type Process name GPU Memory |\n",
"| ID ID Usage |\n",
"|=============================================================================|\n",
"| No running processes found |\n",
"+-----------------------------------------------------------------------------+\n"
]
}
]
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "HvsR_l2FSyAJ",
"outputId": "4dfa4a72-148d-41c9-9826-e938b2d45f7e"
},
"source": [
"# clone Darknet\n",
"import os\n",
"os.environ['PATH'] += ':/usr/local/cuda/bin'\n",
"\n",
"!rm -rf darknet\n",
"!git clone https://github.com/AlexeyAB/darknet.git"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Cloning into 'darknet'...\n",
"remote: Enumerating objects: 15308, done.\u001b[K\n",
"remote: Total 15308 (delta 0), reused 0 (delta 0), pack-reused 15308\u001b[K\n",
"Receiving objects: 100% (15308/15308), 13.70 MiB | 17.07 MiB/s, done.\n",
"Resolving deltas: 100% (10402/10402), done.\n"
]
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "Q3AhUAEOTQUe"
},
"source": [
"# enable GPU and OpenCV for training\n",
"!sed -i 's/GPU=0/GPU=1/g' ./darknet/Makefile\n",
"!sed -i 's/OPENCV=0/OPENCV=1/g' ./darknet/Makefile\n",
"\n",
"# compile Darknet with more frequent weight saving\n",
"# add commands for modifying src/yolo.c here if we want to increase frequency"
],
"execution_count": null,
"outputs": []
},
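{
"cell_type": "code",
"metadata": {
"id": "locateSaveLogicSketch"
},
"source": [
"# A hedged helper cell: locate Darknet's weight-saving logic before editing it.\n",
"# Assumption: in current AlexeyAB/darknet the periodic save lives in\n",
"# src/detector.c (function train_detector) rather than src/yolo.c; grep\n",
"# rather than hard-coding line numbers, since the source moves between forks.\n",
"!grep -n 'save_weights' darknet/src/detector.c | head -n 10\n",
"!ls darknet/src | grep -E 'yolo|detector'"
],
"execution_count": null,
"outputs": []
},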
{
"cell_type": "code",
"metadata": {
"id": "FrxReayWWZY1"
},
"source": [
"%%capture\n",
"# build Darknet\n",
"!(cd darknet && make)"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "_91kwNXETr3h",
"outputId": "f2e75e68-115c-43e4-d1b5-cff2a344a5f6"
},
"source": [
"# check if darknet has compiled successfully\n",
"!./darknet/darknet detector"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
" CUDA-version: 11010 (11020), GPU count: 1 \n",
" OpenCV version: 3.2.0\n",
"usage: ./darknet/darknet detector [train/test/valid/demo/map] [data] [cfg] [weights (optional)]\n"
]
}
]
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "hU2dei1OUFBK",
"outputId": "5258acf9-c8a5-499d-ed59-dd0c956bbd2c"
},
"source": [
"# for this notebook we're only using yolov4-tiny for testing purposes.\n",
"# the process is almost identical to yolov4, check alexeyAB for more details.\n",
"\n",
"# create the appropriate directories\n",
"!mkdir -p $MODEL_PATH/{cfg,data,backup,train_logs}\n",
"\n",
"!wget \"https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v4_pre/yolov4-tiny.conv.29\" -P $MODEL_PATH/data"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"--2021-09-14 14:50:53-- https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v4_pre/yolov4-tiny.conv.29\n",
"Resolving github.com (github.com)... 140.82.114.3\n",
"Connecting to github.com (github.com)|140.82.114.3|:443... connected.\n",
"HTTP request sent, awaiting response... 302 Found\n",
"Location: https://github-releases.githubusercontent.com/75388965/28807d00-3ea4-11eb-97b5-4c846ecd1d05?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210914%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210914T145053Z&X-Amz-Expires=300&X-Amz-Signature=4aaf5e73907ed28dec3605bf563ddc1e5889b56bfe57510f7b8534f676da7053&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=75388965&response-content-disposition=attachment%3B%20filename%3Dyolov4-tiny.conv.29&response-content-type=application%2Foctet-stream [following]\n",
"--2021-09-14 14:50:53-- https://github-releases.githubusercontent.com/75388965/28807d00-3ea4-11eb-97b5-4c846ecd1d05?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210914%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210914T145053Z&X-Amz-Expires=300&X-Amz-Signature=4aaf5e73907ed28dec3605bf563ddc1e5889b56bfe57510f7b8534f676da7053&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=75388965&response-content-disposition=attachment%3B%20filename%3Dyolov4-tiny.conv.29&response-content-type=application%2Foctet-stream\n",
"Resolving github-releases.githubusercontent.com (github-releases.githubusercontent.com)... 185.199.108.154, 185.199.111.154, 185.199.110.154, ...\n",
"Connecting to github-releases.githubusercontent.com (github-releases.githubusercontent.com)|185.199.108.154|:443... connected.\n",
"HTTP request sent, awaiting response... 200 OK\n",
"Length: 19789716 (19M) [application/octet-stream]\n",
"Saving to: ‘/content.gdrive/MyDrive/Driverless/Perception/general_vehicles/data/yolov4-tiny.conv.29’\n",
"\n",
"yolov4-tiny.conv.29 100%[===================>] 18.87M 39.1MB/s in 0.5s \n",
"\n",
"2021-09-14 14:50:54 (39.1 MB/s) - ‘/content.gdrive/MyDrive/Driverless/Perception/general_vehicles/data/yolov4-tiny.conv.29’ saved [19789716/19789716]\n",
"\n"
]
}
]
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "qUh9fBfkWTJB",
"outputId": "9c81c871-bdda-4808-9122-eb4ab90b2b56"
},
"source": [
"# copy the sample yolov4 config to the path\n",
"!cp -v /content/darknet/cfg/yolov4-tiny-custom.cfg $MODEL_PATH/cfg/yolov4-tiny.cfg"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"'/content/darknet/cfg/yolov4-tiny-custom.cfg' -> '/content.gdrive/MyDrive/Driverless/Perception/general_vehicles/cfg/yolov4-tiny.cfg'\n"
]
}
]
},
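{
"cell_type": "code",
"metadata": {
"id": "cfgEditSketch"
},
"source": [
"# A hedged sketch of the cfg edits from AlexeyAB's custom-training guide.\n",
"# NUM_CLASSES is an assumption -- replace it with your dataset's class count.\n",
"# Guide: max_batches = classes*2000 (min 6000); steps = 80% and 90% of\n",
"# max_batches; classes= in every [yolo] layer; filters=(classes+5)*3 in the\n",
"# [convolutional] layer immediately before each [yolo] layer.\n",
"import os\n",
"\n",
"NUM_CLASSES = 12  # assumption: set to your own class count\n",
"max_batches = max(6000, NUM_CLASSES * 2000)\n",
"cfg_path = os.path.join(os.environ['MODEL_PATH'], 'cfg', 'yolov4-tiny.cfg')\n",
"\n",
"with open(cfg_path) as f:\n",
"    lines = f.readlines()\n",
"for i, ln in enumerate(lines):\n",
"    if ln.startswith('max_batches'):\n",
"        lines[i] = 'max_batches=%d\\n' % max_batches\n",
"    elif ln.startswith('steps='):\n",
"        lines[i] = 'steps=%d,%d\\n' % (int(0.8 * max_batches), int(0.9 * max_batches))\n",
"    elif ln.startswith('classes'):\n",
"        lines[i] = 'classes=%d\\n' % NUM_CLASSES\n",
"# rewrite filters= in the conv layer just before each [yolo] layer\n",
"for i, ln in enumerate(lines):\n",
"    if ln.strip() == '[yolo]':\n",
"        for j in range(i - 1, -1, -1):\n",
"            if lines[j].startswith('filters'):\n",
"                lines[j] = 'filters=%d\\n' % ((NUM_CLASSES + 5) * 3)\n",
"                break\n",
"with open(cfg_path, 'w') as f:\n",
"    f.writelines(lines)"
],
"execution_count": null,
"outputs": []
},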
{
"cell_type": "markdown",
"metadata": {
"id": "UE6mUX8ImFUU"
},
"source": [
"# Some notes on datasets\n",
"In this jupyter notebook, we use a random (and small) dataset from Roboflow to demonstrate how to train our dataset. In our actual use case (training on cones), we will need to load our dataset in through Google Drive by uploading the files and generating the annotations and metadata by hand, so the steps following this cell will be a bit different.\n",
"\n",
"Additionally, we do not always have to download and extract the dataset we have, since it is already saved to our team's Google Drive.\n",
"\n",
"Finally, some of the configurations we can change are:\n",
"\n",
"1. The environment variable `MODEL_PATH`: this variable is for storing the absolute path of our model configs and dataset. For each different model and dataset we train, we only need to change this value to the new path we want to use. Thus, we can keep our old changes and compare models as we like.\n",
"1. The config file `yolov4-tiny.cfg`: This config file is in accordance to [AlexeyAB's guide for custom training models](https://github.com/AlexeyAB/darknet#how-to-train-to-detect-your-custom-objects), therefore we can modify this file so that it is the best way for training models.\n",
"1. The file `data/obj.data`: this file is for specifying to Darknet where our images and annotations are stored, where to save the weight files etc. Check below for more details, as this file can be modified and improved to do k-fold crossvalidation."
]
},
{
"cell_type": "code",
"metadata": {
"id": "s9b6Am5fbbkb"
},
"source": [
"%%capture\n",
"# pull a sample dataset; this specific one is a chess-piece dataset\n",
"!(cd $MODEL_PATH/data && curl -L \"https://public.roboflow.com/ds/3uiMvhdiKM?key=2WUlC7taaW\" > roboflow.zip; unzip roboflow.zip; rm roboflow.zip)"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "DhwoAGUwfpND",
"outputId": "3bc86c4a-7e4b-421c-e00a-ed15ed26bf3a"
},
"source": [
"# generate text files train.txt, valid.txt\n",
"!(cd $MODEL_PATH/data && find \"$(cd train; pwd)\" -type f -name '*.jpg' > train.txt)\n",
"!(cd $MODEL_PATH/data && find \"$(cd valid; pwd)\" -type f -name '*.jpg' > valid.txt)\n",
"!(cd $MODEL_PATH/data && find \"$(cd test; pwd)\" -type f -name '*.jpg' > test.txt)\n",
"!ls $MODEL_PATH/data -ltr"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"total 19483\n",
"-rw------- 1 root root 19789716 Dec 15 2020 yolov4-tiny.conv.29\n",
"drwx------ 2 root root 4096 Jun 16 07:40 valid\n",
"-rw------- 1 root root 419 Jun 16 07:40 README.roboflow.txt\n",
"-rw------- 1 root root 1371 Jun 16 07:40 README.dataset.txt\n",
"drwx------ 2 root root 4096 Sep 14 14:51 test\n",
"drwx------ 2 root root 4096 Sep 14 14:51 train\n",
"-rw------- 1 root root 113262 Sep 14 14:56 train.txt\n",
"-rw------- 1 root root 32250 Sep 14 14:56 valid.txt\n"
]
}
]
},
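{
"cell_type": "code",
"metadata": {
"id": "objDataSketch"
},
"source": [
"# A hedged sketch of data/obj.names and data/obj.data, which the training\n",
"# command expects. The class list here is a placeholder -- a Roboflow darknet\n",
"# export usually ships its own labels file; substitute your real class names,\n",
"# in the same order they appear in the annotation files.\n",
"import os\n",
"\n",
"data_dir = os.path.join(os.environ['MODEL_PATH'], 'data')\n",
"class_names = ['class0', 'class1']  # placeholder; replace with real names\n",
"\n",
"with open(os.path.join(data_dir, 'obj.names'), 'w') as f:\n",
"    f.write('\\n'.join(class_names) + '\\n')\n",
"\n",
"with open(os.path.join(data_dir, 'obj.data'), 'w') as f:\n",
"    f.write('classes = %d\\n' % len(class_names))\n",
"    f.write('train = %s/train.txt\\n' % data_dir)\n",
"    f.write('valid = %s/valid.txt\\n' % data_dir)\n",
"    f.write('names = %s/obj.names\\n' % data_dir)\n",
"    f.write('backup = %s/backup\\n' % os.environ['MODEL_PATH'])"
],
"execution_count": null,
"outputs": []
},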
{
"cell_type": "code",
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "CfVfOo7sp_98",
"outputId": "6ca384ab-5f25-4b46-9ea4-4c71c30dc393"
},
"source": [
"!pip install pyngrok\n",
"from pyngrok import ngrok\n",
"\n",
"# use pyngrok for constant training graphs\n",
"public_url = ngrok.connect(addr='8090')\n",
"public_url"
],
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Collecting pyngrok\n",
" Downloading pyngrok-5.1.0.tar.gz (745 kB)\n",
"\u001b[?25l\r\u001b[K |▍ | 10 kB 25.1 MB/s eta 0:00:01\r\u001b[K |▉ | 20 kB 8.8 MB/s eta 0:00:01\r\u001b[K |█▎ | 30 kB 8.1 MB/s eta 0:00:01\r\u001b[K |█▊ | 40 kB 7.4 MB/s eta 0:00:01\r\u001b[K |██▏ | 51 kB 4.1 MB/s eta 0:00:01\r\u001b[K |██▋ | 61 kB 4.4 MB/s eta 0:00:01\r\u001b[K |███ | 71 kB 4.4 MB/s eta 0:00:01\r\u001b[K |███▌ | 81 kB 5.0 MB/s eta 0:00:01\r\u001b[K |████ | 92 kB 5.1 MB/s eta 0:00:01\r\u001b[K |████▍ | 102 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████▉ | 112 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████▎ | 122 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████▊ | 133 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████▏ | 143 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████▋ | 153 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████ | 163 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████▌ | 174 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████ | 184 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████▍ | 194 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████▉ | 204 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████▎ | 215 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████▊ | 225 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████▏ | 235 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████▌ | 245 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████ | 256 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████▍ | 266 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████▉ | 276 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████▎ | 286 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████▊ | 296 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████▏ | 307 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████▋ | 317 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████ | 327 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████▌ | 337 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████ | 348 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████▍ | 358 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████▉ | 368 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████▎ | 378 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████▊ | 389 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████████▏ | 399 kB 4.2 MB/s eta 
0:00:01\r\u001b[K |█████████████████▋ | 409 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████ | 419 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████▌ | 430 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████ | 440 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████▍ | 450 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████▉ | 460 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████████▎ | 471 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████████▋ | 481 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████████████ | 491 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████████████▌ | 501 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████ | 512 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████▍ | 522 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████▉ | 532 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████████▎ | 542 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████████▊ | 552 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████████████▏ | 563 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████████████▋ | 573 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████████████████ | 583 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████████████████▌ | 593 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████████ | 604 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████████▍ | 614 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████████▉ | 624 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████████████▎ | 634 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████████████▊ | 645 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████████████████▏ | 655 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████████████████▋ | 665 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████████████████████ | 675 kB 4.2 MB/s eta 0:00:01\r\u001b[K |█████████████████████████████▌ | 686 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████████████ | 696 kB 4.2 MB/s eta 0:00:01\r\u001b[K |██████████████████████████████▍ | 706 kB 4.2 MB/s eta 
0:00:01\r\u001b[K |██████████████████████████████▊ | 716 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████████████████▏| 727 kB 4.2 MB/s eta 0:00:01\r\u001b[K |███████████████████████████████▋| 737 kB 4.2 MB/s eta 0:00:01\r\u001b[K |████████████████████████████████| 745 kB 4.2 MB/s \n",
"\u001b[?25hRequirement already satisfied: PyYAML in /usr/local/lib/python3.7/dist-packages (from pyngrok) (3.13)\n",
"Building wheels for collected packages: pyngrok\n",
" Building wheel for pyngrok (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
" Created wheel for pyngrok: filename=pyngrok-5.1.0-py3-none-any.whl size=19006 sha256=bcb9646706c662a5e8aa7d1553ff49838cffeb16a4c7a41560e999593fa18c4f\n",
" Stored in directory: /root/.cache/pip/wheels/bf/e6/af/ccf6598ecefecd44104069371795cb9b3afbcd16987f6ccfb3\n",
"Successfully built pyngrok\n",
"Installing collected packages: pyngrok\n",
"Successfully installed pyngrok-5.1.0\n"
]
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"<NgrokTunnel: \"http://dcd4-35-236-189-136.ngrok.io\" -> \"http://localhost:8090\">"
]
},
"metadata": {},
"execution_count": 7
}
]
},
{
"cell_type": "code",
"metadata": {
"id": "XxuVGHugXBPN"
},
"source": [
"# Start training!!!\n",
"# Use this line for initial training\n",
"#!(cd $MODEL_PATH && /content/darknet/darknet detector train data/obj.data cfg/yolov4-tiny.cfg data/yolo4-tiny.conv.29 -dont_show -mjpeg_port 8090 -map 2>&1 | tee $MODEL_PATH/train_logs/training_$(date +%H%M).log)\n",
"# Use this line for resuming training\n",
"!(cd $MODEL_PATH && /content/darknet/darknet detector train data/obj.data cfg/yolov4-tiny.cfg backup/yolov4-tiny_last.weights -dont_show -mjpeg_port 8090 -map 2>&1 | tee $MODEL_PATH/train_logs/training_$(date +%H%M).log)"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "ejYVqWQ_drUc"
},
"source": [
"# run to check whether we have any new weight files\n",
"!ls $MODEL_PATH/backup"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "mSKOuaCNlMIO"
},
"source": [
"# Some notes on training\n",
"* We should use k-fold validation on our datasets. To do this, Alexey suggests that we create appropriate *.data files for each variation of train/valid split and run each consecutively. _add reference for this_\n",
"* For the cone dataset, we will need image augmentation to improve our results. Either find how to use Darknet's image augmentation, or do it by hand.\n",
"* For general best practices for machine learning, consult \"Deep Learning with Python\" by Francois Chollet.\n",
"* When resuming training, pick it up from the latest weight file.\n",
"* If we want to increase the frequency with which we save our weights files, we can do so [through modifying the source code of Darknet](https://github.com/pjreddie/darknet/issues/190):\n",
"```\n",
"function train_detector in examples/detector.c:\n",
"(this exists in src/yolo.c in AlexeyAB/Darknet)\n",
"if(i%1000==0 || (i < 1000 && i%100 == 0))\n",
"```"
]
}
]
}