
@fay59
Last active March 3, 2024 16:49
Download entire iCloud shared albums
#!/bin/bash
# requires jq
# arg 1: iCloud web album URL
# arg 2: folder to download into (optional)
function curl_post_json {
curl -sH "Content-Type: application/json" -X POST -d "@-" "$@"
}
BASE_API_URL="https://p23-sharedstreams.icloud.com/$(echo "$1" | cut -d# -f2)/sharedstreams"
pushd "${2:-.}" > /dev/null
STREAM=$(echo '{"streamCtag":null}' | curl_post_json "$BASE_API_URL/webstream")
CHECKSUMS=$(echo "$STREAM" | jq -r '.photos[] | [(.derivatives[] | {size: .fileSize | tonumber, value: .checksum})] | max_by(.size | tonumber).value')
echo "$STREAM" \
| jq -c "{photoGuids: [.photos[].photoGuid]}" \
| curl_post_json "$BASE_API_URL/webasseturls" \
| jq -r '.items[] | "https://" + .url_location + .url_path' \
| while read -r URL; do
for CHECKSUM in $CHECKSUMS; do
if echo "$URL" | grep -q "$CHECKSUM"; then
curl -sOJ "$URL" &
break
fi
done
done
popd > /dev/null
wait
@vishna

vishna commented Sep 30, 2017

This didn't work for me for an album that had >300 items. I came across another similar script written in Python (https://github.com/VMannello/iCloud-PS-Download/blob/master/iCloudBD.py) that requested batches of 20 items from the /webasseturls endpoint. That seemed to do the trick.

That said, I don't really know how I would implement batching with jq, so I'm just leaving a comment here in case someone faces this issue and wonders what's going on.
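For anyone hitting the same limit: a minimal sketch of how batching could look while staying in bash + jq. It assumes the $STREAM, curl_post_json, and $BASE_API_URL variables from the gist, and the batch size of 20 simply mirrors the Python script:

```shell
# Hypothetical batching sketch: POST photoGuids to /webasseturls in slices of 20.
# Assumes $STREAM, curl_post_json, and $BASE_API_URL are defined as in the gist.
GUIDS=($(echo "$STREAM" | jq -r '.photos[].photoGuid'))
for ((i = 0; i < ${#GUIDS[@]}; i += 20)); do
	BATCH=("${GUIDS[@]:i:20}")
	# Turn the slice back into {"photoGuids": [...]} and POST it.
	printf '%s\n' "${BATCH[@]}" \
	| jq -Rn '{photoGuids: [inputs]}' \
	| curl_post_json "$BASE_API_URL/webasseturls"
done
```

Each response would then go through the same url_location/url_path jq filter as in the original script.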

@DunhamGitHub

When running in Terminal on macOS I get the following errors:
/Users/myname/Downloads/icloud-album-download.sh: line 15: jq: command not found
/Users/myname/Downloads/icloud-album-download.sh: line 18: jq: command not found
/Users/myname/Downloads/icloud-album-download.sh: line 20: jq: command not found

@a3nm

a3nm commented Dec 15, 2018

No longer seems to work. :-/ You probably need to replace "p23" with "p43", but even then the call to "webasseturls" doesn't work.

@WildDIC

WildDIC commented Jan 22, 2019

I made a few changes:

  1. Check the right hostname
  2. Add the checksum to the URL

Works for me. Let's try:

#!/bin/bash

# requires jq
# arg 1: iCloud web album URL
# arg 2: folder to download into (optional)

function curl_post_json {
	curl -sH "Content-Type: application/json" -X POST -d "@-" "$@"
}

BASE_API_URL="https://p23-sharedstreams.icloud.com/$(echo "$1" | cut -d# -f2)/sharedstreams"

pushd "${2:-.}" > /dev/null
STREAM=$(echo '{"streamCtag":null}' | curl_post_json "$BASE_API_URL/webstream")
HOST=$(echo "$STREAM" | jq -r '.["X-Apple-MMe-Host"] // empty')

if [ "$HOST" ]; then
    BASE_API_URL="https://$HOST/$(echo "$1" | cut -d# -f2)/sharedstreams"
    STREAM=$(echo '{"streamCtag":null}' | curl_post_json "$BASE_API_URL/webstream")
fi

CHECKSUMS=$(echo $STREAM | jq -r '.photos[] | [(.derivatives[] | {size: .fileSize | tonumber, value: .checksum})] | max_by(.size | tonumber).value')

echo $STREAM \
| jq -c "{photoGuids: [.photos[].photoGuid]}" \
| curl_post_json "$BASE_API_URL/webasseturls" \
| jq -r '.items | to_entries[] | "https://" + .value.url_location + .value.url_path + "&" + .key' \
| while read -r URL; do
	for CHECKSUM in $CHECKSUMS; do
		if echo "$URL" | grep -q "$CHECKSUM"; then
			curl -sOJ "$URL" &
			break
		fi
	done
done

popd > /dev/null
wait

@jeffuu

jeffuu commented Apr 12, 2019

Where do you put the URL?

@bricepepin

Last version working for me, thanks!

@jeffuu in the terminal, you have to enter:
./icloud-album-download.sh https://www.icloud.com/path/to/album path/to/local/folder

@vchatela

vchatela commented Sep 1, 2019

(quotes @WildDIC's script above)

Thank you very much, working for me!
Cheers

@Jwink3101

This is really cool. Do you know if there is a way to get the comments?

@gwu888

gwu888 commented Jul 12, 2020

@vchatela:

  1. I downloaded this script and ran it, but don't know why it's not working for me:
    https://www.icloud.com/photos/ -- is the URL of my browser logged into icloud photo;
    ./ -- this is the local directory in my MacBook:
    $ ./icloud-album-download.sh https://www.icloud.com/photos/ ./
    $ (return nothing back here, nothing downloaded, no error message)

  2. tried it in shell debug mode: still the same, not even any debugging info showed:
    $ sh +x icloud-album-download.sh https://www.icloud.com/photos/ ./
    $ (again, return nothing back here, nothing downloaded, no error message)

Do you know what was wrong here?

@vchatela

vchatela commented Jul 14, 2020

Hi @gwu888
Did you try the original gist or the one I commented? Only the second was working for me.
I used this script for a friend, but I think you need public sharing for iCloud links. Your session cookie isn't shared, so it can't work with this URL for sure.

@skauss

skauss commented Oct 25, 2020

Some of the errors occur because jq is missing.
You can find jq here:

jq is a lightweight and flexible command-line JSON processor

https://stedolan.github.io/jq/download/
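To turn the cascade of "jq: command not found" errors into one clear message, a small guard at the top of the script could help (a sketch; the brew hint is just a suggestion for macOS users):

```shell
# Fail early with a helpful message if jq is not on the PATH.
if ! command -v jq > /dev/null 2>&1; then
	echo "jq is required; install it first (e.g. 'brew install jq' on macOS)." >&2
	exit 1
fi
```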

@dmschlab

dmschlab commented Dec 6, 2021

An update on @vchatela's version.

I added a few bits, as I needed to download a large iCloud share:

  • Added user screen updates.
  • Added code comments.
  • De-duplicated the large-file checksum list.
  • Moved the large-file checksums to an array so we can do a fast lookup.
  • Check the filesystem for the requested file, so we don't spend time downloading a file only to throw it away on save.
  • Unsilenced the download so we can see the progress.
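The two bash tricks the script below leans on, shown in isolation with made-up values: ${URL##*&} strips everything up to the last '&' (leaving the checksum that the /webasseturls step appended to each URL), and the space-padded [[ ... =~ ... ]] match tests array membership:

```shell
# Extract the checksum appended after the last '&' (values here are made up).
URL="https://example.com/4/5/IMG_0828.JPG?o=p1&v=p2&abc123"
LOCAL_CHECKSUM="${URL##*&}"
echo "$LOCAL_CHECKSUM"   # prints abc123

# Membership test: pad both sides with spaces so "abc1" can't match "abc123".
arrCHKSUM=(deadbeef abc123 cafef00d)
if [[ " ${arrCHKSUM[*]} " =~ " ${LOCAL_CHECKSUM} " ]]; then
	echo "download"      # prints download
fi
```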

#!/bin/bash

# requires jq
# arg 1: iCloud web album URL
# arg 2: folder to download into (optional)

clear

function curl_post_json {
	curl -sH "Content-Type: application/json" -X POST -d "@-" "$@"
}

printf "Getting iCloud Stream\n"
BASE_API_URL="https://p23-sharedstreams.icloud.com/$(echo "$1" | cut -d# -f2)/sharedstreams"

pushd "${2:-.}" > /dev/null
STREAM=$(echo '{"streamCtag":null}' | curl_post_json "$BASE_API_URL/webstream")
HOST=$(echo "$STREAM" | jq -r '.["X-Apple-MMe-Host"] // empty')

if [ "$HOST" ]; then
    BASE_API_URL="https://$HOST/$(echo "$1" | cut -d# -f2)/sharedstreams"
    STREAM=$(echo '{"streamCtag":null}' | curl_post_json "$BASE_API_URL/webstream")
fi

printf "Grabbing Large File Checksums\n"
CHECKSUMS=$(echo $STREAM | jq -r '.photos[] | [(.derivatives[] | {size: .fileSize | tonumber, value: .checksum})] | max_by(.size | tonumber).value')

printf "Adding Checksums to Array\n"
for CHECKSUM in $CHECKSUMS; do
    arrCHKSUM+=($CHECKSUM)
done
printf "Total Downloads: ${#arrCHKSUM[@]}\n"

# Dedup checksum to only include unique ids.
arrCHKSUM=($(printf "%s\n" "${arrCHKSUM[@]}" | sort -u))
printf "Unique Downloads: ${#arrCHKSUM[@]}\n"

printf "Streaming All Assets\n"
echo $STREAM \
| jq -c "{photoGuids: [.photos[].photoGuid]}" \
| curl_post_json "$BASE_API_URL/webasseturls" \
| jq -r '.items | to_entries[] | "https://" + .value.url_location + .value.url_path + "&" + .key' \
| while read -r URL; do

	# Get this URL's checksum value; not all URLs will be downloaded, as both the full-size file AND the thumbnail link appear in the assets stream.
	LOCAL_CHECKSUM="${URL##*&}"

	# If the url's checksum exists in the large checksum array then proceed with the download steps.
	if [[ " ${arrCHKSUM[*]} " =~ " ${LOCAL_CHECKSUM} " ]]; then

			# Get the filename from the URL, first we delimit on the forward slashes grabbing index 6 where the filename starts.
			# then we must delimit again on ? to remove all the URL parameters after the filename.
			# Example: https://www.example.com/4/5/IMG_0828.JPG?o=param1&v=param2&z=param3....
			FILE=$(echo "$URL" | cut -d "/" -f6 | cut -d "?" -f1)

			# Don't download movies
			if [[ "$FILE" == *.mp4* ]]; then
				echo "Skipping Movie"
			else

				# Don't download files that already exist
				if [[ -f "$FILE" ]]; then
					printf "Skipping $FILE\n"
				else
					# Original curl -sOJ $URL -> s = silent : O = download to file : J = Save using uploaded filename -- this also skips files that already exist.
					curl -OJ "$URL"
				fi

			fi

	else
		echo "Skipping Thumbnail"
	fi

done

popd > /dev/null
wait
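As an aside, the cut -d "/" -f6 filename extraction above assumes the name always sits at the sixth slash-delimited field. A pure-bash alternative that only depends on the last '/' and the first '?' (the example URL is made up):

```shell
# Strip the query string, then take everything after the last '/'.
URL="https://www.example.com/4/5/IMG_0828.JPG?o=param1&v=param2"
NO_QUERY="${URL%%\?*}"    # https://www.example.com/4/5/IMG_0828.JPG
FILE="${NO_QUERY##*/}"    # IMG_0828.JPG
echo "$FILE"              # prints IMG_0828.JPG
```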

@editwentyone

@dmschlab thanks a lot, I will use this to sync for MagicMirror

@Uj947nXmRqV2nRaWshKtHzTvckUUpD

How do I download mp4 files as well?

Update: figured it out.
I copied the curl -OJ $URL line inside the mp4 if:

if [[ "$FILE" == *.mp4* ]]; then
	curl -OJ "$URL"
fi

@Uj947nXmRqV2nRaWshKtHzTvckUUpD

I would suggest:
pushd $2 2> /dev/null

to skip showing errors when the path is not set

@Uj947nXmRqV2nRaWshKtHzTvckUUpD

Also, if there are multiple files with the same name, I would suggest renaming them by appending an index (or a download timestamp with seconds) to the name.

@Uj947nXmRqV2nRaWshKtHzTvckUUpD

I made some slight changes to also download mp4 files and to handle files with the same name. It also hides the pushd/popd warnings when no specific path is set:

#!/bin/bash

# requires jq
# arg 1: iCloud web album URL
# arg 2: folder to download into (optional)

clear

function curl_post_json {
	curl -sH "Content-Type: application/json" -X POST -d "@-" "$@"
}

printf "Getting iCloud Stream\n"
BASE_API_URL="https://p23-sharedstreams.icloud.com/$(echo "$1" | cut -d# -f2)/sharedstreams"

pushd "${2:-.}" 2> /dev/null
STREAM=$(echo '{"streamCtag":null}' | curl_post_json "$BASE_API_URL/webstream")
HOST=$(echo "$STREAM" | jq -r '.["X-Apple-MMe-Host"] // empty')

if [ "$HOST" ]; then
    BASE_API_URL="https://$HOST/$(echo "$1" | cut -d# -f2)/sharedstreams"
    STREAM=$(echo '{"streamCtag":null}' | curl_post_json "$BASE_API_URL/webstream")
fi

printf "Grabbing Large File Checksums\n"
CHECKSUMS=$(echo $STREAM | jq -r '.photos[] | [(.derivatives[] | {size: .fileSize | tonumber, value: .checksum})] | max_by(.size | tonumber).value')

printf "Adding Checksums to Array\n"
for CHECKSUM in $CHECKSUMS; do
    arrCHKSUM+=($CHECKSUM)
done
printf "Total Downloads: ${#arrCHKSUM[@]}\n"

# Dedup checksum to only include unique ids.
arrCHKSUM=($(printf "%s\n" "${arrCHKSUM[@]}" | sort -u))
printf "Unique Downloads: ${#arrCHKSUM[@]}\n"

printf "Streaming All Assets\n"
echo $STREAM \
| jq -c "{photoGuids: [.photos[].photoGuid]}" \
| curl_post_json "$BASE_API_URL/webasseturls" \
| jq -r '.items | to_entries[] | "https://" + .value.url_location + .value.url_path + "&" + .key' \
| while read -r URL; do

	# Get this URL's checksum value; not all URLs will be downloaded, as both the full-size file AND the thumbnail link appear in the assets stream.
	LOCAL_CHECKSUM="${URL##*&}"

	# If the url's checksum exists in the large checksum array then proceed with the download steps.
	if [[ " ${arrCHKSUM[*]} " =~ " ${LOCAL_CHECKSUM} " ]]; then

			# Get the filename from the URL, first we delimit on the forward slashes grabbing index 6 where the filename starts.
			# then we must delimit again on ? to remove all the URL parameters after the filename.
			# Example: https://www.example.com/4/5/IMG_0828.JPG?o=param1&v=param2&z=param3....
			FILE=$(echo "$URL" | cut -d "/" -f6 | cut -d "?" -f1)

			# Download movies too
			if [[ "$FILE" == *.mp4* ]]; then
				echo "Downloading movie"
				curl -OJ "$URL"
			else

				# Don't download files that already exist
				if [[ -f "$FILE" ]]; then
					printf "File $FILE already present. Renaming..\n"
					TIMESTAMP=$(date +%s%N)
					curl $URL -o "${TIMESTAMP}_${FILE}"

				else
					# Original curl -sOJ $URL -> s = silent : O = download to file : J = Save using uploaded filename -- this also skips files that already exist.
					curl -OJ "$URL"
				fi

			fi

	else
		echo "Skipping Thumbnail"
	fi

done

popd 2> /dev/null
wait

@robots4life

(quotes @Uj947nXmRqV2nRaWshKtHzTvckUUpD's script above)
Works 100%, thank you!

@txhammer68

Works great, thanks!
Any chance to modify it for Google Photos shared albums?

@Ezema

Ezema commented Oct 7, 2023

Thanks @fusionneur, it's October 2023 now and this is still working.

@furtimx

furtimx commented Feb 8, 2024

(quotes @Uj947nXmRqV2nRaWshKtHzTvckUUpD's script above)

Feb 9, 2024. Still working! Thank you very much, guys!

@Cactuskingz

Hi,
Can anyone explain how exactly to run this script?
I'm not a coder (never learned how to code), so I don't even know where to start. Where do I paste the shared album link in the script? Am I supposed to edit the script before running it in Terminal?

P.S. I downloaded jq, Python, and Git because a tutorial told me to, but I still don't know where to start with getting this script running.

@Uj947nXmRqV2nRaWshKtHzTvckUUpD

(quotes @Cactuskingz's question above)

Hi. You don't need Python; the code is a shell script (it runs on Linux (bare-metal host, virtual machine, container, or WSL) or via Linux-related apps such as Cygwin or Git Bash).

Are you mainly using Windows or Linux?

@Uj947nXmRqV2nRaWshKtHzTvckUUpD

a TL;DR guide

  1. First, you need jq, and you also need to be able to run it from anywhere, by properly installing it or adding the portable version to your environment variables:
    https://ioflood.com/blog/install-jq-command-linux/

  2.1 If you are using Windows, assuming you already have Git downloaded from https://git-scm.com/download/win, you should have Git Bash in the Start menu; run it and browse to the folder where you placed this script using cd, e.g.:

cd /c/Users/<your_user>/Desktop

  2.2 If you are using Linux already, then just run the following in the terminal:

cd ~/Desktop

  3. Then make the script executable:

chmod +x ./icloud-album-download.sh

  4. Then run it from the current directory, passing the URL as an argument as described in the comments at the top of the script:

./icloud-album-download.sh <URL>
# arg 1: iCloud web album URL
# arg 2: folder to download into (optional)
