Download all your photobucket images in bulk via CLI

Backstory

On Jul 4, 2017, theverge.com posted an article about Photobucket (silently) no longer allowing their users to source their images on 3rd party websites for free, thus leaving websites all over the web broken and displaying the following image in its place:

Being one of those individuals, I went into my Photobucket account to download my content, since I now have my own hosting where I can store those images; however, the only way to bulk download (on desktop) is by downloading albums through their interface. Doing so gave me the following error message: "Hmmm. Something didn't click. Want to give it another shot? Try again now."

I tried this several times, in different browsers (Chrome, Firefox, and Safari), after disabling all my add-ons and extensions (including ad blockers), and it still didn't work.

At this point, doing anything on their website would flood my screen with popups and 3+ ads. I swear I haven't seen anything like it since AOL. But I took screenshots of the errors, did my due diligence, and went to create a ticket for their support team to investigate. Only one problem: their ticket creation system's image uploader doesn't work. Interesting, considering their whole business model is designed around you being able to upload images to their interface and serve them on the web...

At this point, I was under the impression that everything was broken and the site was just run down, waiting to get fixed. But after talking to their support, they pretty much shrugged their shoulders and said "if you don't have the Android app, that's the only way to download them" - HENCE why I am writing this how-to on getting all your images!

Requirements

  • Ability to run bash commands
  • Terminal application to run bash on
  • Text editor
  • Patience to collect all the links via the ad-cluttered Photobucket website

Steps to download your files in bulk

Fun fact: Unlike Photobucket's interface, this also works with your video files and GIFs.

  1. Navigate to the album or folder you want to download
  2. Check one of the boxes of the images you want to download
  3. A select all option will pop up in the buttons above; click that button - this will select all the images in that album, not just the images in your current page view.
  4. At the bottom of your screen, a menu bar will show up with your selections. WAIT for the number count of selected items to update before moving on to the next step!
  5. Navigate to the next album or folder you want to download and select those images as well. The selection you made from the previous album(s) will continue to accumulate as you continue to move from album to album.
  6. Once you have collected all the items you want to download, click the link option in the menu displayed at the bottom of the screen
  7. Photobucket will populate a box with all the direct links to the content you've selected; once you click on this, it will copy them to your computer's clipboard. NOTE: I had to do this in Firefox, because the button to copy the links didn't work in Chrome (it might have been a popup blocker or something of that nature)
  8. Create a folder on your desktop titled photobucket
  9. Open a text editor you have on your computer (I use Sublime Text, which is available for Mac and PC), paste your links into a new blank document, and save it in your photobucket folder on your desktop as a .txt file (i.e. photobucket_files.txt)
  10. Open up a terminal based application and run the following commands:
cd ~/Desktop/photobucket
cut -d\/ -f 7 photobucket_files.txt | grep "\." | while read file; do grep "${file}$" photobucket_files.txt; done | while read file; do curl -O --referer "http://s.photobucket.com/" ${file}; done
cut -d\/ -f 7 photobucket_files.txt | grep -v "\." |  sort -u | while read dir; do mkdir ${dir}; cd ${dir}; grep "/${dir}/" ../photobucket_files.txt | while read file; do curl -O --referer "http://s.photobucket.com/" ${file}; done; cd -; done

What this does is trick Photobucket into thinking the request for the image file is coming from Photobucket itself by defining --referer "http://s.photobucket.com/", allowing you to download the file without dealing with the redirect to their upgrade/update image.
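
If it helps to see what the one-liners are doing, here is the second command expanded with comments (the same logic as above, just spread over multiple lines; it assumes photobucket_files.txt is in the current directory and that the album name is the 7th /-separated field of each URL, as in the original):

cut -d\/ -f 7 photobucket_files.txt | grep -v "\." | sort -u | while read dir; do
  mkdir ${dir}                 # create a local folder for each album name found in the URLs
  cd ${dir}
  grep "/${dir}/" ../photobucket_files.txt | while read file; do
    # claim the request comes from photobucket itself so we get the real file, not the redirect image
    curl -O --referer "http://s.photobucket.com/" ${file}
  done
  cd -                         # return to the photobucket folder before the next album
done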

According to the time stamps of my commands, it took roughly 8 minutes for all of the content to download (~347M worth of content, over 1300 files).

NOTE: These scripts will create the albums for you on your personal computer; however, they will only create the albums one level deep. Still, I feel this is more than enough to get away from them if you need to download your content and don't want to pay them.

These one-liners aren't the cleanest code, but I have no interest in cleaning them up or improving them, as they completed the job they were intended for.

Hope this helps you as it did me.

jorgimello commented Nov 1, 2017

Dude, thank you so much for your help.
The commands you explained worked like a charm. I don't have many pictures in my photobucket album but I wanted to save them somewhere else.
Great job man

[screenshot from 2017-11-01 17-07-05]

Maranello2017 commented Nov 7, 2017

Nice approach, but have they now stopped the ability to "Direct Link"? You can use the IMG codes and just strip the [IMG] and [/IMG].

philipjewell commented Nov 10, 2017

@Maranello2017 That is exactly the reason for this posting... To allow people to download their content because direct linking is disabled.

philipjewell commented Nov 10, 2017

Glad to help! @jorgimello

None1230 commented Nov 15, 2017

Thanks for your guide. However, for Windows users this might not be so easy.
So here's what I did...
I downloaded the Photobucket mobile (Android) app, and it had the option to save an album to the device.
And it really worked well... it finished downloading all the files & put them into an album called Photobucket_Album1.

Those who do not have an Android device or want to do it on their PC can:
  1. Get the "Photobucket Hotlink Fix" Chrome extension,
  2. Export all your picture HTML links and save them to a text file.... photos.htm
  3. Open the file in Chrome and press CTRL+S (or Save page) ...
  4. Select the option Webpage COMPLETE
Hope it helps others trying to download their memories from these greedy website owners.

gatorman22 commented Nov 20, 2017

As Maranello2017 noted above, you'll need to strip the IMG tags before processing. On a BASH terminal I used:

sed -i 's/\[IMG]//g; s/\[\/IMG]//g' photobucket_files.txt
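
For example, a copied line like the following (hypothetical URL, just for illustration):

[IMG]http://i38.photobucket.com/albums/e100/yourname/photo1.jpg[/IMG]

becomes:

http://i38.photobucket.com/albums/e100/yourname/photo1.jpg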

Have fun backing up your images!

ArcAIN6 commented Dec 18, 2017

This no longer works; even the IMG tags now use the same URL as everywhere else, so there is no direct way to download them. Attempting it either trashes the image or replaces it with the stock "you have to pay for 3rd party" image.

naahthx commented Jan 2, 2018

I followed your instructions step by step and when I run the code in the terminal based application for Mac it says "no such file or directory". What am I doing wrong? I can't figure out any other way to mass export my albums from photobucket to my desktop.

nr1q commented Jan 3, 2018

I made it a little bit simpler, on a per-album basis (because I only have 5 albums with 2.5K photos each); I use fish shell, but for bash you only have to edit the while command:

 cat album_photo_links.txt | while read line; wget --wait=1 --referer 'http://s.photobucket.com/' "$line"; end
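
For reference, a bash version of that fish loop would look like this (a sketch using the same flags):

 cat album_photo_links.txt | while read line; do wget --wait=1 --referer 'http://s.photobucket.com/' "$line"; done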

Update: Actually this is a better command:

 cat album_photo_links.txt | xargs wget --referer 'http://s.photobucket.com/' --wait=1 --random-wait --input-file=-

Jakeukalane commented Jan 10, 2018

Worked like a charm. Just used the first line as I didn't want subfolders. No need to use the [IMG] version, just what he said.

244 images downloaded in little time. Thank you so much.

PaulNerd commented Jan 15, 2018

Or, just copy the direct link text and give it to JDownloader2's linkgrabber.
Works fine, no console involved.

giannisergio commented Jan 24, 2018

I tried both philipjewell's and nr1q's commands and I get the "bwe" 3rd party hosting image for almost all of my images but a few.
Same thing with the JDownloader2's linkgrabber suggested by PaulNerd.
Do you guys have any updated method?
Thanks!

jamesoncollins commented Jan 28, 2018

If you are on a Windows box and you are using Cygwin, don't forget to get rid of the carriage returns in your file using something like dos2unix.
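
For example (assuming dos2unix is installed; the tr line is an alternative that writes to a new, arbitrarily named file):

dos2unix photobucket_files.txt                                    # strips the carriage returns in place
tr -d '\r' < photobucket_files.txt > photobucket_files_clean.txt  # alternative without dos2unix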

dubistdu commented Feb 1, 2018

Thanks for this post. I was pretty bummed that they got rid of the download album option. Not to mention their awful new (I haven't been to the site for over 3 years) UI. I don't know what's going on with Photobucket. Seems like they are losing their mind.
I used @nr1q solution.
Worked great!

srikanthdoss commented Feb 5, 2018

Works great! Thank you so much for this script.

bash64 commented Feb 14, 2018

This worked for me and did not make mistakes:

cat photobucket.txt | xargs -I url basename url | while read file; do grep "${file}$" photobucket.txt; done | while read file; do curl -O --referer "http://s.photobucket.com/" ${file}; echo ${file}; done

raaomoka commented Feb 26, 2018

This doesn't work directly in the Windows 10 command prompt; you probably need to enable the bash shell on your Windows 10 PC:

https://www.howtogeek.com/249966/how-to-install-and-use-the-linux-bash-shell-on-windows-10/

Alternatively, you could just use the Android app on your phone to download each of your albums, like I did, then transfer them via Wi-Fi Direct or USB to your Windows PC. You may run into the issue of their half-baked app repeatedly failing to download big albums; what you can do in this case, like I did, is as follows:

  1. Download the album using the Photobucket app and watch it fail, keep note of the album's name

  2. Open your phone's file manager, navigate to Internal SD Card > Pictures > Photobucket, and rename the corresponding album folder to Photobucket-AlbumName2

  3. Repeat steps 1 and 2 about 4 times, incrementing the number at the end of the corresponding album folder each time

  4. Now set your file manager to sort the album photos by size in ascending order. The ones that randomly failed to download will appear at the top of the list; delete these ones from each of the Photobucket-AlbumName up to Photobucket-AlbumName4 folders

  5. Now move and merge all the photos from the Photobucket-AlbumName2 to 4 folders into the Photobucket-AlbumName folder.

  6. Voilà, you should now have all your full-res photos saved in the Photobucket-AlbumName folder 😃 Double-check by looking at the number of files your file manager says is in the folder; if it doesn't match the number of pictures the Photobucket app states is in the album, you'll need to repeat steps 2 to 5 above until the numbers match. Otherwise, you'll be missing pictures.

NikkiDelRosso commented Mar 6, 2018

Why not use the -i or --input-file argument in wget?

Read URLs from a local or external file. If ‘-’ is specified as file, URLs are read from the standard input.

You should be able to download all of the files by running wget --referer 'http://s.photobucket.com/' -i photobucket_files.txt

Additionally, you can prevent accidental download of any of the "3rd party hosting disabled" image by adding --max-redirect=0, and if you have to run this multiple times (I did, because some of them were randomly 404ing), you can add -nc to have it ignore files that are already in the directory. Here's the full command that I used:

wget --referer 'https://s.photobucket.com/' --wait=1 --random-wait --max-redirect=0 -nc -i photobucket.txt

NefariozBG commented Apr 17, 2018

For anyone who finds this, I found that the original script kept giving me an error when I tried to download my files,
curl: (3) Illegal characters found in URL

Turns out, this can be easily fixed if you change one thing in the commands. The last command should be:
cut -d\/ -f 7 photobucket_files.txt | grep -v "\." | sort -u | while read dir; do mkdir ${dir}; cd ${dir}; grep "/${dir}/" ../photobucket_files.txt | while read file; do curl -O --referer "http://s.photobucket.com/" ${file%$'\r'}; done; cd -; done

Notice the %$'\r' that was added inside the last ${file} expansion.
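
As a quick illustration of what that expansion does (a throwaway example with a made-up URL, not part of the download commands):

file=$'http://i38.photobucket.com/example.jpg\r'   # a URL with a trailing carriage return
echo "${file%$'\r'}"                               # prints the URL with the carriage return stripped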

jhuseby82 commented Apr 20, 2018

Hoping someone can help. I'm on Windows 10, I've installed Ubuntu, and I'm running bash.exe. I've tried copying/pasting nr1q's command (after navigating to my desktop/photobucket folder where the file photobucket_files.txt is).
cat photobucket_files.txt | xargs wget --referer 'http://s.photobucket.com/' --wait=1 --random-wait --input-file=-

But I'm getting a "404 not found" error, and I notice that the URLs that pop up have %0D appended to the end of each one, so I'm assuming that's why it can't find the URL. I've tried copying/pasting directly from here into Notepad++ (instead of Notepad in Windows), but it's still not working. I've tried manually typing in the above code in bash, but get the same results (%0D appended to the end of the URLs).

I've tried the original code from philipjewell and the edit from NefariozBG but get an error: curl: (3) Illegal characters found in URL

Any help would be greatly appreciated (I have about 2000 images in Photobucket, haven't uploaded to the site in probably 5-6 years due to not being able to easily retrieve them, but man would it be nice to bulk pull these images out and move to another service).

BCE75 commented May 13, 2018

I just got this to work! It kept failing on me but I saw that my library was set as private. I had to make my library public long enough for this to run. I'm going to delete my photobucket account when this is done. Thank you for this!

estevesds12 commented May 26, 2018

I'm on a Windows 10 PC and it only worked the way None1230 explained it. But is there a way I can download the photos AND the tags?

JamesHagerman commented Jun 3, 2018

This method worked on 2018-06-03.

tromlet commented Jun 11, 2018

It also worked on 11 June 2018. Thank you so, so much, this was super, super helpful.

kenme243 commented Jun 20, 2018

Works wonders on 20 June 2018, thank you very much. Anyone having problems with the file directory should check whether the text file is actually saved as .txt; you can search for how to save a .txt file on Mac and follow the instructions, pretty easy :)

BTW, any suggestions for the best free photo hosting right now?

kredzsays commented Aug 14, 2018

@NikkiDelRosso Thanks so much for posting your script variation. Worked like a charm for me! I was getting the "curl: (3) Illegal characters found in URL" error using the OP's code.

I'm running Ubuntu shell (1604.2018.510.0) on a Windows 10 machine, for those in a similar pickle.

samanthaimx commented Aug 18, 2018

I am having so much trouble with this. I've tried all the commands above but I'm sure I'm doing something wrong. Would anyone mind assisting me?

Rocketman69 commented Oct 4, 2018

Well... I've played hell all morning trying to get this to work. I finally got it to at least do something after adding the %$'\r' after the last {file} per @NefariozBG (although I did this to the first command, as I kept getting the "curl: (3)…" error). Now I keep getting an error on each file that basically says: curl: (7) Failed to connect to i38.photobucket.com port 80: Connection refused

Any ideas what I'm doing wrong? Bear with me, I'm new to Linux bash stuff.

CaptainBadass commented Oct 11, 2018

Used philipjewell's original commands on a Mac, stock Terminal. Worked perfectly!

Protobikes commented Oct 12, 2018

I am not seeing the toolbar with numbers in steps 4 and 5. Is it still there for others? Using Firefox; I just installed it for this. I guess what I am trying to do is get a list of links. Maybe I can copy one and increment a sequence? Trying to make sure I have copies of all my pics off Photobucket so I can close the account. I don't have an Android phone.

Protobikes commented Oct 12, 2018

Should this run in the command prompt? It does not recognise cut as a command.

mynameiskaren commented Oct 23, 2018

What worked for me using Windows 10:

  1. Enabled Bash shell using raaomoka's methods.
  2. Used a combination of everyone else's comments:
cd /mnt/c/users/karen/desktop/photobucket

cut -d\/ -f 7 photobucket_files.txt | grep "\." | while read file; do grep "${file}$" photobucket_files.txt; done | while read file; do curl -O --referer "http://s.photobucket.com/" ${file}; done

wget --referer 'https://s.photobucket.com/' --wait=1 --random-wait --max-redirect=0 -nc -i photobucket_files.txt

Finally got it working after troubleshooting. Worked as of 10.23.18

Thanks everyone!

Dennist03 commented Nov 1, 2018

Worked on macOS Mojave on 10.28.18 with the original commands.
