@rrrodrigo
Created April 10, 2012 11:45
How to download all your Instagram pictures in the highest resolution without using any API
Following the news about Facebook buying Instagram I decided to delete my Instagram account before Facebook claims ownership of my pictures.
Since the Instagram-recommended exporter (linked in their FAQ), http://instaport.me/export, doesn't work for me (probably it can't cope with the high demand),
here is a quick and dirty way to download all my Instagram pictures in their highest resolution in a few easy steps.
You will need: Firefox, Firebug, some text editor, wget
1. Go to http://statigr.am/yourlogin in Firefox with the Firebug extension active
2. Scroll down as many times as needed so that all your picture thumbnails are displayed (I had some 300 pictures, so it was not that much scrolling; YMMV)
3. In the Firebug JS console run this JS code: $(".lienPhotoGrid a img").each(function(index) { console.log($(this).attr('src')) })
4. The JS console will now contain the URLs of all the thumbnail images, like this: http://distilleryimage1.s3.amazonaws.com/4ed46cf2801511e1b9f1123138140926_5.jpg
5. Copy and paste them into your favorite editor
6. Search and replace "_5.jpg" with "_7.jpg"
7. Save the resulting file as 'filelist.txt' or whatever
8. Run in your terminal: wget -i filelist.txt
9. Done! Now you can upload the pictures to another photo sharing service and enjoy them there until it gets bought by Facebook or Google.
Flickr seems a safe bet for now.
10. Don't forget to delete your Instagram account ASAP at https://instagram.com/accounts/remove/request/
Note that you won't get titles, comments, locations, faves, etc. this way, only the pictures (which is all I care about). If you'd rather not do the copy/paste and search-and-replace by hand, see the consolidated snippet below.
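This is a minimal sketch that folds steps 3-6 into a single snippet and saves filelist.txt straight from the browser. It assumes the same .lienPhotoGrid markup and _5/_7 suffix convention still hold, plus a browser recent enough to support Blob URLs and the download attribute:

// Collect every thumbnail URL, swap in the full-resolution suffix,
// and offer the result as a downloadable filelist.txt.
var urls = $(".lienPhotoGrid a img").map(function() {
  return $(this).attr('src').replace('_5.jpg', '_7.jpg');
}).get();
var blob = new Blob([urls.join('\n')], { type: 'text/plain' });
var a = document.createElement('a');
a.href = URL.createObjectURL(blob);
a.download = 'filelist.txt'; // saved to your downloads folder, ready for wget -i
document.body.appendChild(a);
a.click();

Step 8 (wget -i filelist.txt) then works unchanged.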
@tombryan

Worked beautifully using Chrome and its JavaScript Console.

@danyshaanan

That's what I've been looking for!!

Step 6 can be done either in vi with

:%s/5.jpg/7.jpg/

or in the terminal with

sed -i '' 's/5\.jpg/7\.jpg/g' FILE.txt

Or it can be removed altogether if you change the code in step 3 to:

$(".lienPhotoGrid a img").each(function(index) { console.log($(this).attr('src').replace(/5\.jpg/,"7.jpg")) })

Thanks :]

@danyshaanan

Right after running the wget, run:

exiftool "-FileName<FileModifyDate" -d "%Y%m%d_%H%M%S.%%e" *

This will rename the images to their original dates and times. Note that, due to the way exiftool works, this will not work if you copy the directory first; it must be run on the files as originally downloaded by wget.

@esd

esd commented Feb 4, 2013

They seem to have changed their HTML, so here is an updated version (I just removed the 'a'):

$(".lienPhotoGrid img").each(function(index) { console.log($(this).attr('src').replace(/5\.jpg/,"7.jpg")) })

@iamjetlag

How do I do step 8? Please tell me... I'm stuck right there.

@jiridanek

They changed it yet again. In step 8, the first three screens load while scrolling; after that, to keep loading more, you need to repeatedly click the Load more... link. So this is my version of step 8 (in Chromium, but its Developer Console is pretty similar to Firebug).

Right-click the Load more... link, select Inspect element, and note the id, for example:

var link = document.getElementById(".reactRoot[0].[0].[1].[3].{userphotos268071787}.[1].0.[1].0.[0]");

Then use the element's click() method to automatically click the link every 100 milliseconds:

var pes = window.setInterval(function(){link.click()}, 100);

Look at the progress from time to time and check whether there are more photos to load. When everything is loaded, stop the procedure:

window.clearInterval(pes)

Now proceed with step 9.
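As a sketch on top of jiridanek's snippets (assuming the Load more... element is removed from the DOM once everything has loaded), the clicker can be made self-stopping so you don't have to watch it:

// Click Load more... every 100 ms until the link disappears from the DOM.
var pes = window.setInterval(function() {
  if (document.contains(link)) {
    link.click(); // request the next page of photos
  } else {
    window.clearInterval(pes); // link is gone: all photos are loaded
  }
}, 100);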

@jayson0m

They seem to have updated their site again--anyone know a workaround?

@james2doyle

Paste this into your console.

var images = [];
$(".photo-feed .photo .photo-wrapper .Image").each(function(index) {
  var src = $(this).attr('src').replace(/6\.jpg/,"8.jpg");
  images.push('<li><a target="_blank" href="'+src+'">'+src+'</a></li>')
});
document.body.innerHTML = '<ul>'+images.join('')+'</ul>'

This grabs all the large image links and replaces the body of the page with a list. You can then click each link to open the large image in a new tab.
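If you would rather hand the list to wget than click through it, a small variation on the same idea (same assumptions about the markup) dumps the URLs as plain text, one per line, ready to be copied into filelist.txt:

var srcs = $(".photo-feed .photo .photo-wrapper .Image").map(function(index) {
  return $(this).attr('src').replace(/6\.jpg/, "8.jpg");
}).get();
// Replace the page body with a plain-text list for easy copy/paste.
document.body.innerHTML = '<pre>' + srcs.join('\n') + '</pre>';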

@segebee

segebee commented Aug 23, 2014

$(".mediaPhoto div").each(function(index) { console.log($(this).attr('src')) })

// run in console to get a list of image URLs

@pablonen

pablonen commented Oct 1, 2014

I did this today and found that the jQuery snippet esd posted worked for me. They have also changed the image URLs: to get the large pictures you now need to change "_s.jpg" to "_n.jpg".
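Putting the two reports together, esd's selector with the new suffix swapped in would presumably look like this (untested, and assuming the _s/_n naming still applies):

$(".lienPhotoGrid img").each(function(index) { console.log($(this).attr('src').replace('_s.jpg', '_n.jpg')) })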

@Lujaw

Lujaw commented Jan 3, 2015

@rrrodrigo Thanks for the script, and @atte-tamminen thanks for informing me about the change. I was trying "_l.jpg"... Downloading all my IG photos. :)


ghost commented Feb 5, 2015

@Lujaw I tried that script too... and the others. Did they change something again? No script works anymore, though it worked for me a month ago.

@ToniApps

ToniApps commented Jan 5, 2016

And how would you do this on Android?

@tullyhansen

$(".lienPhotoGrid img").each(function(index) { document.write($(this).attr('src').replace("s150x150","s1080x1080").replace(/\?.*/,"").concat("<br />") )})

Works for me (as at 2016-03-31) with iconosquare.com and Chrome console – document.write used because Chrome truncates long URLs logged to console. Similar technique for videos:

$(".popinvideo").each(function(index) { document.write($(this).attr('href').concat("<br />") )})

And (largely for my own reference, when I inevitably end up here in a few months' time having forgotten everything) how to use wget to redownload only new or updated files linked in filelist.txt:

wget -N -i filelist.txt
