Save images from chrome inspector/dev tools network tab
/* 1. open up chrome dev tools (Menu > More tools > Developer tools)
 * 2. go to the network tab and refresh the page
 * 3. wait for images to load (on some sites you may have to scroll down to the images for them to start loading)
 * 4. right click/ctrl click on any entry in the network log, select Copy > Copy All as HAR
 * 5. open up the JS console and enter: var har = [paste]
 *    (pasting could take a while if there are a lot of requests)
 * 6. paste the following JS code into the console
 * 7. copy the output, paste into a text file
 * 8. open up a terminal in same directory as text file, then: wget -i [that file]
 */
var imageUrls = [];
har.log.entries.forEach(function (entry) {
  // This filters out all URLs except images. If you only want e.g. JPEGs,
  // check mimeType against "image/jpeg" instead.
  if (entry.response.content.mimeType.indexOf("image/") !== 0) return;
  imageUrls.push(entry.request.url);
});
console.log(imageUrls.join('\n'));
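If you need a narrower filter than "any image" (e.g. only full-size JPEGs, skipping icons and thumbnails), the same HAR data can also be filtered offline. A minimal Python sketch of that idea; the `source_har.har` filename and the ~10 KB size threshold are illustrative assumptions, not part of the original instructions:

```python
import json

def jpeg_urls(har_path, min_bytes=10_000):
    """Return URLs of JPEG responses in a saved HAR file larger than min_bytes."""
    with open(har_path) as f:
        har = json.load(f)
    urls = []
    for entry in har['log']['entries']:
        content = entry['response']['content']
        # HAR stores the response MIME type and body size under response.content
        if content.get('mimeType') == 'image/jpeg' and content.get('size', 0) >= min_bytes:
            urls.append(entry['request']['url'])
    return urls
```

Save the HAR via Copy > Copy All as HAR into a file first, then adjust `min_bytes` until small decorative images drop out.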
@tucq88

tucq88 commented Jul 30, 2015

Thanks mate. Very helpful. If you could provide a filter to get only the correct images, it would be even better (y)

@felipebhz

felipebhz commented Aug 25, 2016

Thank you very much! Saved me a lot of manual work to save around 150+ images from a website.

@Pustur

Pustur commented Dec 26, 2017

Thanks! saved me some time :)

@Norod

Norod commented Sep 17, 2018

Thank you very much for this

@nitinsunny

nitinsunny commented Oct 10, 2019

can you explain in detail?

@tobek
Author

tobek commented Oct 10, 2019

can you explain in detail?

I updated the instructions somewhat, but basic familiarity with Chrome dev tools and the command line is needed; otherwise, searching for a Chrome extension that does it for you might be your best bet.

@wangyung

wangyung commented Apr 13, 2020

thanks! this gist is very useful

@rgluis

rgluis commented May 4, 2020

thanks!!

@protrolium

protrolium commented Aug 7, 2020

Fantastic demo. It does get sluggish for me, and the inspector tended to freeze depending on the number of images.

@VianaArthur

VianaArthur commented Aug 27, 2020

Thank you.

@cutero

cutero commented Sep 14, 2020

Thank you!!! OMG :)

@umop3plsdn

umop3plsdn commented Oct 21, 2020

lmfaoooo this is super clever haha

@puziyi

puziyi commented Jan 7, 2021

Thank you so much! I also found that it can take some time to paste, so I wrote a Python script to get the image URLs offline. See below.

import json
from haralyzer import HarParser

# Download the .har file from Developer tools (roughly the same steps as above),
# then parse it offline. Even with many image files to download, there's no
# long wait for the paste to finish.
with open('source_har.har', 'r') as f:
    har_parser = HarParser(json.loads(f.read()))

image_urls = []
for entry in har_parser.har_data["entries"]:
    if entry["response"]["content"]["mimeType"].startswith("image/"):
        image_urls.append(entry["request"]["url"])

# Save the URL list to a text file directly.
with open('target_link.txt', 'w') as f:
    for link in image_urls:
        f.write("%s\n" % link)

@aykun1907

aykun1907 commented Jul 3, 2021

@puziyi thanks for the python script!

@robinagata

robinagata commented Sep 6, 2021

hi, I just created an account to respond to this thread. I followed the steps up until "7. copy the output, paste into a text file": when I attempt to copy the output or right click on it, DevTools freezes, and if I stay off the tab for too long, DevTools goes blank unless I refresh. I'm also having trouble installing Python, so I can't use the offline downloader script provided by @puziyi. How do I get around the first issue?

@robinagata

robinagata commented Sep 6, 2021

I figured out a workaround a while ago by using Mozilla Firefox and following the steps from there. Now, my issue is at "8. open up a terminal in same directory as text file, then: wget -i [that file]" because when I input "wget -i [the file path]", Windows Terminal at first needed me to "Supply values for the following parameters: Uri:" and typing the target website comes back with an error. Should I go somewhere else because my problem deviates from the original topic?

@tobek
Author

tobek commented Sep 8, 2021

when I input "wget -i [the file path]", Windows Terminal at first needed me to "Supply values for the following parameters: Uri:" and typing the target website comes back with an error

The instructions I wrote are for Linux. I didn't think Windows even had wget, but it sounds like it does with a different interface (in PowerShell, wget is actually an alias for Invoke-WebRequest, which is why it prompts for a Uri and doesn't understand -i). Look up how to download files from a text file of URLs on Windows.
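For anyone without a working wget, here is a rough cross-platform stand-in for `wget -i` using only Python's standard library. The `urls.txt` filename is an assumption; point it at whichever file you pasted the console output into:

```python
import os
import urllib.request
from urllib.parse import urlparse

def filename_for(url):
    # Derive a local filename from the URL path; fall back to 'index'
    # when the path has no final component (e.g. "https://example.com/").
    name = os.path.basename(urlparse(url).path)
    return name or 'index'

def download_all(list_path):
    # Download every URL listed (one per line) in list_path into the
    # current directory, mimicking `wget -i list_path`.
    with open(list_path) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            urllib.request.urlretrieve(url, filename_for(url))
            print('saved', url)
        except OSError as e:
            print('failed', url, e)

# Usage: download_all('urls.txt')
```

Note this overwrites files that share a basename; good enough for a one-off grab, not a general downloader.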

@climardo

climardo commented Mar 30, 2022

Cool. Thanks!

@AbdullahJames

AbdullahJames commented May 27, 2022

I don't understand what you mean by it taking a long time to paste, because when I paste it's instant, and then I get the message "undefined".

@puziyi

puziyi commented May 27, 2022

I don't understand what you mean by it taking a long time to paste, because when I paste it's instant, and then I get the message "undefined".

It occurs in situations where one needs to download a large number of images, so the HAR is big. (The "undefined" message is just the console echoing the result of the var assignment; it's expected.)
