@antiops
Created November 1, 2020 04:16
pixieset.com dumper
/* pixieset.com full-size image scraper
 *
 * Rips all images in the highest available quality.
 *
 * To use: scroll to the very bottom of the album so every photo has loaded, press F12,
 * then paste the snippet below into the console and press Enter. The full file list is
 * copied to your clipboard afterwards. You can use JDownloader to easily download it.
 */
(() => {
  // Collect every image in the masonry grid and request the largest size variant
  const data = document.querySelectorAll('.masonry-brick img')
  const arr = []
  for (let i = 0; i < data.length; i++) {
    arr.push(data[i].src.replace(/-medium/g, '-xxlarge'))
  }
  console.log(arr)
  // Copy the newline-separated URL list to the clipboard via a temporary textarea
  const dummy = document.createElement('textarea')
  document.body.appendChild(dummy)
  dummy.value = arr.join('\n')
  dummy.select()
  document.execCommand('copy')
  document.body.removeChild(dummy)
})()
@DeflateAwning

Perform the final download with:

wget -i "url_list_pasted.txt" -P "output_folder/"

@CyberAstronaut101

Thank you for this!

@smejky

smejky commented Sep 28, 2022

Before running wget, replace "xlarge" with "xxlarge" in the list of URLs and you will get 1600px photos instead of 1024px photos; a one-liner for this is sketched below.
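A minimal sketch of that replacement, assuming GNU sed and the url_list_pasted.txt filename from the earlier comment (on BSD/macOS sed, use -i '' instead of -i):

# rewrite every -xlarge suffix to -xxlarge in place
sed -i 's/-xlarge/-xxlarge/g' url_list_pasted.txt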

@DeflateAwning

> Before running wget, replace "xlarge" with "xxlarge" in the list of URLs and you will get 1600px photos instead of 1024px photos.

My fork does that automatically: https://gist.github.com/DeflateAwning/8567037cc7125cb3ede76fed40d27ba1

@brozikcz

Thank you!

@m1ndy

m1ndy commented Nov 18, 2024

For Firefox, and with xxlarge (~1600x2400), it's simpler:

  1. Scroll to the bottom of the Pixieset page so all photos are loaded.
  2. Open the Console (F12).
  3. Paste:
(() => {
  const data = document.querySelectorAll('img');
  let arr = [];

  data.forEach(img => {
    let url = img.src || img.getAttribute('data-src');

    if (url && url.includes('images.pixieset.com')) {
      arr.push(url.replace(/-medium|-large|-xlarge/g, '-xxlarge'));
    }
  });

  if (arr.length > 0) {
    console.log(arr.join('\n'));
  } else {
    console.warn('No matching images found.');
  }
})();
  4. The URLs should be printed out now. Right-click on that list and choose 'Copy Object'.
  5. Paste into a new text file and save it as url_list_pasted.txt.
  6. Follow @DeflateAwning's wget -i "url_list_pasted.txt" -P "output_folder/".
  7. All the images should download serially into the folder (couldn't figure out parallelism, but see the sketch after this list).
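If you do want parallel downloads, a minimal sketch assuming GNU xargs and the same url_list_pasted.txt:

# run up to 8 wget processes at once, one URL each (tune -P to taste)
xargs -n 1 -P 8 wget -P "output_folder/" < url_list_pasted.txt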
