Created November 1, 2020 04:16
pixieset.com dumper
/* pixieset.com full size image scraper
 *
 * Rips all images in the highest quality.
 *
 * To use, scroll to the very bottom of the album so every photo has loaded, then press F12, paste the snippet below into the console, and press Enter.
 * The full file list will be copied to your clipboard afterwards. You can use JDownloader to easily download the whole list.
 */
(() => {
  // Grab every image tile in the gallery grid.
  const data = document.querySelectorAll('.masonry-brick img')
  let arr = []
  for (let i = 0; i < data.length; i++) {
    // Swap the medium-size suffix for the full-size one.
    arr.push(data[i].src.replace(/-medium/g, '-xxlarge'))
  }
  console.log(arr)
  // Copy the list to the clipboard, one URL per line, via a temporary textarea.
  const dummy = document.createElement('textarea')
  document.body.appendChild(dummy)
  dummy.value = arr.join('\n')
  dummy.select()
  document.execCommand('copy')
  document.body.removeChild(dummy)
})()
Thank you for this!
before performing wget, replace "xlarge" with "xxlarge" in the list of urls and you will get 1600px photos instead of 1024px photos.
My fork does that automatically: https://gist.github.com/DeflateAwning/8567037cc7125cb3ede76fed40d27ba1
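For reference, a minimal Node.js sketch of that find-and-replace step (not the fork's actual code; urls.txt is a hypothetical filename for the copied list):

const fs = require('fs');

// Bump every -xlarge suffix to -xxlarge; existing -xxlarge URLs are left
// untouched because the leading hyphen keeps the pattern from matching inside them.
const upgraded = fs.readFileSync('urls.txt', 'utf8')
  .split('\n')
  .map(line => line.replace(/-xlarge/g, '-xxlarge'))
  .join('\n');

fs.writeFileSync('urls.txt', upgraded);

Then feed the rewritten file to wget as usual.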
Thank you!
For Firefox, and with xxlarge (~1600x2400), it's simpler:
- Scroll to the bottom of the Pixieset page so all photos are loaded.
- Open the console (F12).
- Paste:
(() => {
  const data = document.querySelectorAll('img');
  let arr = [];
  data.forEach(img => {
    // Fall back to data-src for images that are still lazy-loaded.
    let url = img.src || img.getAttribute('data-src');
    if (url && url.includes('images.pixieset.com')) {
      arr.push(url.replace(/-medium|-large|-xlarge/g, '-xxlarge'));
    }
  });
  if (arr.length > 0) {
    console.log(arr.join('\n'));
  } else {
    console.warn('No matching images found.');
  }
})();
- The URLs should be printed out now. Right-click that list and choose 'Copy Object'.
- Paste into a new text file and save it as url_list_pasted.txt.
- Perform the final download with @DeflateAwning's wget command:
wget -i "url_list_pasted.txt" -P "output_folder/"
- All the images should download serially into output_folder/ (couldn't figure out parallelism with wget; see the sketch below).
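If you do want parallel downloads, here is a hedged Node.js sketch (my own assumption, not something from this thread): it needs Node 18+ for the global fetch, reuses the hypothetical url_list_pasted.txt and output_folder/ names from the steps above, and runs a small pool of 4 concurrent workers.

const fs = require('fs/promises');
const path = require('path');

const CONCURRENCY = 4; // how many downloads are in flight at once

async function main() {
  const urls = (await fs.readFile('url_list_pasted.txt', 'utf8'))
    .split('\n').map(s => s.trim()).filter(Boolean);
  await fs.mkdir('output_folder', { recursive: true });

  // Shared queue: each worker keeps pulling the next URL until none are left.
  const queue = urls.slice();
  const worker = async () => {
    for (let url; (url = queue.shift()) !== undefined; ) {
      const res = await fetch(url);
      if (!res.ok) { console.warn('Failed', res.status, url); continue; }
      const name = path.basename(new URL(url).pathname);
      await fs.writeFile(path.join('output_folder', name), Buffer.from(await res.arrayBuffer()));
      console.log('Saved', name);
    }
  };
  await Promise.all(Array.from({ length: CONCURRENCY }, worker));
}

main().catch(console.error);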