
@aluramh
Last active March 24, 2018 00:19
Reads results.json and sends a POST request for each row to be saved to the DB.
const axios = require('axios');
const fs = require('fs');

(async function main () {
  try {
    // Read the scraper's data dump
    const scraperOutput = JSON.parse(fs.readFileSync('./input/results.json', 'utf8'));

    // Schedule a POST request for each row
    const promisesArray = scraperOutput.map(row =>
      axios.post('http://localhost:8001/api/scraper', row, {
        headers: { 'Content-Type': 'application/json' }
      })
    );

    // Wait for all of them to finish before ending script execution
    // (Promise.all is the standard equivalent of the deprecated axios.all)
    const responses = await Promise.all(promisesArray);

    // Collect the responses that reported errors...
    const errors = responses
      .filter(i => i.data.error != null)
      .map(i => i.data);

    // ...and write them to a file. The synchronous variant is used here
    // because fs.writeFile without a callback throws on modern Node.
    fs.writeFileSync('./output/errors.json', JSON.stringify(errors), 'utf8');

    console.log('Finished.');
  } catch (e) {
    console.error(e);
  }
})();
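One caveat with the approach above: firing every POST request at once can exhaust sockets or overwhelm the server when results.json is large. A minimal sketch of a batching helper that limits concurrency — the name runInBatches and the batch size are illustrative assumptions, not part of the original gist:

```javascript
// Hypothetical helper: run an array of task functions (each returning a
// promise) in sequential batches of `size`, so at most `size` requests
// are in flight at any time.
async function runInBatches (tasks, size) {
  const results = [];
  for (let i = 0; i < tasks.length; i += size) {
    // Start the next batch and wait for it to settle before continuing
    const batch = tasks.slice(i, i + size).map(fn => fn());
    results.push(...await Promise.all(batch));
  }
  return results;
}

// Usage with the gist's requests (sketch): wrap each POST in a thunk
// so it does not start until its batch is reached.
// const tasks = scraperOutput.map(row => () =>
//   axios.post('http://localhost:8001/api/scraper', row, {
//     headers: { 'Content-Type': 'application/json' }
//   })
// );
// const responses = await runInBatches(tasks, 20);
```

The thunks matter: an axios.post call starts its request immediately, so the tasks must be functions that create the promise only when invoked.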