
@WA9ACE
Last active April 27, 2016 07:16
Node.js Web Crawler using Request and Cheerio
var request = require('request');
var cheerio = require('cheerio');
var fs = require('fs');

var data = fs.createWriteStream('data.txt', {'flags': 'a'});
var urlsToCrawl = [];

var spider = function(url) {
  // Remove the current url we're crawling from the list to be crawled.
  var index = urlsToCrawl.indexOf(url);
  if (index > -1) {
    urlsToCrawl.splice(index, 1);
  }
  try {
    request(url, function(error, response, body) {
      if (!error && response.statusCode == 200) {
        var $ = cheerio.load(body);
        data.write($.html());
        console.log('Data saved for url: ' + url);
        $('a').each(function(i, element) {
          var link = element.attribs.href;
          urlsToCrawl.push(link);
        });
        return spider(urlsToCrawl[0]);
      } else {
        // This was probably a relative url or a page anchor,
        // which I don't account for yet.
        return spider(urlsToCrawl[0]);
      }
    });
  } catch(error) {
    return spider(urlsToCrawl[0]);
  }
};

spider('https://news.ycombinator.com/');
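The `$('a').each` loop queues every `href` verbatim, so relative paths, page anchors, and `mailto:` links all end up in `urlsToCrawl` and fail once requested (the comment in the else branch hints at exactly this). One way to handle it is to normalize each link against the page it came from before queueing it. A minimal sketch using Node's built-in WHATWG `URL` class; `normalizeLink` is a hypothetical helper, not part of the gist:

```javascript
// Hypothetical helper: resolve a raw href against the page it came from,
// dropping fragment-only anchors and non-http(s) schemes so that only
// absolute, requestable URLs get queued.
function normalizeLink(link, baseUrl) {
  if (!link || link.charAt(0) === '#') return null; // page anchor
  try {
    var resolved = new URL(link, baseUrl); // resolves relative paths
    if (resolved.protocol !== 'http:' && resolved.protocol !== 'https:') {
      return null; // mailto:, javascript:, etc.
    }
    resolved.hash = ''; // strip fragments so duplicates compare equal
    return resolved.href;
  } catch (e) {
    return null; // malformed href
  }
}

console.log(normalizeLink('item?id=1', 'https://news.ycombinator.com/'));
// 'https://news.ycombinator.com/item?id=1'
console.log(normalizeLink('#comments', 'https://news.ycombinator.com/'));
// null
```

In the crawler, the `each` callback would then push `normalizeLink(element.attribs.href, url)` only when it is non-null.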
WA9ACE commented Jan 19, 2014

Crashes with this output in the console:


Data saved for url: http://www.avc.com/a_vc/2014/01/investing-in-startups-in-europe.html
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Request.EventEmitter.addListener (events.js:160:15)
    at Request.init (c:\Users\Ace\Projects\crawler\node_modules\request\request.js:364:8)
    at Request.onResponse (c:\Users\Ace\Projects\crawler\node_modules\request\request.js:809:10)
    at ClientRequest.g (events.js:180:16)
    at ClientRequest.EventEmitter.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:525:27)
(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Request.EventEmitter.addListener (events.js:160:15)
    at Request.start (c:\Users\Ace\Projects\crawler\node_modules\request\request.js:610:8)
    at Request.end (c:\Users\Ace\Projects\crawler\node_modules\request\request.js:1226:28)
    at c:\Users\Ace\Projects\crawler\node_modules\request\request.js:412:12
    at process._tickCallback (node.js:415:13)
Data saved for url: http://www.asciiflow.com/
Data saved for url: https://lh5.googleusercontent.com/-dJsRfi7_Crw/Utl_miUi3II/AAAAAAAA8jM/2ODyIK015WI/s450-no/How+radians+work.gif
Data saved for url: http://www.slate.com/articles/technology/technology/2014/01/do_what_you_love_love_what_you_do_an_omnipresent_mantra_that_s_bad_for_work.html
Data saved for url: http://raganwald.com/2014/01/19/prototypes-are-not-classes.html
Data saved for url: http://en.wikipedia.org/wiki/Peelian_Principles
Data saved for url: https://bugzilla.redhat.com/show_bug.cgi?id=1054340
Data saved for url: http://paulgraham.com/cities.html
Data saved for url: http://scott.a16z.com/2014/01/17/success-at-work-failure-at-home/
Data saved for url: http://www.nbcnews.com/video/meet-the-press/54117741#54117741
Data saved for url: http://hoplon.io/
Data saved for url: http://homakov.blogspot.com/2014/01/cookie-bomb-or-lets-break-internet.html
Data saved for url: http://maxtaco.github.io/bitcoin/2014/01/16/how-jason-bourne-stores-his-bitcoin/

RangeError: Maximum call stack size exceeded
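The `RangeError` likely comes from the `catch` branch: `request()` throws synchronously on an invalid URI (such as a relative href), so `spider` re-enters itself on the same stack, and a long run of queued bad links never lets the stack unwind. A standalone illustration of that failure mode, and of how deferring each step with `setImmediate` avoids it (this is a generic demo, not the crawler itself):

```javascript
// Synchronous re-entry grows the call stack until V8 throws RangeError.
function syncSpin(n) {
  if (n <= 0) return 0;
  return 1 + syncSpin(n - 1); // non-tail call: every frame stays live
}

var overflowed = false;
try {
  syncSpin(10000000); // far deeper than the default stack allows
} catch (e) {
  overflowed = e instanceof RangeError;
}
console.log(overflowed); // true

// setImmediate schedules the next step on a fresh stack, so the same
// depth completes without the call stack ever growing.
function asyncSpin(n, done) {
  if (n <= 0) return done();
  setImmediate(function () {
    asyncSpin(n - 1, done);
  });
}
asyncSpin(20000, function () {
  console.log('completed without overflow');
});
```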

@GeneGenie

Hello. Your issue is that you make consecutive calls in your recursion, so at some point you will always hit this exception, because the call stack never unwinds.

Use an event emitter instead of recursion:

function spider(url) {
  // ...
  // finished parsing the page
  eventEmitter.emit('page.finished', result);
  // ...
}

eventEmitter.on('page.finished', function(result) {
  // handle result
  spider(nextUrl);
});

spider(firstUrl);
