@raza-ul-haq
Created June 24, 2015 07:01
D:\pythonProjects\proPakistani>scrapy crawl pro1
2015-06-24 11:57:50+0500 [scrapy] INFO: Scrapy 0.24.6 started (bot: proPakistani)
2015-06-24 11:57:50+0500 [scrapy] INFO: Optional features available: ssl, http11
2015-06-24 11:57:50+0500 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'proPakistani.spiders', 'SPIDER_MODULES': ['proPakistani.spiders'], 'BOT_NAME': 'proPakistani'}
2015-06-24 11:57:50+0500 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2015-06-24 11:57:50+0500 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2015-06-24 11:57:50+0500 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2015-06-24 11:57:50+0500 [scrapy] INFO: Enabled item pipelines:
2015-06-24 11:57:50+0500 [pro1] INFO: Spider opened
2015-06-24 11:57:50+0500 [pro1] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2015-06-24 11:57:50+0500 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2015-06-24 11:57:50+0500 [scrapy] DEBUG: Web service listening on 127.0.0.1:6080
2015-06-24 11:57:51+0500 [pro1] DEBUG: Crawled (200) <GET http://propakistani.pk/> (referer: None)
2015-06-24 11:57:51+0500 [pro1] ERROR: Spider error processing <GET http://propakistani.pk/>
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\twisted\internet\base.py", line 1203, in mainLoop
    self.runUntilCurrent()
  File "C:\Python27\lib\site-packages\twisted\internet\base.py", line 825, in runUntilCurrent
    call.func(*call.args, **call.kw)
  File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 393, in callback
    self._startRunCallbacks(result)
  File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 501, in _startRunCallbacks
    self._runCallbacks()
--- <exception caught here> ---
  File "C:\Python27\lib\site-packages\twisted\internet\defer.py", line 588, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "C:\Python27\lib\site-packages\scrapy\spider.py", line 56, in parse
    raise NotImplementedError
exceptions.NotImplementedError:
2015-06-24 11:57:51+0500 [pro1] INFO: Closing spider (finished)
2015-06-24 11:57:51+0500 [pro1] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 214,
'downloader/request_count': 1,
'downloader/request_method_count/GET': 1,
'downloader/response_bytes': 11301,
'downloader/response_count': 1,
'downloader/response_status_count/200': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2015, 6, 24, 6, 57, 51, 552000),
'log_count/DEBUG': 3,
'log_count/ERROR': 1,
'log_count/INFO': 7,
'response_received_count': 1,
'scheduler/dequeued': 1,
'scheduler/dequeued/memory': 1,
'scheduler/enqueued': 1,
'scheduler/enqueued/memory': 1,
'spider_exceptions/NotImplementedError': 1,
'start_time': datetime.datetime(2015, 6, 24, 6, 57, 50, 549000)}
2015-06-24 11:57:51+0500 [pro1] INFO: Spider closed (finished)