barraponto/- (secret gist by @barraponto, created May 26, 2015 17:47)
Packing version 1432662424
Deploying to Scrapy Cloud project "15591"
Deploy failed (400):
Vary: Cookie
Content-Length: 1505
Content-Type: application/json
{"status": "badrequest", "message": "ERROR:root:Script initialization failed
Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/hworker/bot/runner.py", line 77, in main
    run_scrapy(settings)
  File "/usr/lib/pymodules/python2.7/hworker/bot/runner.py", line 87, in run_scrapy
    execute(settings=settings)
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 142, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/lib/pymodules/python2.7/scrapy/crawler.py", line 139, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/lib/pymodules/python2.7/scrapy/crawler.py", line 87, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/lib/pymodules/python2.7/scrapy/crawler.py", line 207, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/lib/pymodules/python2.7/scrapy/spiderloader.py", line 30, in from_settings
    return cls(settings)
  File "/usr/lib/pymodules/python2.7/scrapy/spiderloader.py", line 21, in __init__
    for module in walk_modules(name):
  File "/usr/lib/pymodules/python2.7/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/tmp/eggs-X_AVLs/__main__.egg/comebebesp/spiders/baressp.py", line 3, in <module>
ImportError: No module named linkextractors
"}
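The `ImportError: No module named linkextractors` raised from line 3 of `baressp.py` suggests the deploy target runs a Scrapy release older than 1.0, where link extractors still live under `scrapy.contrib.linkextractors` rather than `scrapy.linkextractors`. A minimal sketch of a version-tolerant import the spider could use; `import_first` is a hypothetical helper, not part of the Scrapy API:

```python
import importlib


def import_first(*paths):
    """Return the first importable module from a list of candidate paths.

    Useful when a module moved between releases, e.g. Scrapy's link
    extractors moved from scrapy.contrib.linkextractors (pre-1.0) to
    scrapy.linkextractors (1.0+).
    """
    for path in paths:
        try:
            return importlib.import_module(path)
        except ImportError:
            continue
    raise ImportError("none of %s could be imported" % (paths,))


# In baressp.py, one might then write (assuming Scrapy is installed
# on the deploy target):
#
# linkextractors = import_first("scrapy.linkextractors",
#                               "scrapy.contrib.linkextractors")
# LinkExtractor = linkextractors.LinkExtractor
```

Alternatively, pinning the project to the Scrapy version available on the Scrapy Cloud stack and using that version's import path would avoid the fallback entirely.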