I deployed my project (sogou) successfully, but when I run the following, it fails:
$ curl http://localhost:6800/schedule.json -d project=sogou -d spider=sogou
2017-02-13 10:44:51 [scrapy] INFO: Scrapy 1.2.1 started (bot: sogou)
2017-02-13 10:44:51 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'sogou.spiders', 'CONCURRENT_REQUESTS': 5, 'SPIDER_MODULES': ['sogou.spiders'], 'RETRY_HTTP_CODES': [500, 502, 503, 504, 400, 403, 408], 'BOT_NAME': 'sogou', 'DOWNLOAD_TIMEOUT': 10, 'RETRY_TIMES': 10, 'LOG_FILE': 'logs/sogou/sogou/63a0bbacf19611e69eea240a644f1626.log'}
2017-02-13 10:44:51 [scrapy] INFO: Enabled extensions: ['scrapy.extensions.logstats.LogStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.corestats.CoreStats']
2017-02-13 10:44:51 [twisted] CRITICAL: Unhandled error in Deferred:
2017-02-13 10:44:51 [twisted] CRITICAL:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 90, in crawl
    six.reraise(*exc_info)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 71, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 94, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiders/__init__.py", line 50, in from_crawler
    spider = cls(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument '_job'
Error running scrapyd project.
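The spider code itself is not shown, so this is only the usual cause of this traceback: scrapyd's schedule.json endpoint passes an extra keyword argument (_job) to the spider, and an __init__ override that does not accept arbitrary keyword arguments raises the TypeError above. A minimal sketch of both the failing and the working pattern, using a hypothetical SogouSpider as the assumed spider class:

import scrapy

class SogouSpider(scrapy.Spider):
    name = 'sogou'

    # Broken pattern: a fixed signature rejects scrapyd's extra
    # keyword arguments such as _job, producing the TypeError.
    # def __init__(self, category=None):
    #     ...

    # Working pattern: accept arbitrary keyword arguments and
    # forward them to the base class, which stores them as
    # attributes on the spider instance.
    def __init__(self, *args, **kwargs):
        super(SogouSpider, self).__init__(*args, **kwargs)

With the forwarding __init__ in place, the same curl call to schedule.json should create the job instead of failing in from_crawler.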