
istresearch / scrapy-cluster / build 195 / job 2
Coverage: 71% (master: 66%)
Last build branch: dev
Default branch: master
Ran: 03 Aug 2016 08:46PM UTC
Files: 38
Run time: 5s
03 Aug 2016 08:29PM UTC coverage: 64.774% (+0.1%) from 64.67%
distribution=ubuntu version=14.04 init=/sbin/init run_opts=

Trigger: push (travis-ci)

Commit by madisonb: Scheduler queue cache implementation (fixes #64)

Adds a new setting that controls how many domain queues are kept in memory within each spider process, reducing the memory footprint. This only matters when the cluster is crawling a very large number of domains.
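The queue cache described above can be sketched as an LRU map from domain to queue object, evicting the least recently used queue when the configured limit is reached. This is a minimal illustration, not scrapy-cluster's actual code: the class name, setting name (`max_queues`), and the `factory` callback are all hypothetical, and in the real cluster the queue data itself lives in Redis, so evicting the in-process object loses nothing.

```python
from collections import OrderedDict

class QueueCache:
    """Hypothetical sketch of a per-process domain-queue cache.

    Keeps at most `max_queues` queue objects resident; the least
    recently used one is evicted first. Assumes the underlying queue
    data persists elsewhere (e.g. Redis), so eviction is safe.
    """

    def __init__(self, max_queues=100):
        self.max_queues = max_queues
        self._queues = OrderedDict()  # domain -> queue object

    def get_queue(self, domain, factory):
        if domain in self._queues:
            # Cache hit: mark this domain as most recently used
            self._queues.move_to_end(domain)
        else:
            if len(self._queues) >= self.max_queues:
                # Evict the least recently used domain's queue object
                self._queues.popitem(last=False)
            self._queues[domain] = factory(domain)
        return self._queues[domain]
```

With `max_queues` bounded, memory use no longer grows with the total number of distinct domains crawled, only with the working set of recently active ones.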

1620 of 2501 relevant lines covered (64.77%), 0.65 hits per line
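The headline percentage follows directly from the line counts reported above; a quick arithmetic check:

```python
covered, relevant = 1620, 2501      # counts from the report
previous = 64.67                    # coverage before this commit

pct = 100 * covered / relevant      # 64.774...
delta = round(pct, 3) - previous    # the "+0.1%" shown in the header

print(round(pct, 3), round(delta, 1))
```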

Source Files on job 195.2 (distribution=ubuntu version=14.04 init=/sbin/init run_opts=)
  • Travis Job 195.2
  • b2e9b332 on github
