
istresearch / scrapy-cluster / build #195

71% (master: 66%)

LAST BUILD BRANCH: dev
DEFAULT BRANCH: master
Ran 03 Aug 2016 08:46PM UTC
Jobs: 2
Files: 38
Run time: 7s

Build #195 (push, travis-ci): pending completion

madisonb: Scheduler queue cache implementation (fixes #64)
Adds a new setting that controls how many domain queues are kept in memory within each spider process, making the crawler more memory efficient. This only matters when crawling a very large number of domains within your cluster.
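The idea behind such a queue cache can be sketched with a small LRU structure: keep at most N per-domain queues in memory and evict the least recently used one when the cap is hit. This is an illustrative sketch only; the class and parameter names (`DomainQueueCache`, `max_queues`) are hypothetical and do not reflect the actual scrapy-cluster implementation or setting name.

```python
from collections import OrderedDict


class DomainQueueCache:
    """Hold at most `max_queues` per-domain queues in memory,
    evicting the least recently used queue when the cap is reached.
    (Hypothetical sketch, not the scrapy-cluster implementation.)"""

    def __init__(self, max_queues):
        self.max_queues = max_queues
        self._queues = OrderedDict()  # domain -> queue object

    def get(self, domain, factory=list):
        """Return the queue for `domain`, creating it if needed."""
        if domain in self._queues:
            # Mark this queue as most recently used.
            self._queues.move_to_end(domain)
        else:
            if len(self._queues) >= self.max_queues:
                # Evict the least recently used domain queue.
                self._queues.popitem(last=False)
            self._queues[domain] = factory()
        return self._queues[domain]


cache = DomainQueueCache(max_queues=2)
cache.get("example.com").append("req1")
cache.get("example.org").append("req2")
cache.get("example.com")  # touch: example.com becomes most recent
cache.get("example.net")  # cap hit: evicts example.org
```

With a fixed cap, memory use stays bounded no matter how many distinct domains the cluster crawls; an evicted domain's queue is simply rebuilt on the next request for it.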

23 of 23 new or added lines in 1 file covered. (100.0%)

1620 of 2501 relevant lines covered (64.77%)

1.3 hits per line

New Missed Lines in Diff

Lines | Coverage ∆ | File
------|------------|-------------------------------------------
    5 |      100.0 | crawler/crawling/distributed_scheduler.py
Jobs
  • 195.1 (distribution=centos version=7 init=/usr/lib/systemd/systemd run_opts=--privileged --volume=/sys/fs/cgroup:/sys/fs/cgroup:ro): ran 03 Aug 2016 08:46PM UTC, 0 files, coverage 64.77% (Travis Job 195.1)
  • 195.2 (distribution=ubuntu version=14.04 init=/sbin/init run_opts=): ran 03 Aug 2016 08:46PM UTC, 0 files, coverage 64.77% (Travis Job 195.2)
Source Files on build 195
Detailed source file information is not available for this build.
  • Travis Build #195
  • b2e9b332 on github
  • Prev Build on dev (#194)
  • Next Build on dev (#197)

© 2026 Coveralls, Inc