
run-llama / llama_deploy / 12212370550

Build:
Default branch: main
Ran: 07 Dec 2024 11:08AM UTC
Jobs: 1
Files: 70
Run time: 1 min
If you need to use a raster PNG badge, change the '.svg' to '.png' in the link

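
As a sketch of such a badge link, assuming Coveralls' standard badge URL scheme for this repository on branch main (the exact snippet was not captured from the page), the Markdown form would be:

```markdown
[![Coverage Status](https://coveralls.io/repos/github/run-llama/llama_deploy/badge.svg?branch=main)](https://coveralls.io/github/run-llama/llama_deploy?branch=main)
```

Swapping `badge.svg` for `badge.png` in the first URL yields the raster variant.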

07 Dec 2024 11:07AM UTC coverage: 73.597% (+0.7%) from 72.924%
Build 12212370550 · push · github · web-flow
fix: avoid early exit of RabbitMQ consumer (#397)

* add e2e test running a workflow

* cosmetics

* use topic everywhere

* keep the consumer alive

* add multi-control plane tests

* proper manage cancellation and cancel task in tests

* increase coverage

* relax assertion

* wait for the process to finish

* run queue on a different tcp port

* concurrency

* revert

* try different image
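
The central change in this commit is keeping the consumer alive: a per-message handler error should be logged and skipped rather than allowed to terminate the consume loop, while explicit cancellation still propagates. A minimal asyncio sketch of that pattern (illustrative only; the names and messages are hypothetical, not llama_deploy's actual RabbitMQ code):

```python
import asyncio

async def consume(queue: asyncio.Queue, processed: list) -> None:
    """Consume until cancelled. A handler error must not end the loop
    (the "early exit" bug); only cancellation stops the consumer."""
    while True:
        try:
            message = await queue.get()
            if message == "boom":
                raise ValueError("handler failed")
            processed.append(message)
        except asyncio.CancelledError:
            raise  # managed shutdown: let the task finish as cancelled
        except Exception:
            pass  # log in real code; keep the consumer alive

async def main() -> int:
    queue: asyncio.Queue = asyncio.Queue()
    for message in ("task-1", "boom", "task-2"):
        queue.put_nowait(message)
    processed: list = []
    consumer = asyncio.create_task(consume(queue, processed))
    await asyncio.sleep(0.05)  # let the consumer drain the queue
    consumer.cancel()          # explicit, managed cancellation
    try:
        await consumer
    except asyncio.CancelledError:
        pass
    return len(processed)      # "boom" failed; the other two survived

print(asyncio.run(main()))  # 2
```

Without the re-raise on `CancelledError` and the broad `except`, the first failing message would end the task, which is exactly the early exit this change avoids.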

25 of 28 new or added lines in 1 file covered. (89.29%)

2754 of 3742 relevant lines covered (73.6%)

0.74 hits per line
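
The percentages above follow directly from the line counts; a quick check (a small helper for illustration, not part of Coveralls):

```python
def coverage_pct(covered: int, relevant: int, ndigits: int = 2) -> float:
    """Percentage of relevant lines covered, rounded as reported."""
    return round(100 * covered / relevant, ndigits)

print(coverage_pct(2754, 3742, 3))  # overall build coverage: 73.597
print(coverage_pct(25, 28))         # new/added lines in the diff: 89.29
```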

New Missed Lines in Diff

Lines | Coverage | ∆ | File
3 | 75.21 | 22.92% | llama_deploy/message_queues/rabbitmq.py
Jobs

ID | Job ID | Ran | Files | Coverage
1 | 12212370550.1 | 07 Dec 2024 11:08AM UTC | 70 | 73.6
GitHub Action Run
Source Files on build 12212370550
  • Github Actions Build #12212370550
  • 47efff88 on github
  • Prev Build on main (#12132494976)
  • Next Build on main (#12329639620)