
kobotoolbox / kpi / 22687010118 / 2
Coverage: 81% (master: 76%)
LAST BUILD BRANCH: main
DEFAULT BRANCH: master
Ran 04 Mar 2026 08:44PM UTC · Files 894 · Run time 22s
04 Mar 2026 08:05PM UTC coverage: 82.061% (+2.3%) from 79.785%
Job 22687010118.2 · push via github · committer: web-flow
fix(qual): handle empty result from LLM DEV-1814 (#6785)

### đŸ“Ŗ Summary
Handle an error resulting from a rare edge case with the LLMs.


### 💭 Notes
On very rare occasions, the LLM returns `None` instead of a string
answer. This previously resulted in a 500 error. With this PR we catch it
and route it through the regular error-handling flow.
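The guard described above can be sketched as follows. This is a minimal illustration of the pattern (validate the LLM's answer and raise a typed exception so the existing error-handling flow responds cleanly instead of crashing with a 500); the class and function names here are hypothetical, not taken from the kpi codebase.

```python
from typing import Optional


class InvalidLLMResponseError(Exception):
    """Raised when the LLM returns no usable answer.

    Hypothetical exception type: the regular error-handling flow
    would catch this and return a well-formed error response.
    """


def extract_answer(llm_response: Optional[str]) -> str:
    """Return the LLM's string answer, or raise if it is missing.

    Previously a `None` here would propagate and surface as an
    unhandled 500; raising a known exception lets the normal
    error path take over instead.
    """
    if llm_response is None:
        raise InvalidLLMResponseError('LLM returned no answer')
    return llm_response
```

A caller would wrap `extract_answer()` in the same try/except used for other expected LLM failures, so an empty result produces the standard error payload rather than a server error.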

### 👀 Preview steps
This can be hard to recreate. The most consistent way I've found to
force the LLM to provide an empty response is to locally set the
MAX_TOKENS to 1.

1. â„šī¸ have an account and a project with an audio question called
'audio'
2. Add a submission
3. Add a manual transcript
4. Add a select one question, ideally a nonsensical one
5. "Generate with AI"
6. 🔴 [on main] 500 error
7. đŸŸĸ [on PR] Response completes successfully (result will probably be
empty)

7553 of 11648 branches covered (64.84%)

28837 of 35141 relevant lines covered (82.06%)

0.82 hits per line

Source Files on job 22687010118.2
  • Total files: 894
  • Changed: 34 (source changed: 0, coverage changed: 34)
  • 7f25e230 on github

© 2026 Coveralls, Inc