
kobotoolbox / kpi / build 22687010118 / job 1

Coverage: 82% (master: 76%)
Last build branch: dev-1829-allauth-openapi-docs
Default branch: master
Ran: 04 Mar 2026 08:39PM UTC
Files: 892
Run time: 22s


Job 22687010118.1 — 04 Mar 2026 08:05PM UTC — coverage: 79.786% (+0.001%) from 79.785%
Triggered by: push (github / web-flow)
fix(qual): handle empty result from LLM DEV-1814 (#6785)

### đŸ“Ŗ Summary
Handle an error resulting from a rare edge case with the LLMs.


### 💭 Notes
On very rare occasions, the LLM will return `None` instead of a string
answer. This previously resulted in a 500 error. With this PR, we catch it
and route it through the regular error-handling flow.

### 👀 Preview steps
This can be hard to recreate. The most consistent way I've found to
force the LLM to return an empty response is to set MAX_TOKENS to 1
locally.

1. â„šī¸ have an account and a project with an audio question called
'audio'
2. Add a submission
3. Add a manual transcript
4. Add a select one question, ideally a nonsensical one
5. "Generate with AI"
6. 🔴 [on main] 500 error
7. đŸŸĸ [on PR] Response completes successfully (result will probably be
empty)
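For the local override mentioned above, the only change needed is the token cap. Assuming a Django-style settings module (the exact name and location of this setting in kpi may differ), the sketch is just:

```python
# Local-only override to coax the LLM into returning an empty result
# for testing. Assumption: a Django-style settings constant; the real
# setting in kpi may be named or located differently.
MAX_TOKENS = 1  # normally far higher, e.g. 1024
```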

7307 of 11636 branches covered (62.8%)

28032 of 35134 relevant lines covered (79.79%)

0.8 hits per line
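The reported percentages follow directly from the raw counts above:

```python
# Branch coverage: covered branches / total branches.
branch_pct = 7307 / 11636 * 100   # ≈ 62.8

# Line coverage: covered relevant lines / total relevant lines.
line_pct = 28032 / 35134 * 100    # ≈ 79.79

print(round(branch_pct, 1))  # 62.8
print(round(line_pct, 2))    # 79.79
```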

Source Files on job 22687010118.1: 892 files listed, 1 changed, 0 source changed, 1 coverage changed
Commit 7f25e230 on github (build 22687010118)

© 2026 Coveralls, Inc