
kobotoolbox / kpi / build 23752379599
Coverage: 82% (master: 76%)
Last build branch: main · Default branch: master
Ran 30 Mar 2026 03:21PM UTC · Jobs: 10 · Files: 908 · Run time: 2min
30 Mar 2026 03:17PM UTC · coverage: 82.208% (remained the same)
Build 23752379599 · push · github · web-flow
fix(qual): fix flaky QA throttling tests in parallel CI using `LocMemCache` isolation (#6881)

### 📣 Summary
This PR resolves intermittent test failures in
`TestAutomaticQAThrottling` by isolating its cache to local memory. This
prevents a race condition in which parallel workers, whose separate test
databases contain users with identical primary keys, collided on the
same entry in the shared Redis cache.

### 📖 Description
Fixes the intermittent `200 != 429` and `429 != 200` assertion errors in
`TestAutomaticQAThrottling` during parallel GitHub Action runs.

**The Root Cause (Database ID Collision):**
The issue occurs when tests run in parallel across multiple workers. DRF
throttling stores request history in the Django cache under a key
derived from the authenticated user's ID (e.g.
`throttle_<scope>_<user.pk>`).

Each worker creates its own test database, but all workers share the
same cache backend. Because users in each worker's database often end up
with the same primary key (e.g. `pk=1`), they generate identical
throttle cache keys.

As a result, requests from different workers collide in the same cache
entry, causing the throttle state to leak across tests. This leads to
nondeterministic failures where tests may receive 200 instead of 429 or
vice versa.
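As a toy illustration (plain Python, not DRF or Redis), sharing one cache dict between two simulated workers reproduces the flake: worker A exhausts the rate limit, and worker B's very first request is unexpectedly throttled:

```python
# Toy simulation of the race. The shared dict stands in for the shared
# Redis backend; the rate limit of 3 is arbitrary.
RATE_LIMIT = 3

shared_cache = {}

def request(cache: dict, key: str) -> int:
    """Return 429 if the request history under `key` is full, else 200."""
    history = cache.setdefault(key, [])
    if len(history) >= RATE_LIMIT:
        return 429
    history.append("hit")
    return 200

key = "throttle_qa_throttle_1"  # same key in both workers (pk=1)

# Worker A exhausts the limit...
statuses_a = [request(shared_cache, key) for _ in range(3)]
# ...so worker B's first request is throttled -- the "200 != 429" flake:
status_b = request(shared_cache, key)
assert statuses_a == [200, 200, 200]
assert status_b == 429
```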

**The Fix:**
Applied the `@override_settings` decorator to the
`TestAutomaticQAThrottling` class to force it to use
`django.core.cache.backends.locmem.LocMemCache`.

This isolates the cache entirely to the local memory of the specific
worker process. Each worker now has its own private "cache bucket,"
preventing cross-worker interference and resolving the flakiness without
requiring us to patch application logic or hardcode unique cache keys.
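A minimal sketch of that fix, assuming the decorator sits directly on the test class (the `APITestCase` base class and settings layout below are placeholders for the actual kpi test suite; `override_settings` and the `LocMemCache` backend path are real Django APIs):

```python
# Per-worker cache configuration: LocMemCache lives in the memory of the
# process that created it, so throttle keys like "throttle_<scope>_<pk>"
# can no longer collide across workers.
LOCMEM_CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.locmem.LocMemCache",
    }
}

# In the test module the decorator would be applied like this:
#
#   from django.test import override_settings
#
#   @override_settings(CACHES=LOCMEM_CACHES)
#   class TestAutomaticQAThrottling(APITestCase):
#       ...
```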

7764 of 11944 branches covered (65.0%)

29548 of 35943 relevant lines covered (82.21%)

5.8 hits per line

Jobs

| # | Job ID | Ran | Files | Coverage |
|---|--------|-----|-------|----------|
| 1 | 23752379599.1 | 30 Mar 2026 03:21PM UTC | 904 | 52.85 |
| 2 | 23752379599.2 | 30 Mar 2026 03:22PM UTC | 904 | 54.39 |
| 3 | 23752379599.3 | 30 Mar 2026 03:22PM UTC | 906 | 54.08 |
| 4 | 23752379599.4 | 30 Mar 2026 03:23PM UTC | 906 | 55.64 |
| 5 | 23752379599.5 | 30 Mar 2026 03:24PM UTC | 904 | 60.44 |
| 6 | 23752379599.6 | 30 Mar 2026 03:24PM UTC | 906 | 66.0 |
| 7 | 23752379599.7 | 30 Mar 2026 03:25PM UTC | 906 | 54.43 |
| 8 | 23752379599.8 | 30 Mar 2026 03:25PM UTC | 906 | 61.58 |
| 9 | 23752379599.9 | 30 Mar 2026 03:28PM UTC | 908 | 68.25 |
| 10 | 23752379599.10 | 30 Mar 2026 04:12PM UTC | 904 | 53.08 |
  • Commit 5a5b5cc2 on github
  • Prev Build on release/2.026.12 (#23690660597)
  • Next Build on release/2.026.12 (#23860065850)

© 2026 Coveralls, Inc