
agentjido / req_llm · commit a47489b2d5819649e80a12521c32c25cd6afa60b · job 2

Coverage: 49% (main: 49%)

Build:
Default branch: main
Ran: 17 Nov 2025 08:10AM UTC
Files: 82
Run time: 611 min
17 Nov 2025 08:10AM UTC coverage: 49.965% (+0.06%) from 49.909%
Job a47489b2d5819649e80a12521c32c25cd6afa60b.2 · push · github · committed via web-flow
Fix cached token extraction from Google API responses (#192)

Google's API returns cachedContentTokenCount in the usageMetadata
field for both implicit (automatic) and explicit (CachedContent API)
prompt caching. The provider was not extracting this value, causing
cached_tokens to always report 0 even when caching was active.

This fix extracts cachedContentTokenCount and converts it to the
OpenAI-compatible prompt_tokens_details.cached_tokens format for both
google and google-vertex providers using Gemini models.

Closes #191
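The conversion described in the commit message can be sketched as follows. This is an illustrative Python sketch, not the provider's actual Elixir code; the Gemini field names (`usageMetadata`, `cachedContentTokenCount`, `promptTokenCount`, `candidatesTokenCount`, `totalTokenCount`) and the OpenAI-style `prompt_tokens_details.cached_tokens` key come from the respective APIs, while the helper name `normalize_usage` is hypothetical:

```python
def normalize_usage(usage_metadata: dict) -> dict:
    """Map Gemini usageMetadata onto an OpenAI-compatible usage dict.

    Gemini reports cached prompt tokens in `cachedContentTokenCount`,
    present for both implicit (automatic) and explicit (CachedContent
    API) caching. Defaulting to 0 when the key is absent preserves the
    old behavior for uncached requests.
    """
    cached = usage_metadata.get("cachedContentTokenCount", 0)
    return {
        "prompt_tokens": usage_metadata.get("promptTokenCount", 0),
        "completion_tokens": usage_metadata.get("candidatesTokenCount", 0),
        "total_tokens": usage_metadata.get("totalTokenCount", 0),
        "prompt_tokens_details": {"cached_tokens": cached},
    }

# Example: 1024 of 2000 prompt tokens were served from the cache.
usage = normalize_usage({
    "promptTokenCount": 2000,
    "candidatesTokenCount": 150,
    "totalTokenCount": 2150,
    "cachedContentTokenCount": 1024,
})
```

Before the fix, the `cachedContentTokenCount` key was simply never read, so `cached_tokens` was always 0 regardless of cache activity.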

3572 of 7149 relevant lines covered (49.97%)

14.7 hits per line
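The headline percentage follows directly from these counts:

```python
# Coverage is simply covered lines over relevant lines, as a percentage.
relevant, covered = 7149, 3572
coverage = covered / relevant * 100
print(f"{coverage:.2f}%")  # 49.97%
```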

Source Files on job a47489b2d5819649e80a12521c32c25cd6afa60b.2
82 files listed · 2 changed · 0 with source changes · 2 with coverage changes