agentjido / req_llm / 68dcc54c0cb2c59d9fcd702b964062d11fbf00cf-PR-193
Coverage: 49% (main: 49%)

Build:
LAST BUILD BRANCH: feat/load-dotenv-config
DEFAULT BRANCH: main
Ran 18 Nov 2025 07:59AM UTC
Jobs 4
Files 84
Run time 2min

18 Nov 2025 07:52AM UTC coverage: 48.985% (-1.0%) from 49.986%
68dcc54c0cb2c59d9fcd702b964062d11fbf00cf-PR-193

Pull #193 · github · neilberkman

feat: Add Google Context Caching support for Gemini models

Adds an explicit context caching API for Gemini models to reduce costs by up to
90% when reusing large amounts of content.

- Add ReqLLM.Providers.Google.CachedContent module for cache CRUD operations
- Support for both Google AI Studio and Vertex AI (when Gemini support is added)
- Add cached_content provider option to reference existing caches
- Comprehensive tests for cache creation, listing, updating, and deletion
- Documentation and examples in Google provider moduledoc
- Updated CHANGELOG

```elixir
# Create an explicit cache for a large document with a one-hour TTL
{:ok, cache} = ReqLLM.Providers.Google.CachedContent.create(
  provider: :google,
  model: "gemini-2.5-flash",
  api_key: System.get_env("GOOGLE_API_KEY"),
  contents: [%{role: "user", parts: [%{text: large_document}]}],
  ttl: "3600s"
)

# Reference the cache by name on subsequent requests
{:ok, response} = ReqLLM.generate_text(
  "google:gemini-2.5-flash",
  "Question about the document?",
  provider_options: [cached_content: cache.name]
)

# Tokens served from the cache are reported in the response usage
IO.inspect(response.usage.cached_tokens)
```
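
The PR also covers listing, updating, and deleting caches through the same module. Below is a minimal cleanup sketch, assuming `list/1` and `delete/2` exist and take the same keyword-option style as `create/1` above (the function names and signatures here are assumptions, not confirmed by the PR description):

```elixir
# Hypothetical sketch: list/1 and delete/2 are assumed names mirroring create/1;
# the PR description only states that listing, updating, and deletion are supported.
opts = [
  provider: :google,
  api_key: System.get_env("GOOGLE_API_KEY")
]

# Enumerate existing caches for this API key
{:ok, caches} = ReqLLM.Providers.Google.CachedContent.list(opts)

# Delete each cache explicitly instead of waiting for its TTL to expire
for cache <- caches do
  :ok = ReqLLM.Providers.Google.CachedContent.delete(cache.name, opts)
end
```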

Minimum token counts for explicit caching (see the sketch after this list):

- Gemini 2.5 Flash: 1,024 tokens
- Gemini 2.5 Pro: 4,096 tokens
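
Content below these minimums is not eligible for explicit caching, so it can be worth guarding cache creation with a rough size check. A minimal sketch, assuming the common ~4-characters-per-token heuristic (the heuristic and the guard are illustrative only; the `create/1` options are copied from the example above):

```elixir
# Illustrative guard only: the 4-chars-per-token ratio is a rough heuristic,
# not an API guarantee. create/1 options mirror the example above.
min_tokens = 1_024  # Gemini 2.5 Flash; use 4_096 for Gemini 2.5 Pro
approx_tokens = div(String.length(large_document), 4)

if approx_tokens >= min_tokens do
  {:ok, _cache} =
    ReqLLM.Providers.Google.CachedContent.create(
      provider: :google,
      model: "gemini-2.5-flash",
      api_key: System.get_env("GOOGLE_API_KEY"),
      contents: [%{role: "user", parts: [%{text: large_document}]}],
      ttl: "3600s"
    )
end
```
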
Pull Request #193: feat: Add Google Context Caching support for Gemini models

6 of 159 new or added lines in 2 files covered. (3.77%)

161 existing lines in 1 file now uncovered.

3597 of 7343 relevant lines covered (48.99%)

57.03 hits per line

New Missed Lines in Diff

| Lines | Coverage | ∆ | File |
|-------|----------|---|------|
| 153 | 2.55 | | lib/req_llm/providers/google/cached_content.ex |

Uncovered Existing Lines

| Lines | Coverage | ∆ | File |
|-------|----------|---|------|
| 161 | 62.72 | 0.14% | lib/req_llm/providers/google.ex |
Jobs

| ID | Job ID | Ran | Files | Coverage | Source |
|----|--------|-----|-------|----------|--------|
| 1 | 68dcc54c0cb2c59d9fcd702b964062d11fbf00cf-PR-193.1 | 18 Nov 2025 07:59AM UTC | 84 | 48.99 | GitHub Action Run |
| 2 | 68dcc54c0cb2c59d9fcd702b964062d11fbf00cf-PR-193.2 | 18 Nov 2025 07:59AM UTC | 84 | 48.93 | GitHub Action Run |
| 3 | 68dcc54c0cb2c59d9fcd702b964062d11fbf00cf-PR-193.3 | 18 Nov 2025 07:59AM UTC | 84 | 48.92 | GitHub Action Run |
| 4 | 68dcc54c0cb2c59d9fcd702b964062d11fbf00cf-PR-193.4 | 18 Nov 2025 07:59AM UTC | 84 | 48.92 | GitHub Action Run |
Source Files on build 68dcc54c0cb2c59d9fcd702b964062d11fbf00cf-PR-193

  • Files: 84
  • Changed: 1
  • Source Changed: 0
  • Coverage Changed: 1