
agentjido / req_llm
Coverage: 49% (main: 50%)

Last build branch: fix/google-file-support
Default branch: main
Repo added: 15 Sep 2025 11:42AM UTC
Builds: 151
Files: 79
README badges

Badge embed formats: Markdown, Textile, RDoc, HTML, RST.
If you need to use a raster PNG badge, change the '.svg' to '.png' in the link.
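As an illustration, a Markdown badge for this repo would look like the following sketch. The URL pattern is assumed from Coveralls' standard badge scheme for GitHub repos; verify it against the embed dialog before committing:

```markdown
[![Coverage Status](https://coveralls.io/repos/github/agentjido/req_llm/badge.svg?branch=main)](https://coveralls.io/github/agentjido/req_llm?branch=main)
```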

Last build on branch: put-max-tokens-model-options

Branches:
  • put-max-tokens-model-options
  • add-vllm-provider
  • bedrock-clean
  • bug/api-return-types
  • bug/codec-tool-calls
  • bug/debug-stream-return
  • bug/incorrect-model-spec-docs
  • bug/openai-tool-calls
  • bug/streaming-race-condition
  • cerebras
  • chore/2025-10-14-update-fixtures
  • chore/object-fixtures
  • chore/object-fixtures-resurrected
  • chore/refine-fixtures
  • chore/refresh-coverage-tests
  • chore/update-models-2025-09-21
  • copilot/fix-32
  • devtools
  • egomes/fix-claude-multi-turn
  • feat/context-json-serialization
  • feat/google-upload-file
  • feat/in-type-support
  • feat/structured-output-openai-google
  • feature/cerebras-provider
  • feature/pre-release-fixes
  • feature/refactor-llm-api-fixtures
  • feature/refined-key-management
  • feature/unique-model-provider-options
  • feature/zai-fixtures
  • fix-anthropic-streaming
  • fix-duplicate-clause
  • fix-google
  • fix-groq-stream-error
  • fix-mix-task-docs
  • fix-openai-max-tokens-param
  • fix/cost-calculation-in-usage
  • fix/google-file-support
  • fix/google-structured-output
  • fix/proxy-options
  • fix/registry-get-provider-nil-module
  • google-vision
  • improve-metadata-provider-errors
  • main
  • patch-1
  • refactor/context-tools
  • refactor/req-streaming
  • refactor/xai-structured-objects
  • zai

11 Oct 2025 04:02AM UTC coverage: 48.773% (+0.09%) from 48.682%
Commit: 799236d071d64072b20e90dd25d3bc0975dd4572 (PR #95)

Pull Request #95: fix: Respect max_tokens from Model.new/3 across all providers
Committer: lucas-stellet (via GitHub)
Commit message: Add test to ensure max_tokens: 0 is not extracted for embedding models

4 of 4 new or added lines in 1 file covered. (100.0%)

2684 of 5503 relevant lines covered (48.77%)

177.42 hits per line
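The coverage figures above are internally consistent; a quick sketch, using only the numbers stated in this report, reproduces both the percentage and the delta:

```python
# Line counts as reported in this build
relevant_lines = 5503
covered_lines = 2684

# Coverage percentage = covered / relevant
pct = 100 * covered_lines / relevant_lines
print(f"{pct:.3f}%")  # 48.773%, matching the reported figure

# Change against the previous build's 48.682%
print(f"{pct - 48.682:+.2f}%")  # +0.09%, matching the reported delta
```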

Source files on put-max-tokens-model-options: 67 listed, 13 changed (0 with source changes, 13 with coverage changes).

Recent builds

Build: 799236d0...
Branch: put-max-tokens-model-options
Commit: Add test to ensure max_tokens: 0 is not extracted for embedding models
Type: Pull #95
Ran: 13 Oct 2025 01:24PM UTC
Committer: lucas-stellet
Via: github
Coverage: 48.77%

See All Builds (151)

Badge your Repo: req_llm

Could not find badge in README: this repo is not yet badged. Add the embed code to the README to show off your code coverage, then refresh once the badge is live.


© 2025 Coveralls, Inc