
agentjido / req_llm
Coverage: 51% (main: 52%)

LAST BUILD BRANCH: bug/object-array
DEFAULT BRANCH: main
Repo Added 15 Sep 2025 11:42AM UTC
Builds: 237
Files: 83

LAST BUILD ON BRANCH bug/usage_total_cost
Branches:
  • bug/usage_total_cost
  • add-bedrock-structured-output
  • add-hex-changelog-and-module-grouping
  • add-vllm-provider
  • add_openai_responses_api_structured_responses
  • add_retry_step
  • allow_json_schemas
  • bedrock-clean
  • bug/api-return-types
  • bug/codec-tool-calls
  • bug/debug-stream-return
  • bug/incorrect-model-spec-docs
  • bug/object-array
  • bug/openai-tool-calls
  • bug/streaming-race-condition
  • cerebras
  • chore/2025-10-14-update-fixtures
  • chore/object-fixtures
  • chore/object-fixtures-resurrected
  • chore/refine-fixtures
  • chore/refresh-coverage-tests
  • chore/update-models-2025-09-21
  • copilot/fix-32
  • devtools
  • egomes/fix-claude-multi-turn
  • egomes/fix-tool-inspection-with-json-schema
  • feat/context-json-serialization
  • feat/google-upload-file
  • feat/in-type-support
  • feat/structured-output-openai-google
  • feature/base-url-override
  • feature/cerebras-provider
  • feature/configurable-metadata-timeout
  • feature/model-catalog
  • feature/normalize-bedrock-inference-profiles
  • feature/pre-release-fixes
  • feature/refactor-llm-api-fixtures
  • feature/refined-key-management
  • feature/unique-model-provider-options
  • feature/upgrade-ex-aws-auth
  • feature/zai-fixtures
  • feature/zoi-schema
  • fix-anthropic-streaming
  • fix-duplicate-clause
  • fix-google
  • fix-groq-stream-error
  • fix-mix-task-docs
  • fix-openai-max-tokens-param
  • fix/bug-119-aws-auth-credentials
  • fix/cost-calculation-in-usage
  • fix/google-file-support
  • fix/google-structured-output
  • fix/http2-large-request-bodies
  • fix/issue-65-http-status-validation
  • fix/issue-96-validation-error-fields
  • fix/proxy-options
  • fix/registry-get-provider-nil-module
  • fix/tool_calls
  • google-vision
  • improve-metadata-provider-errors
  • main
  • patch-1
  • put-max-tokens-model-options
  • refactor/context-tools
  • refactor/req-streaming
  • refactor/xai-structured-objects
  • remove-jido-keys
  • zai

21 Oct 2025 01:30PM UTC coverage: 50.948% (-0.009%) from 50.957%
17f048844a5c0dc8acbfd1073201a288e9c9b1ac-PR-124

Pull #124 · github · mikehostetler
Fix Dialyzer type mismatch in decode_response/2

- Normalize model to string when building Response structs
- Add @dialyzer nowarn attribute for decode_response/2 minimal fixture request
- Extract model_string in decode_chat_response for consistent typing
Pull Request #124: Enhance response handling and usage normalization
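The Dialyzer suppression and model normalization described in the commit bullets above can be sketched as follows. This is a hypothetical fragment, not the actual req_llm source: the module name, response shape, and function body are illustrative assumptions; only the `@dialyzer {:nowarn_function, ...}` attribute form and the arity-2 `decode_response` name come from the commit message.

```elixir
defmodule ReqLLM.Provider.Sketch do
  # Suppress the Dialyzer success-typing warning for decode_response/2,
  # which arises from the minimal fixture request shape (per the commit notes).
  @dialyzer {:nowarn_function, decode_response: 2}

  def decode_response({req, resp}, model) do
    # Normalize the model to a string before building the response struct,
    # so streaming and non-streaming paths carry consistently typed fields.
    model_string = to_string(model)
    {req, %{resp | body: Map.put(resp.body, :model, model_string)}}
  end
end
```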

12 of 15 new or added lines in 4 files covered. (80.0%)

3 existing lines in 2 files now uncovered.

3359 of 6593 relevant lines covered (50.95%)

445.18 hits per line

Source Files on bug/usage_total_cost
  • Files: 80
  • Changed: 5
  • Source changed: 0
  • Coverage changed: 5

Recent builds

  • 17f04884... · bug/usage_total_cost · Fix Dialyzer type mismatch in decode_response/2 · Pull #124 · 21 Oct 2025 01:31PM UTC · mikehostetler · github · 50.95%
  • a2ce4888... · bug/usage_total_cost · Enhance response handling and usage normalization · Pull #124 · 21 Oct 2025 01:16PM UTC · mikehostetler · github · 50.97%
See All Builds (237)



© 2025 Coveralls, Inc