
vintasoftware / django-ai-assistant
Default branch (main): 94% covered
Last build branch: 0.1.2
Repo added 21 Jun 2024 12:42PM UTC
Files: 24
README badges
If you need to use a raster PNG badge, change the '.svg' to '.png' in the link

Available embed formats: Markdown, Textile, RDoc, HTML, reStructuredText.
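For reference, a Coveralls badge embedded in a README's Markdown typically follows the standard Coveralls badge URL scheme (the exact URL below is inferred from that scheme, not copied from this repo's embed widget, so verify it there):

```markdown
[![Coverage Status](https://coveralls.io/repos/github/vintasoftware/django-ai-assistant/badge.svg?branch=main)](https://coveralls.io/github/vintasoftware/django-ai-assistant?branch=main)
```

Swapping `badge.svg` for `badge.png` in the image URL yields the raster variant mentioned above.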

Last build on branch: feat/get_llm-models-without-temperature
Build 14367633779, 09 Apr 2025 09:49PM UTC
Coverage: 94.48% (+0.03% from 94.452%)
Committed by pamella via github: "Undo wrong formatting"

Pull Request #197: Allow omitting `temperature` in `get_llm` for models that do not support it

82 of 92 branches covered (89.13%); branch coverage is included in the aggregate %.
6 of 6 new or added lines in 1 file covered (100.0%).
654 of 687 relevant lines covered (95.2%).
4.17 hits per line.
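The figures above are mutually consistent: the aggregate 94.48% appears to pool branch hits with line hits, as the "branch coverage included in aggregate %" note implies. A quick check (the rounding used here is an assumption, but it reproduces the displayed numbers):

```python
# Line and branch totals as reported for build 14367633779.
line_covered, line_relevant = 654, 687
branch_covered, branch_relevant = 82, 92

line_pct = 100 * line_covered / line_relevant                # line coverage alone
branch_pct = 100 * branch_covered / branch_relevant          # branch coverage alone
# Aggregate: pool branch hits with line hits before dividing.
aggregate_pct = 100 * (line_covered + branch_covered) / (line_relevant + branch_relevant)

print(round(line_pct, 1), round(branch_pct, 2), round(aggregate_pct, 2))
# 95.2 89.13 94.48
```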
Source files on feat/get_llm-models-without-temperature: 24 total, 3 changed (0 with source changes, 3 with coverage changes).

Recent builds

All of the builds below ran on branch feat/get_llm-models-without-temperature via github, each reporting 94.48% coverage:

  • 14367633779: Pull #197, "Undo wrong formatting" (pamella), 09 Apr 2025 09:51PM UTC
  • 14367619743: Pull #197, "Merge 261a35aac into be763a56a" (web-flow), 09 Apr 2025 09:50PM UTC
  • 14367619539: push, "Add none temp param test case" (pamella), 09 Apr 2025 09:50PM UTC
  • 14367532250: Pull #197, "Merge 5e8a13ed2 into be763a56a" (web-flow), 09 Apr 2025 09:43PM UTC
  • 14367443858: push, "Allow omitting temperature in get_llm for models that do not support the parameter" (pamella), 09 Apr 2025 09:37PM UTC
  • 14367206184: push, "Allow omitting temperature in get_llm for models that do not support the parameter" (pamella), 09 Apr 2025 09:21PM UTC
See All Builds (325)

© 2025 Coveralls, Inc