
rendezqueue / rendezllama / 6725686114
Coverage: 90% · trunk: 87%

Build:
LAST BUILD BRANCH: coverage-fix-14520834744988857627
DEFAULT BRANCH: trunk
Ran 01 Nov 2023 10:14PM UTC
Jobs 1
Files 14
Run time 1s
01 Nov 2023 09:53PM UTC coverage: 82.424% (-0.06%) from 82.484%
Build 6725686114 · push · github · grencez
update(llama.cpp): with min-p sampling

This also includes a change to call llama_kv_cache_seq_rm()
instead of llama_kv_cache_tokens_rm(), which was removed upstream
because it only works for code that calls the deprecated llama_eval().
We use llama_decode() instead of llama_eval(),
so all rewriting was probably a bit broken.
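The min-p sampling mentioned in the commit subject can be illustrated with a small self-contained sketch. This is a hedged model of the idea, not the llama.cpp implementation; the function name `min_p_filter` is hypothetical. Min-p keeps only tokens whose probability is at least `min_p` times the probability of the most likely token, then renormalizes.

```cpp
#include <algorithm>
#include <vector>

// Sketch of min-p filtering (illustrative, not llama.cpp's code):
// drop every token whose probability falls below min_p * max_prob,
// then renormalize the survivors so they sum to 1.
std::vector<float> min_p_filter(std::vector<float> probs, float min_p) {
  const float max_prob = *std::max_element(probs.begin(), probs.end());
  const float threshold = min_p * max_prob;
  float total = 0.0f;
  for (float& p : probs) {
    if (p < threshold) p = 0.0f;  // token is filtered out
    total += p;
  }
  for (float& p : probs) p /= total;  // the max token always survives, so total > 0
  return probs;
}
```

Unlike top-p, the cutoff here scales with the confidence of the top token, so a sharply peaked distribution filters more aggressively than a flat one.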

2 of 2 new or added lines in 1 file covered. (100.0%)
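The llama_kv_cache_seq_rm() change in the commit above can be modeled roughly as follows. This is a self-contained stand-in, not llama.cpp's actual data structures: the point is that removal is scoped to a (sequence id, position range) pair rather than truncating a global token list the way the removed llama_kv_cache_tokens_rm() did, which is what makes it safe for code using llama_decode() with multiple sequences.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical KV-cache cell: which sequence it belongs to and its position.
struct Cell { int32_t seq_id; int32_t pos; };

// Remove cells of seq_id with pos in [p0, p1); p1 < 0 means "to the end",
// mirroring the llama.cpp convention for llama_kv_cache_seq_rm().
void seq_rm(std::vector<Cell>& cache, int32_t seq_id, int32_t p0, int32_t p1) {
  cache.erase(
      std::remove_if(cache.begin(), cache.end(),
                     [&](const Cell& c) {
                       return c.seq_id == seq_id && c.pos >= p0 &&
                              (p1 < 0 || c.pos < p1);
                     }),
      cache.end());
}
```

For rewriting (rolling a conversation back to an earlier token), this lets the caller drop only the tail of one sequence and keep everything else in the cache intact.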

891 of 1081 relevant lines covered (82.42%)

14.09 hits per line

Jobs
ID  Job ID        Ran                      Files  Coverage
1   6725686114.1  01 Nov 2023 10:14PM UTC  0      82.42%
GitHub Action Run
Source Files on build 6725686114
Detailed source file information is not available for this build.
  • Back to Repo
  • e44e5cfd on github
  • Prev Build on stage (#6718438638)
  • Next Build on update (#6765855778)