
trishullab / LibraryAugmentedSymbolicRegression.jl / 13914428920 / 10
master: 34%

Build:
Default branch: master
Ran: 18 Mar 2025 03:39AM UTC
Files: 13
Run time: 0s

18 Mar 2025 03:00AM UTC coverage: 35.811% (-1.3%) from 37.099%
13914428920.10

push · github · web-flow
Auto-download LLM (#43)

* adding llamafile exec capability

* format :/

* update deps

* update async call

* update serve_llm

* add: llamafile support.
llamafile will now start up with the correct env variable.

* Updating logging verbosity.
This helps address parsing issues that show up when using smaller LLMs.

* Update example

* formatting :/

* add flag passing option

* remove default port due to type instability

* formatting and resolve type instability.

* remove output type signature

* add LLM_FLAGS only if positive length

* for macOS, we need to invoke the server with bash; otherwise exec doesn't understand how to call the llamafile.

* update prompts
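
The bullets above describe how the llamafile server gets started: an environment variable is set, extra flags are appended only when some were given, and on macOS the binary is invoked through bash. Below is a minimal Julia sketch of that launch logic; the names (`launch_llamafile`, `LLAMAFILE_PORT`, `llm_flags`, the `--server`/`--port` flags) are assumptions for illustration, not the package's actual API.

```julia
# Hypothetical sketch only; names are illustrative, not the package's real API.
function launch_llamafile(llamafile_path::AbstractString, port::Int;
                          llm_flags::Vector{String}=String[])
    args = ["--server", "--port", string(port)]
    # Append extra flags only when some were actually given
    # ("add LLM_FLAGS only if positive length").
    if !isempty(llm_flags)
        append!(args, llm_flags)
    end

    # On macOS the polyglot llamafile binary is run through bash,
    # since a direct exec does not know how to execute it.
    cmd = Sys.isapple() ? `bash $llamafile_path $args` : `$llamafile_path $args`

    # Start the server asynchronously with the port exposed via an env variable.
    env = copy(ENV)
    env["LLAMAFILE_PORT"] = string(port)
    return run(setenv(cmd, env); wait=false)
end
```

The point of the sketch is the control flow the commits describe, not the specific flag names, which are placeholders.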
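The "remove default port due to type instability" bullet refers to a common Julia pitfall: a port that defaults to `nothing` makes anything computed from it a `Union`, which `@code_warntype` flags. A hedged illustration, with assumed names rather than the package's own types:

```julia
# Illustration only; `ServeOptions` and `server_url` are assumed names.
Base.@kwdef struct ServeOptions
    port::Union{Nothing,Int} = nothing   # Union-typed default
end

# Inferred return type is Union{Nothing,String}; requiring `port::Int`
# with no default keeps this concretely inferred as String.
server_url(o::ServeOptions) =
    o.port === nothing ? nothing : "http://127.0.0.1:$(o.port)"
```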

318 of 888 relevant lines covered (35.81%)

8102978.67 hits per line

Source Files on job julia-1-ubuntu-latest-online-push - 13914428920.10
  • Files listed: 13
  • Changed: 4
  • Source changed: 4
  • Coverage changed: 3
  • Commit 36ceeba8 on github