
Alan-Jowett / CoPilot-For-Consensus / 20648709675 / 11
main: 78%

Build:
DEFAULT BRANCH: main
Ran 02 Jan 2026 01:25AM UTC
Files 6
Run time 0s

02 Jan 2026 01:20AM UTC coverage: 67.514%. Remained the same.
Job 20648709675.11 · push · github · web-flow
refactor: Optimize prompts for KV cache reuse and proper adapter separation (#678)

* refactor: optimize prompts for KV cache reuse

- Move static instructions to the system prompt to create a stable, cacheable prefix
- Keep variable thread data in user prompt
- Improves performance by caching instructions across requests
- No functional changes to prompt behavior or output
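
A minimal sketch of the split this commit describes, assuming Python and OpenAI-style chat messages; the names SYSTEM_INSTRUCTIONS, build_messages, and thread_text are illustrative, not the repository's actual identifiers.

    # Static instructions live in the system message so the serving engine can
    # reuse the KV cache for that prefix across requests; only the thread data
    # in the user message changes per request.
    SYSTEM_INSTRUCTIONS = (
        "You summarize discussion threads and report the points of consensus."
    )

    def build_messages(thread_text: str) -> list[dict]:
        """Stable, cacheable system prefix followed by the variable user payload."""
        return [
            {"role": "system", "content": SYSTEM_INSTRUCTIONS},  # stable prefix
            {"role": "user", "content": thread_text},             # varies per thread
        ]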

* feat: integrate prompt files into orchestrator service

- Add COPY instruction in Dockerfile to include orchestrator/prompts
- Load system and user prompts from files at service startup
- Pass system_prompt_path and user_prompt_path from environment config
- Update _publish_summarization_requested to send actual prompt content
- Replace hardcoded 'consensus-summary-v1' identifier with file-based prompts
- Enables KV cache reuse optimization from PR #673
- Honors SYSTEM_PROMPT_PATH and USER_PROMPT_PATH environment variables
  (defaults: /app/prompts/system.txt and /app/prompts/user.txt)
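
A rough sketch of the startup loading described above; SYSTEM_PROMPT_PATH, USER_PROMPT_PATH, and the default paths come from the commit text, while the load_prompts helper and its placement are assumptions.

    import os
    from pathlib import Path

    DEFAULT_SYSTEM_PROMPT = "/app/prompts/system.txt"
    DEFAULT_USER_PROMPT = "/app/prompts/user.txt"

    def load_prompts() -> tuple[str, str]:
        # Honor the environment variables, falling back to the paths baked into
        # the image by the Dockerfile's COPY of orchestrator/prompts.
        system_path = os.environ.get("SYSTEM_PROMPT_PATH", DEFAULT_SYSTEM_PROMPT)
        user_path = os.environ.get("USER_PROMPT_PATH", DEFAULT_USER_PROMPT)
        return Path(system_path).read_text(), Path(user_path).read_text()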

* fix: merge prompts into prompt_template field to match event schema
refactor(adapters): make summarizers dumb - move all prompt construction to service layer

- Orchestrator now merges system_prompt and user_prompt into single prompt_template
- Maintains event schema contract (single string field)
- System prompt stays static (instructions), user prompt stays variable (data)
- Merged at event publish time with newline separation
- Enables KV cache reuse while honoring event schema
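
Roughly what the merge at publish time could look like; prompt_template is the event field named in the commit, everything else here is illustrative.

    def to_prompt_template(system_prompt: str, user_prompt: str) -> str:
        # Single string field required by the event schema: the static system
        # prompt stays first as a stable prefix, separated from the variable
        # user prompt by a newline.
        return f"{system_prompt}\n{user_prompt}"

    if __name__ == "__main__":
        event = {
            "prompt_template": to_prompt_template(
                "Summarize the thread and state any consensus reached.",
                "Thread messages:\n- Alice: ...\n- Bob: ...",
            )
        }
        print(event["prompt_template"])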

BREAKING CHANGE: Thread.prompt_template is now Thread.prompt (complete pre-constructed prompt)

- All three summarizer adapters (local_llm, openai, llamacpp) now receive complete prompts
- Removed prompt construction logic from adapters (no business logic in adapters)
- Summarization service now fully responsible for: template substitution, message formatting, prompt assembly
- Thread model updated: prompt_template field -> prompt field (semantic: complete ready-to-use prompt)
- Adapters now true dumb wrappers: receive prom... (continued)
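
A sketch of the separation this breaking change describes, with hypothetical class and method names; the real local_llm, openai, and llamacpp adapters presumably differ in detail.

    class DumbSummarizerAdapter:
        """Receives a complete, ready-to-use prompt; no prompt construction."""

        def __init__(self, backend):
            self._backend = backend  # any callable mapping prompt text to a summary

        def summarize(self, prompt: str) -> str:
            return self._backend(prompt)

    class SummarizationService:
        """Owns template substitution, message formatting, and prompt assembly."""

        def __init__(self, adapter, system_prompt: str, user_template: str):
            self._adapter = adapter
            self._system_prompt = system_prompt
            self._user_template = user_template  # e.g. "Thread messages:\n{thread}"

        def summarize_thread(self, messages: list[str]) -> str:
            thread_block = "\n".join(f"- {m}" for m in messages)
            prompt = (
                f"{self._system_prompt}\n"
                f"{self._user_template.format(thread=thread_block)}"
            )
            return self._adapter.summarize(prompt)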

478 of 708 relevant lines covered (67.51%)

0.68 hits per line

Source Files on job copilot_storage - 20648709675.11
  • Tree
  • List 6
  • Changed 0
  • Source Changed 0
  • Coverage Changed 0
Coverage ∆ File Lines Relevant Covered Missed Hits/Line
  • Back to Build 20648709675
  • 00854030 on github
  • Prev Job for on main (#20646713635.13)
  • Next Job for on main (#20649650818.5)