joagonzalez / cyclops-devops-agent / build 15243805511

26 May 2025 12:50AM UTC coverage: 85.284% (-12.4%) from 97.685%

Trigger: push · github · web-flow
Merge pull request #20 from joagonzalez/rc-v0.1.0 (Rc v0.1.0)

45 of 84 new or added lines in 5 files covered (53.57%)

255 of 299 relevant lines covered (85.28%)

0.85 hits per line
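For reference, the figures above follow directly from the covered/relevant ratios in the report. This is a sketch of the assumed arithmetic, not Coveralls' own implementation:

```python
# Sketch: deriving the report's percentages from covered/relevant counts
# (assumed arithmetic; not Coveralls' actual code).
def pct(covered: int, relevant: int, digits: int) -> float:
    return round(100 * covered / relevant, digits)

overall = pct(255, 299, 3)            # overall coverage -> 85.284
new_lines = pct(45, 84, 2)            # new/added line coverage -> 53.57
delta = round(overall - 97.685, 1)    # change from previous build -> -12.4
print(overall, new_lines, delta)
```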

Source File: /src/api/llm.py (69.23% covered)
"""
API endpoints for sending prompts to Large Language Models (LLMs)
from external tools and applications.
"""
from typing import Any

from fastapi import APIRouter, status

from src.config.settings import config
from src.services.chatgpt import ChatGPTService
from fastapi import Body


router = APIRouter()
llmClient = ChatGPTService(
    api_key=config["OPENAI"]["API_KEY"], model=config["OPENAI"]["MODEL"]
)

@router.post("/query/", tags=["LLM"], status_code=status.HTTP_200_OK)
async def query_llm(prompt: str = Body(..., embed=True), secret: str = 'letmepass') -> Any:
    """
    Sends a prompt to the instantiated LLM and returns the response.

    Args:
        prompt (str): The input prompt to send to the LLM.

    Returns:
        Any: The LLM's response.
    """
    if secret != 'cyclops2025':  # new line, not covered
        return {"error": "Unauthorized access. Invalid secret."}  # new line, not covered

    response = llmClient.chat(  # new line, not covered
        prompt=prompt,
        system_prompt=config["OPENAI"]["SYSTEM_PROMPT"],
        temperature=config["OPENAI"]["TEMPERATURE"],
    )
    return {"response": response}  # new line, not covered
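All four uncovered lines sit behind the new secret gate: the default `secret` (`'letmepass'`) never matches the expected value, so a request must pass the secret explicitly to reach the LLM call. A minimal sketch of that control flow, with the LLM client replaced by a stub (the stub is an assumption; the real `ChatGPTService.chat` requires an OpenAI key):

```python
def query_llm_gate(prompt: str, secret: str = 'letmepass') -> dict:
    """Mirrors the endpoint's auth check; the LLM call is a stand-in stub."""
    if secret != 'cyclops2025':
        return {"error": "Unauthorized access. Invalid secret."}
    return {"response": f"stub reply to: {prompt}"}

# The default secret is always rejected; only an explicit matching secret
# reaches the LLM branch -- which is why those lines show as uncovered.
print(query_llm_gate("deployment status?"))
print(query_llm_gate("deployment status?", secret="cyclops2025"))
```

Because the default can never authorize a request, no existing test that calls the endpoint without `secret=cyclops2025` can exercise lines 28-36, which explains the 53.57% coverage of new lines in this merge.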