
lucasliet / llm-telegram-bot / 21687984441

04 Feb 2026 08:53PM UTC coverage: 59.131% (-2.3%) from 61.43%

Trigger: push via github (web-flow)
feat: agent tool call loop (#22)

# Agent Loop Implementation - Summary

## 📦 Structure Created

```
src/service/openai/
├── agent/
│   ├── AgentLoopConfig.ts      # Configuration and types
│   ├── AgentLoopExecutor.ts    # Loop orchestrator
│   └── index.ts                # Exports
├── stream/
│   ├── StreamProcessor.ts      # Base interface
│   ├── ChatCompletionsStream.ts    # Chat API processor
│   ├── ResponsesAPIStream.ts       # Responses API processor
│   └── index.ts                # Exports
└── OpenAIService.ts            # Refactored service
```

## 🎯 Fundamental Change

### Before (original behavior)
```typescript
if (tool_calls.length > 0) {
  // Execute tools ONCE
  // Return the response
}
```

### After (agent behavior)
```typescript
while (tool_calls.length > 0) {
  // Execute tools
  // Accumulate results in the context
  // Make a new call to the model
  // Repeat until the model returns only a text message
}
```
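Put together, the two snippets above amount to a loop like the following sketch. The names `callModel` and `executeTool` are illustrative, not the project's actual API:

```typescript
// Minimal agent-loop sketch: keep calling the model, executing any tools it
// requests, until it answers with plain text or we hit the iteration cap.
type ToolCall = { id: string; name: string; arguments: string };
type ModelReply = { text: string; toolCalls: ToolCall[] };

async function runAgentLoop(
	callModel: (context: string[]) => Promise<ModelReply>,
	executeTool: (call: ToolCall) => Promise<string>,
	maxIterations = 10,
): Promise<string> {
	const context: string[] = [];
	for (let i = 0; i < maxIterations; i++) {
		const reply = await callModel(context);
		// Plain text with no tool calls: the agent is done.
		if (reply.toolCalls.length === 0) return reply.text;
		// Execute every requested tool and feed the results back.
		const results = await Promise.all(reply.toolCalls.map(executeTool));
		results.forEach((r, idx) => context.push(`${reply.toolCalls[idx].name}: ${r}`));
	}
	throw new Error('maxIterations reached without a final text reply');
}
```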

## 🔧 Main Components

### 1. AgentLoopConfig
- `maxIterations`: Iteration limit (default: 10)
- `maxContextTokens`: Context token limit (default: 100000)
- `toolExecutionTimeout`: Per-tool timeout (default: 30000 ms)
- Optional callbacks for observability
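A minimal sketch of this configuration shape; the three limits and their defaults come from the list above, while the callback name `onIteration` is an assumption:

```typescript
// Sketch of the agent-loop configuration described above.
interface AgentLoopConfig {
	maxIterations: number; // default: 10
	maxContextTokens: number; // default: 100000
	toolExecutionTimeout: number; // default: 30000 ms
	onIteration?: (iteration: number) => void; // hypothetical observability hook
}

const defaultAgentLoopConfig: AgentLoopConfig = {
	maxIterations: 10,
	maxContextTokens: 100_000,
	toolExecutionTimeout: 30_000,
};
```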

### 2. StreamProcessor (interface)
- `processStream()`: Processes the stream and extracts tool calls
- `formatToolResultsForNextCall()`: Formats results for the next call
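A plausible shape for this contract, inferred from the `ResponsesAPIStream.ts` listing further down; the exact result types in the repo may differ:

```typescript
// Inferred StreamProcessor contract: implementations read the provider's byte
// stream, forward it to the user, and extract any tool calls they see.
interface ExtractedToolCall {
	id: string;
	name: string;
	arguments: string;
}

interface StreamProcessingResult {
	toolCalls: ExtractedToolCall[];
	hasAssistantContent: boolean;
	rawContent: string;
}

interface StreamProcessor {
	processStream(
		reader: ReadableStreamDefaultReader<Uint8Array>,
		controller: ReadableStreamDefaultController<Uint8Array>,
	): Promise<StreamProcessingResult>;
	formatToolResultsForNextCall(results: unknown[]): unknown[];
}
```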

### 3. ChatCompletionsStreamProcessor
- Implements `StreamProcessor` for the Chat Completions API
- Accumulates tool calls that arrive in chunks
- Formats them as `ChatCompletionMessageParam[]`
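In the Chat Completions stream, each tool call arrives as delta chunks keyed by `index`, with the `id` sent once and `function.name`/`function.arguments` streamed piecemeal. A self-contained sketch of that accumulation, independent of the repo's actual implementation:

```typescript
// Accumulate chunked tool-call deltas from a Chat Completions stream,
// merging fragments that share the same `index` into one complete call.
type ToolCallDelta = {
	index: number;
	id?: string;
	function?: { name?: string; arguments?: string };
};

function accumulateToolCalls(
	deltas: ToolCallDelta[],
): { id: string; name: string; arguments: string }[] {
	const acc = new Map<number, { id: string; name: string; arguments: string }>();
	for (const d of deltas) {
		const entry = acc.get(d.index) ?? { id: '', name: '', arguments: '' };
		if (d.id) entry.id = d.id; // id arrives in the first chunk only
		if (d.function?.name) entry.name += d.function.name;
		if (d.function?.arguments) entry.arguments += d.function.arguments;
		acc.set(d.index, entry);
	}
	return [...acc.values()];
}
```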

### 4. ResponsesAPIStreamProcessor
- Implements `StreamProcessor` for the Responses API
- Processes SSE (Server-Sent Events)
- Formats them as `ResponseInputItem[]`

### 5. AgentLoopExecutor
- Orchestrates the recursive loop
- Executes tools in parallel
- Manages state and safeguards
- Keeps streaming active for the user
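One plausible way to implement the parallel execution and the per-tool timeout safeguard described above is `Promise.race` against a timer; this is a sketch, not the executor's actual code, and all names are illustrative:

```typescript
// Run a promise with a deadline: whichever settles first (the work or the
// timer rejection) wins, and the timer is always cleaned up.
async function executeWithTimeout<T>(work: Promise<T>, timeoutMs: number): Promise<T> {
	let timer: ReturnType<typeof setTimeout> | undefined;
	const timeout = new Promise<never>((_, reject) => {
		timer = setTimeout(() => reject(new Error('tool execution timed out')), timeoutMs);
	});
	try {
		return await Promise.race([work, timeout]);
	} finally {
		if (timer !== undefined) clearTimeout(timer);
	}
}

// Execute all requested tools concurrently, each with its own deadline.
async function runToolsInParallel(
	calls: { name: string; run: () => Promise<string> }[],
	timeoutMs = 30_000,
): Promise<{ name: string; result: string }[]> {
	return Promise.all(
		calls.map(async (c) => ({
			name: c.name,
			result: await executeWithTimeout(c.run(), timeoutMs),
		})),
	);
}
```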

## ✅ Compatibility

### With Subclasses
- `GithubCopilotService`
- `Perplexit... (continued)

113 of 269 branches covered (42.01%)

Branch coverage included in aggregate %.

584 of 1242 new or added lines in 17 files covered. (47.02%)

7 existing lines in 4 files now uncovered.

2228 of 3690 relevant lines covered (60.38%)

15.18 hits per line

Source File: /src/service/openai/stream/ResponsesAPIStream.ts
```typescript
import OpenAi from 'npm:openai';
import { ExtractedToolCall, StreamProcessingResult, StreamProcessor } from './StreamProcessor.ts';
import { ToolExecutionResult } from '../agent/AgentLoopConfig.ts';

/**
 * Stream processor for OpenAI Responses API.
 * Handles extraction of function calls from the SSE streaming response.
 */
export class ResponsesAPIStreamProcessor implements StreamProcessor {
	async processStream(
		reader: ReadableStreamDefaultReader<Uint8Array>,
		controller: ReadableStreamDefaultController<Uint8Array>,
	): Promise<StreamProcessingResult> {
		const pendingCalls = new Map<number, ExtractedToolCall>();
		const completedCalls: ExtractedToolCall[] = [];
		let hasAssistantContent = false;
		let rawContent = '';
		let buffer = '';
		const decoder = new TextDecoder();

		while (true) {
			const { done, value } = await reader.read();
			if (done) break;

			// Always enqueue to user (streaming)
			controller.enqueue(value);
			buffer += decoder.decode(value, { stream: true });

			let newlineIndex;
			while ((newlineIndex = buffer.indexOf('\n')) !== -1) {
				const line = buffer.slice(0, newlineIndex).trim();
				buffer = buffer.slice(newlineIndex + 1);

				if (!line) continue;

				let jsonPayload = line;
				if (line.startsWith('data:')) {
					jsonPayload = line.slice(5).trim();
				}

				if (!jsonPayload || jsonPayload === '[DONE]') continue;

				try {
					const event = JSON.parse(jsonPayload);

					// Detect start of function call
					if (
						event.type === 'response.output_item.added' &&
						event.item?.type === 'function_call'
					) {
						pendingCalls.set(event.output_index, {
							id: event.item.call_id || '',
							name: event.item.name || '',
							arguments: '',
						});
					}

					// Accumulate arguments
					if (event.type === 'response.function_call_arguments.delta') {
						const pending = pendingCalls.get(event.output_index);
						if (pending) {
							pending.arguments += event.delta || '';
						}
					}

					// Finalize function call
					if (event.type === 'response.function_call_arguments.done') {
						const pending = pendingCalls.get(event.output_index);
						if (pending) {
							pending.arguments = event.arguments || pending.arguments;
							completedCalls.push(pending);
							pendingCalls.delete(event.output_index);
						}
					}

					// Detect text content
					if (event.type === 'response.output_text.delta') {
						hasAssistantContent = true;
						rawContent += event.delta || '';
					}
				} catch {
					// Line is not valid JSON
				}
			}
		}

		return {
			toolCalls: completedCalls,
			hasAssistantContent,
			rawContent,
		};
	}

	formatToolResultsForNextCall(
		results: ToolExecutionResult[],
	): OpenAi.Responses.ResponseInputItem[] {
		const items: OpenAi.Responses.ResponseInputItem[] = [];

		for (const result of results) {
			// Add the original function_call
			items.push({
				type: 'function_call',
				call_id: result.toolCallId,
				name: result.toolName,
				arguments: result.arguments,
			} as OpenAi.Responses.ResponseInputItem);

			// Add the result
			items.push({
				type: 'function_call_output',
				call_id: result.toolCallId,
				output: JSON.stringify(result.result),
			} as OpenAi.Responses.ResponseInputItem);
		}

		return items;
	}
}
```

© 2026 Coveralls, Inc