
deepset-ai / haystack / 15206737205

23 May 2025 09:13AM UTC coverage: 90.072% (-0.1%) from 90.203%

Pull Request #9431: feat: Improve formatting in print streaming chunk
Merge ada554e21 into 720cc19d7

11341 of 12591 relevant lines covered (90.07%)

0.9 hits per line
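As a sanity check, the headline percentage is simply covered lines divided by relevant lines:

```python
# Coverage percentage = covered lines / relevant lines, as reported above.
covered, relevant = 11341, 12591
coverage = 100 * covered / relevant
print(f"{coverage:.2f}%")  # → 90.07%
```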

Source File

haystack/components/generators/utils.py — 18.18% covered
# SPDX-FileCopyrightText: 2022-present deepset GmbH <info@deepset.ai>
#
# SPDX-License-Identifier: Apache-2.0

from typing import Any, Dict

from openai.types.chat.chat_completion_chunk import ChoiceDeltaToolCall

from haystack.dataclasses import StreamingChunk


def print_streaming_chunk(chunk: StreamingChunk) -> None:
    """
    Callback function to handle and display streaming output chunks.

    This function processes a `StreamingChunk` object by:
    - Printing tool call metadata (if any), including function names and arguments, as they arrive.
    - Printing tool call results when available.
    - Printing the main content (e.g., text tokens) of the chunk as it is received.

    The function outputs data directly to stdout and flushes output buffers to ensure immediate display during
    streaming.

    :param chunk: A chunk of streaming data containing content and optional metadata, such as tool calls and
        tool results.
    """
    # Print tool call metadata if available (from ChatGenerator)
    if chunk.meta.get("tool_calls"):
        for tool_call in chunk.meta["tool_calls"]:
            # Convert to dict if tool_call is a ChoiceDeltaToolCall
            tool_call_dict: Dict[str, Any]
            if isinstance(tool_call, ChoiceDeltaToolCall):
                tool_call_dict = tool_call.to_dict()
            else:
                tool_call_dict = tool_call

            if tool_call_dict.get("function"):
                # print the tool name
                if tool_call_dict["function"].get("name"):
                    print("\n\n[TOOL CALL]\n", flush=True, end="")
                    print(f"Tool: {tool_call_dict['function']['name']} ", flush=True, end="")
                    print("\nArguments: ", flush=True, end="")

                # print the tool arguments
                if tool_call_dict["function"].get("arguments"):
                    print(tool_call_dict["function"]["arguments"], flush=True, end="")

    # Print tool call results if available (from ToolInvoker)
    if chunk.meta.get("tool_result"):
        print(f"\n\n[TOOL RESULT]\n{chunk.meta['tool_result']}", flush=True, end="")

    # Print the main content of the chunk (from ChatGenerator)
    if chunk.content:
        print(chunk.content, flush=True, end="")

    # End of LLM assistant message so we add two new lines
    # This ensures spacing between multiple LLM messages (e.g. Agent)
    if chunk.meta.get("finish_reason") is not None:
        print("\n\n", flush=True, end="")
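A minimal sketch of how a callback like this behaves, using a hypothetical `FakeChunk` stand-in for Haystack's `StreamingChunk` (plain dicts for tool calls, so the `ChoiceDeltaToolCall` conversion branch is skipped) rather than the real Haystack/OpenAI types:

```python
import io
from contextlib import redirect_stdout
from dataclasses import dataclass, field
from typing import Any, Dict


# Hypothetical stand-in for haystack.dataclasses.StreamingChunk:
# only the two attributes the callback reads.
@dataclass
class FakeChunk:
    content: str = ""
    meta: Dict[str, Any] = field(default_factory=dict)


def print_chunk(chunk: FakeChunk) -> None:
    # Same branch structure as print_streaming_chunk, with plain-dict
    # tool calls only.
    for tool_call in chunk.meta.get("tool_calls", []):
        fn = tool_call.get("function") or {}
        if fn.get("name"):
            print(f"\n\n[TOOL CALL]\nTool: {fn['name']} \nArguments: ", flush=True, end="")
        if fn.get("arguments"):
            print(fn["arguments"], flush=True, end="")
    if chunk.meta.get("tool_result"):
        print(f"\n\n[TOOL RESULT]\n{chunk.meta['tool_result']}", flush=True, end="")
    if chunk.content:
        print(chunk.content, flush=True, end="")
    if chunk.meta.get("finish_reason") is not None:
        print("\n\n", flush=True, end="")


# Simulate a short stream: two text tokens, then a final chunk.
buf = io.StringIO()
with redirect_stdout(buf):
    for c in [FakeChunk("Hello"), FakeChunk(", world"),
              FakeChunk("", {"finish_reason": "stop"})]:
        print_chunk(c)
print(repr(buf.getvalue()))  # → 'Hello, world\n\n'
```

The `end=""`/`flush=True` pattern is what makes tokens appear in the terminal as they stream, rather than buffered line by line; the trailing two newlines on `finish_reason` separate consecutive assistant messages.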