ContinualAI / avalanche — build 4993189103 (pending completion)

Pull Request #1370: Add base elements to support distributed comms. Add supports_distributed plugin flag.

Coverage summary:
  • 258 of 822 new or added lines in 27 files covered (31.39%).
  • 80 existing lines in 5 files are now uncovered.
  • 15585 of 21651 relevant lines covered (71.98%), at 2.88 hits per line.
Source File: /avalanche/logging/base_logger.py (76.0% covered)
from abc import ABC

from typing import TYPE_CHECKING, List

from avalanche.distributed.distributed_helper import DistributedHelper

if TYPE_CHECKING:
    from avalanche.evaluation.metric_results import MetricValue
    from avalanche.training.templates import SupervisedTemplate


class BaseLogger(ABC):
    """Base class for loggers.

    Strategy loggers receive MetricValues from the Evaluation plugin and
    decide when and how to log them. MetricValues are processed
    by default using `log_metrics` and `log_single_metric`.

    Additionally, loggers may implement handlers for any callback supported
    by the plugin system of the template in use, which will be called
    automatically during the template's execution.
    This allows controlling when and how logging happens. For example,
    interactive loggers typically print at the end of each
    epoch/experience/stream.

    Each child class should implement the `log_single_metric` method, which
    logs a single MetricValue.
    """

    def __init__(self):
        super().__init__()

        if not DistributedHelper.is_main_process:
            raise RuntimeError(
                'You are creating a logger in a non-main process during a '
                'distributed training session. '
                'Jump to this error for an example on how to fix this.')

        # You have to create the loggers in the main process only. Otherwise,
        # metrics will end up duplicated in your log files and consistency
        # errors may arise. When creating the EvaluationPlugin in a
        # non-main process, just pass loggers=None.
        #
        # Recommended way:
        # if DistributedHelper.is_main_process:
        #     # Define the loggers
        #     loggers = [...]
        # else:
        #     loggers = None
        #
        # # Instantiate the evaluation plugin
        # eval_plugin = EvaluationPlugin(metricA, metricB, ..., loggers=loggers)
        #
        # # Instantiate the strategy
        # strategy = MyStrategy(..., evaluator=eval_plugin)

    def log_single_metric(self, name, value, x_plot):
        """Log a metric value.

        This method is called whenever new metrics are available.
        By default, all the values are ignored.

        :param name: str, metric name.
        :param value: the metric value; ignored if not supported by the
            logger.
        :param x_plot: an integer representing the x value associated with
            the metric value.
        """
        pass

    def log_metrics(self, metric_values: List["MetricValue"]) -> None:
        """Receive a list of MetricValues to log.

        This method is called whenever new metrics are available.

        :param metric_values: list of MetricValues to log.
        :return: None
        """
        for mval in metric_values:
            name = mval.name
            value = mval.value
            x_plot = mval.x_plot

            if isinstance(value, dict):
                for k, v in value.items():
                    n = f"{name}/{k}"
                    self.log_single_metric(n, v, x_plot)
            else:
                self.log_single_metric(name, value, x_plot)


__all__ = ["BaseLogger"]
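To make the dispatch in `log_metrics` concrete, here is a minimal, self-contained sketch of a child logger. It deliberately does not import avalanche: `FakeMetricValue` is a hypothetical stand-in exposing only the three attributes `log_metrics` reads (`name`, `value`, `x_plot`), and `ListLogger` reimplements the same dict-expansion logic shown above, so dict-valued metrics are split into one `"name/key"` entry per item.

```python
from dataclasses import dataclass
from typing import Any, List, Tuple

@dataclass
class FakeMetricValue:
    # Hypothetical stand-in for avalanche's MetricValue: only the three
    # attributes that log_metrics actually reads.
    name: str
    value: Any
    x_plot: int

class ListLogger:
    """Toy logger: collects (name, value, x_plot) tuples in a list.

    Mirrors BaseLogger's dispatch: dict-valued metrics are expanded into
    one "name/key" entry per dictionary item.
    """

    def __init__(self):
        self.records: List[Tuple[str, Any, int]] = []

    def log_single_metric(self, name, value, x_plot):
        # A real child class would write to stdout, TensorBoard, a file, ...
        self.records.append((name, value, x_plot))

    def log_metrics(self, metric_values) -> None:
        # Same logic as BaseLogger.log_metrics above.
        for mval in metric_values:
            if isinstance(mval.value, dict):
                for k, v in mval.value.items():
                    self.log_single_metric(f"{mval.name}/{k}", v, mval.x_plot)
            else:
                self.log_single_metric(mval.name, mval.value, mval.x_plot)

logger = ListLogger()
logger.log_metrics([
    FakeMetricValue("Top1_Acc", 0.91, 100),
    FakeMetricValue("ClassAcc", {"cat": 0.8, "dog": 0.7}, 100),
])
print(logger.records)
# → [('Top1_Acc', 0.91, 100), ('ClassAcc/cat', 0.8, 100), ('ClassAcc/dog', 0.7, 100)]
```

Note that the scalar metric passes through unchanged, while the dict-valued `ClassAcc` metric produces one record per key with a `/`-joined name.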
© 2025 Coveralls, Inc