
f-dangel / backpack / build 8116261751

01 Mar 2024 07:30PM UTC · coverage: 98.375% (remained the same)

Pull Request #323: [FIX | FMT] RTD build, apply latest `black` and `isort`
Merge 610195223 into e9b1dd361 (via github / web-flow)

  • 97 of 97 new or added lines in 97 files covered (100.0%)
  • 43 existing lines in 18 files now uncovered
  • 4420 of 4493 relevant lines covered (98.38%)
  • 11.77 hits per line

Source File: /backpack/extensions/firstorder/gradient/batchnorm_nd.py (91.67% covered)
```python
"""Gradient extension for BatchNorm."""
from typing import Tuple, Union

from torch import Tensor
from torch.nn import BatchNorm1d, BatchNorm2d, BatchNorm3d

from backpack.core.derivatives.batchnorm_nd import BatchNormNdDerivatives
from backpack.extensions.backprop_extension import BackpropExtension
from backpack.utils.errors import batch_norm_raise_error_if_train

from .base import GradBaseModule


class GradBatchNormNd(GradBaseModule):
    """Gradient extension for BatchNorm."""

    def __init__(self):
        """Initialization."""
        super().__init__(
            derivatives=BatchNormNdDerivatives(), params=["bias", "weight"]
        )

    def check_hyperparameters_module_extension(
        self,
        ext: BackpropExtension,
        module: Union[BatchNorm1d, BatchNorm2d, BatchNorm3d],
        g_inp: Tuple[Tensor],
        g_out: Tuple[Tensor],
    ) -> None:  # noqa: D102
        batch_norm_raise_error_if_train(module)
```

In this build, every executable line is hit 12 times, except the body of `check_hyperparameters_module_extension` (the call to `batch_norm_raise_error_if_train`), which is uncovered.