| Files | Run time | Trigger |
|---|---|---|
| 120 | 2s | push (github) |
Remove presence of grad_sample from optimizer for FGC (#756)

Summary: Pull Request resolved: https://github.com/pytorch/opacus/pull/756

With Fast Gradient Clipping (FGC), `p.grad_sample` is already set to None in the backward hook after the per-layer norm is computed, so there is no need for the optimizer to set it to None again.

Reviewed By: EnayatUllah

Differential Revision: D74418221

fbshipit-source-id: 0f91288e0
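The commit rests on the FGC lifecycle of per-sample gradients: they are materialized and released entirely inside the backward hook, so the optimizer never sees them. Below is a minimal sketch of that pattern for a single `nn.Linear` layer; the attribute names `_activations` and `_norm_sample` are illustrative stand-ins, not Opacus's actual internals.

```python
import torch
import torch.nn as nn

def forward_hook(module, args, output):
    # Cache the input activations needed to form per-sample gradients.
    module._activations = args[0].detach()

def backward_hook(module, grad_input, grad_output):
    # Per-sample gradient of the weight: per-example outer product of the
    # incoming gradient and the cached activations.
    grad_sample = torch.einsum("bi,bj->bij", grad_output[0], module._activations)
    # Keep only the per-layer, per-sample gradient norm ...
    module.weight._norm_sample = grad_sample.flatten(start_dim=1).norm(2, dim=1)
    # ... and release the per-sample gradients right here in the hook.
    # Because they never outlive the hook, the optimizer has nothing to
    # set to None -- which is what #756 removes.
    del grad_sample

layer = nn.Linear(4, 3)
layer.register_forward_hook(forward_hook)
layer.register_full_backward_hook(backward_hook)

out = layer(torch.randn(8, 4))
out.sum().backward()
print(layer.weight._norm_sample.shape)  # torch.Size([8])
```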
Coverage: 5216 of 6086 relevant lines covered (85.7%), 0.86 hits per line.
| Coverage | ∆ | File | Lines | Relevant | Covered | Missed | Hits/Line |
|---|---|---|---|---|---|---|---|