Build triggered by push (github)
Files: 66
Run time: 1s
Remove presence of grad_sample from optimizer for FGC (#756)

Summary: Pull Request resolved: https://github.com/pytorch/opacus/pull/756

In the case of Fast Gradient Clipping (FGC), grad_sample is set to None in the backward hook after the per-layer norm has been computed, so there is no need to set p.grad_sample to None in the optimizer.

Reviewed By: EnayatUllah
Differential Revision: D74418221
fbshipit-source-id: 0f91288e0
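For readers unfamiliar with the mechanism this commit refers to, here is a minimal sketch, not taken from the Opacus codebase: under Fast Gradient Clipping, a per-layer backward hook consumes grad_sample to compute each sample's gradient norm and then frees it immediately, which is why a second `p.grad_sample = None` in the optimizer became redundant. The hook name, the `_norm_sample` attribute, and the module-level loop below are illustrative assumptions, not the library's actual API.

```python
import torch.nn as nn


def fgc_backward_hook(module: nn.Module) -> None:
    """Hypothetical per-layer hook illustrating the FGC pattern:
    consume grad_sample to get per-sample norms, then drop it."""
    for p in module.parameters(recurse=False):
        grad_sample = getattr(p, "grad_sample", None)
        if grad_sample is None:
            continue
        # Per-sample gradient norm for this layer; shape: (batch_size,).
        # grad_sample has shape (batch_size, *p.shape).
        p._norm_sample = grad_sample.reshape(len(grad_sample), -1).norm(2, dim=1)
        # Free the per-sample gradients right away. Because the hook
        # already clears them here, an optimizer-side cleanup loop like
        #     for p in params: p.grad_sample = None
        # no longer has anything to do under FGC.
        p.grad_sample = None
```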
1398 of 2881 relevant lines covered (48.52%)
0.49 hits per line
| Coverage | ∆ | File | Lines | Relevant | Covered | Missed | Hits/Line |
|---|---|---|---|---|---|---|---|