Build: 118 files, run time 410min
Add per-sample gradient norm computation as a functionality (#724)

Summary: Pull Request resolved: https://github.com/pytorch/opacus/pull/724

Per-sample gradient norms are computed for Ghost Clipping, but they can be useful more generally. This exposes them as a standalone functionality:

```
...
loss.backward()
per_sample_norms = model.per_sample_gradient_norms
```

Reviewed By: iden-kalemaj
Differential Revision: D68634969
fbshipit-source-id: 7d5cb8a05
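To illustrate what a per-sample gradient norm is (independently of Opacus), here is a minimal sketch in plain Python. It computes, for each sample, the L2 norm of that sample's own loss gradient for a linear model with squared loss, using the closed form grad_i = 2(w·x_i − y_i)·x_i. This is an illustrative assumption-free toy, not the Opacus implementation, which derives the norms during `loss.backward()`.

```python
import math

def per_sample_gradient_norms(w, xs, ys):
    """L2 norm of each sample's gradient of loss_i = (w.x_i - y_i)^2 w.r.t. w.

    Analytically, grad_i = 2 * (w.x_i - y_i) * x_i, so
    ||grad_i|| = 2 * |w.x_i - y_i| * ||x_i||.
    """
    norms = []
    for x, y in zip(xs, ys):
        residual = sum(wi * xi for wi, xi in zip(w, x)) - y
        x_norm = math.sqrt(sum(xi * xi for xi in x))
        norms.append(2.0 * abs(residual) * x_norm)
    return norms

# Toy data: two samples, two parameters.
w = [1.0, -1.0]
xs = [[3.0, 4.0], [0.0, 1.0]]
ys = [0.0, 0.0]
print(per_sample_gradient_norms(w, xs, ys))  # → [10.0, 2.0]
```

In DP-SGD these per-sample norms are what clipping is applied to; exposing them directly lets users inspect gradient magnitudes without running a full private training step.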
Coverage: 5047 of 5928 relevant lines covered (85.14%), 0.85 hits per line.