Jobs: 3 | Files: 121 | Run time: 1min | Trigger: push (GitHub)
Remove presence of grad_sample from optimizer for FGC (#756)

Summary: Pull Request resolved: https://github.com/pytorch/opacus/pull/756

In the case of FGC (Fast Gradient Clipping), grad_sample is set to None in the backward hook after the per-layer norm is computed, so there is no need for the optimizer to also set p.grad_sample to None.

Reviewed By: EnayatUllah
Differential Revision: D74418221
fbshipit-source-id: 0f91288e0
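To make the rationale concrete, here is a minimal sketch of the flow the commit message describes. This is not Opacus's actual hook or optimizer code: `fgc_backward_hook`, `_per_sample_norms`, and `optimizer_clear_grad_sample` are hypothetical names, and only the generic PyTorch tensor API is assumed.

```python
import torch.nn as nn

def fgc_backward_hook(module: nn.Module) -> None:
    """Hypothetical FGC-style hook body: reduce grad_sample to per-sample
    norms for this layer, then release it immediately."""
    for p in module.parameters(recurse=False):
        grad_sample = getattr(p, "grad_sample", None)  # shape: (batch, *p.shape)
        if grad_sample is None:
            continue
        # Per-sample L2 norm of this parameter's per-sample gradients.
        p._per_sample_norms = grad_sample.flatten(start_dim=1).norm(2, dim=1)
        # grad_sample has served its purpose; clear it here, in the hook,
        # rather than later in the optimizer.
        p.grad_sample = None

def optimizer_clear_grad_sample(params) -> None:
    """Before #756 the optimizer also reset p.grad_sample; after the change
    that step is redundant because the hook above already cleared it."""
    for p in params:
        assert getattr(p, "grad_sample", None) is None
```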
1 of 1 new or added line in 1 file covered. (100.0%)
5310 of 6184 relevant lines covered (85.87%)
1.91 hits per line
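The headline percentage is simply covered lines over relevant lines; a quick check against the totals above:

```python
covered, relevant = 5310, 6184
print(f"{100 * covered / relevant:.2f}%")  # -> 85.87%
```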
| ID | Job ID | Files | Coverage | |
|---|---|---|---|---|
| 1 | run-3 - 14963631569.1 | 66 | 48.52% | GitHub Action Run |
| 2 | run-2 - 14963631569.2 | 120 | 85.7% | GitHub Action Run |
| 3 | run-1 - 14963631569.3 | 120 | 85.7% | GitHub Action Run |
| Coverage | ∆ | File | Lines | Relevant | Covered | Missed | Hits/Line |
|---|---|---|---|---|---|---|---|