pytorch / opacus / 13233359187 / 3
main: 80%

Build:
DEFAULT BRANCH: main
Ran 10 Feb 2025 04:25AM UTC
Files: 65
Run time: 2s

06 Feb 2025 01:59AM UTC · coverage: 48.788% (-0.03%) from 48.819%
Job 13233359187.3 · push · github · facebook-github-bot
Add per-sample gradient norm computation as a functionality (#724)

Summary:
Pull Request resolved: https://github.com/pytorch/opacus/pull/724

Per-sample gradient norms are computed for Ghost Clipping, but they can be useful more generally. This change exposes them as a standalone functionality.

```python
...

loss.backward()
# per-sample gradient norms become available after the backward pass
per_sample_norms = model.per_sample_gradient_norms
```
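To illustrate what such an attribute returns (one gradient norm per example in the batch), here is a minimal, torch-free sketch of the concept. The function name and the toy one-parameter model are illustrative assumptions, not part of the Opacus API:

```python
# Per-sample gradient norms, sketched for a linear model y_hat = w * x
# with squared-error loss L_i = (w * x_i - y_i)^2.  The gradient of the
# i-th sample's loss wrt w is dL_i/dw = 2 * (w * x_i - y_i) * x_i, and
# its L2 norm (for a 1-d gradient) is the absolute value.
def per_sample_grad_norms(w, xs, ys):
    norms = []
    for x, y in zip(xs, ys):
        grad = 2.0 * (w * x - y) * x  # gradient of this sample's loss only
        norms.append(abs(grad))       # L2 norm of a scalar gradient
    return norms

# Two samples -> two norms, computed independently per example.
norms = per_sample_grad_norms(w=1.0, xs=[1.0, 2.0], ys=[0.0, 1.0])
```

In Opacus these norms are what Ghost Clipping uses to scale each example's contribution; exposing them lets users inspect or reuse them without running the clipping step itself.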

Reviewed By: iden-kalemaj

Differential Revision: D68634969

fbshipit-source-id: 7d5cb8a05

1389 of 2847 relevant lines covered (48.79%)

0.49 hits per line

Source Files on job run-3 (13233359187.3)
  • 0d186a4e on github

© 2026 Coveralls, Inc