
pytorch / opacus / build 13277871533 / job 3

Build:
DEFAULT BRANCH: main (80%)
Ran 12 Feb 2025 09:52PM UTC · Files: 118 · Run time: 410 min

06 Feb 2025 01:59AM UTC · coverage: 85.138% (+0.04%) from 85.096%
Job 13277871533.3 · push · github · facebook-github-bot
Add per-sample gradient norm computation as a functionality (#724)

Summary:
Pull Request resolved: https://github.com/pytorch/opacus/pull/724

Per-sample gradient norms are already computed as part of Ghost Clipping, but they are useful more generally. This exposes them as a standalone functionality.

```
...

loss.backward()
per_sample_norms = model.per_sample_gradient_norms
```
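The snippet above elides the model and training setup. As a self-contained illustration of the quantity being exposed (a sketch of the definition, not the Opacus implementation), the per-sample gradient norm of sample i is the L2 norm of that sample's gradient concatenated across all model parameters. The helper name `per_sample_grad_norms` below is hypothetical:

```python
import math

def per_sample_grad_norms(per_param_grads):
    """Compute the L2 gradient norm of each sample in a batch.

    per_param_grads: one entry per model parameter; each entry is a list
    of length batch_size whose i-th element is sample i's flattened
    gradient for that parameter, as a list of floats.
    (Hypothetical helper for illustration, not the Opacus API.)
    """
    batch_size = len(per_param_grads[0])
    sum_sq = [0.0] * batch_size
    for param_grads in per_param_grads:
        for i, grad_i in enumerate(param_grads):
            # Accumulate the squared entries of sample i's gradient
            # for this parameter.
            sum_sq[i] += sum(v * v for v in grad_i)
    return [math.sqrt(s) for s in sum_sq]

# Batch of 2 samples, model with 2 parameters.
grads = [
    [[3.0], [0.0]],   # parameter 1: per-sample gradients
    [[4.0], [12.0]],  # parameter 2: per-sample gradients
]
norms = per_sample_grad_norms(grads)  # -> [5.0, 12.0]
```

In DP-SGD-style clipping, these norms determine each sample's clipping factor, `min(1, C / norm)` for clipping threshold C, which is why exposing them directly is convenient for debugging and analysis.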

Reviewed By: iden-kalemaj

Differential Revision: D68634969

fbshipit-source-id: 7d5cb8a05

5047 of 5928 relevant lines covered (85.14%)

0.85 hits per line
