
pytorch / opacus / 13255676524 / 3

Build:
DEFAULT BRANCH: main
Ran 12 Feb 2025 09:27PM UTC
Files 118
Run time 87min
06 Feb 2025 01:59AM UTC coverage: 85.02% (-0.08%) from 85.096%
13255676524.3

push · github · facebook-github-bot
Add per-sample gradient norm computation as a functionality (#724)

Summary:
Pull Request resolved: https://github.com/pytorch/opacus/pull/724

Per-sample gradient norms are already computed for Ghost Clipping, but they can be useful more generally. This change exposes them as a standalone feature.

```
...

# after the backward pass, per-sample gradient norms are available on the model
loss.backward()
per_sample_norms = model.per_sample_gradient_norms

```

Reviewed By: iden-kalemaj

Differential Revision: D68634969

fbshipit-source-id: 7d5cb8a05
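To illustrate what the exposed quantity is, here is a naive plain-PyTorch sketch that computes per-sample gradient norms by running a separate backward pass for each example. This is not the Opacus implementation (Ghost Clipping obtains these norms without materializing per-sample gradients), and the helper name `per_sample_grad_norms` is hypothetical:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 2)
criterion = nn.CrossEntropyLoss()
x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,))

def per_sample_grad_norms(model, x, y):
    # Naive reference: one backward pass per example.
    # The norm for example i is the L2 norm of the gradient of
    # loss_i with respect to all model parameters.
    norms = []
    for i in range(x.shape[0]):
        model.zero_grad()
        loss = criterion(model(x[i:i + 1]), y[i:i + 1])
        loss.backward()
        sq = sum(p.grad.pow(2).sum() for p in model.parameters())
        norms.append(sq.sqrt())
    return torch.stack(norms)

norms = per_sample_grad_norms(model, x, y)
print(norms.shape)  # torch.Size([8])
```

The quadratic cost of this loop (one backward pass per sample) is exactly what Ghost Clipping avoids; the PR simply surfaces the norms that computation already produces.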

5040 of 5928 relevant lines covered (85.02%)

0.85 hits per line

Source Files on job run-1 - 13255676524.3
Commit 0d186a4e on github