
pytorch / opacus / build 13255676524
Coverage: 80%
Default branch: main
Ran: 11 Feb 2025 04:13AM UTC
Jobs: 3
Files: 0
Run time: –
Status: pending completion
Build 13255676524 (push, via GitHub)
Committer: facebook-github-bot
Add per-sample gradient norm computation as a functionality (#724)

Summary:
Pull Request resolved: https://github.com/pytorch/opacus/pull/724

Per-sample gradient norm is computed for Ghost Clipping, but it can be useful generally. Exposed it as a functionality.

```
...

loss.backward()
per_sample_norms = model.per_sample_gradient_norms

```
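To illustrate what the exposed attribute computes, here is a dependency-free sketch (not Opacus code; all names below are hypothetical) of per-sample gradient norms for a linear model with squared-error loss. For loss_i = 0.5 * (w·x_i - y_i)^2 the per-sample gradient is (w·x_i - y_i) * x_i, and the attribute above would hold the L2 norm of each such gradient, one value per sample in the batch.

```python
import math

def per_sample_grad_norms(w, xs, ys):
    """L2 norm of each per-sample gradient of 0.5 * (w.x_i - y_i)^2 w.r.t. w."""
    norms = []
    for x, y in zip(xs, ys):
        # Residual for this sample: w.x_i - y_i
        err = sum(wj * xj for wj, xj in zip(w, x)) - y
        # Per-sample gradient: err * x_i
        grad = [err * xj for xj in x]
        norms.append(math.sqrt(sum(g * g for g in grad)))
    return norms

w = [1.0, 2.0]
xs = [[1.0, 0.0], [0.0, 1.0], [3.0, 4.0]]
ys = [0.0, 0.0, 0.0]
print(per_sample_grad_norms(w, xs, ys))  # → [1.0, 2.0, 55.0]
```

These norms are exactly what DP-SGD-style clipping needs: each sample's gradient is scaled down to a maximum norm before aggregation, which is why Ghost Clipping already computes them internally.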

Reviewed By: iden-kalemaj

Differential Revision: D68634969

fbshipit-source-id: 7d5cb8a05
Jobs
ID  Job ID                  Ran                       Files  Coverage
1   run-2 - 13255676524.1   11 Feb 2025 04:24AM UTC   118    85.04%
2   run-3 - 13255676524.2   11 Feb 2025 04:13AM UTC   65     48.79%
3   run-1 - 13255676524.3   11 Feb 2025 04:24AM UTC   118    85.02%
Source Files on build 13255676524
Detailed source file information is not available for this build.
  • Back to Repo
  • Github Actions Build #13255676524
  • 0d186a4e on github
  • Prev Build on main (#13123053632)
  • Next Build on main (#13277871533)

© 2026 Coveralls, Inc