
pytorch / opacus / 14372116584

Build:
DEFAULT BRANCH: main
Ran 10 Apr 2025 04:13AM UTC
Jobs 3
Files 121
Run time 1min
10 Apr 2025 02:08AM UTC coverage: 85.633% (-0.01%) from 85.647%
Build 14372116584 · push · github · facebook-github-bot
Fix Fast Gradient Clipping bias gradient calculation for three dim data (#751)

Summary:
Pull Request resolved: https://github.com/pytorch/opacus/pull/751

The bias gradient calculation for three-dimensional data was incorrect.

Let `G = g^T g`, where `g`, of dimensions `T x d`, is the per-sample activation gradient, with `T` the number of tokens and `d` the dimension.

The per-sample gradient norm with respect to the bias is `vec(G)^T vec(1)`, not the erroneous `vec(G)^T vec(G)` used before. This diff fixes it.

Reviewed By: aparna-aketi, HuanyuZhang

Differential Revision: D70823094

fbshipit-source-id: c1fe1dd7f
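The identity in the commit message can be checked numerically. Below is a minimal sketch (not the Opacus implementation), assuming `G` denotes the per-sample token Gram matrix `g g^T` of shape `T x T`, so that `vec(G)^T vec(1)` is the sum of all its entries; the batch/token/feature sizes are hypothetical:

```python
import numpy as np

# Hypothetical shapes: B samples, T tokens, d output features.
B, T, d = 4, 5, 3
rng = np.random.default_rng(0)
g = rng.standard_normal((B, T, d))  # per-sample activation gradients

# Per-sample bias gradient: sum over the token axis.
bias_grad = g.sum(axis=1)                             # (B, d)
norm_direct = np.linalg.norm(bias_grad, axis=1) ** 2  # squared norms, (B,)

# Gram-matrix route: G[b] = g[b] @ g[b].T, shape (T, T);
# vec(G)^T vec(1) is the sum of all entries of G.
G = np.einsum('bti,bsi->bts', g, g)  # (B, T, T)
norm_fixed = G.sum(axis=(1, 2))      # vec(G)^T vec(1)

assert np.allclose(norm_direct, norm_fixed)

# The erroneous vec(G)^T vec(G) is the squared Frobenius norm of G,
# a different quantity.
norm_wrong = (G ** 2).sum(axis=(1, 2))
```

Since the bias gradient is the column sum of `g`, its squared norm equals `1^T (g g^T) 1`, i.e. the sum over all entries of the Gram matrix, which is what the fixed formula computes.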

15 of 15 new or added lines in 2 files covered. (100.0%)

5299 of 6188 relevant lines covered (85.63%)

1.91 hits per line

Jobs
ID  Job ID                  Ran                       Files  Coverage
1   run-2 - 14372116584.1   10 Apr 2025 04:24AM UTC   120    85.46
2   run-1 - 14372116584.2   10 Apr 2025 04:25AM UTC   120    85.47
3   run-3 - 14372116584.3   10 Apr 2025 04:13AM UTC   66     48.49
  • Github Actions Build #14372116584
  • 7264cd73 on github
  • Prev Build on main (#14324926077)
  • Next Build on main (#14388730763)