pytorch / opacus / 14370790931 / 3
Build:
Default branch: main (80%)
Ran 10 Apr 2025 02:28AM UTC
Files: 66
Run time: 2s

10 Apr 2025 02:08AM UTC coverage: 48.492% (+0.02%) from 48.475%
Job 14370790931.3 · push · github · facebook-github-bot

Fix Fast Gradient Clipping bias gradient calculation for three dim data (#751)

Summary:
Pull Request resolved: https://github.com/pytorch/opacus/pull/751

The bias gradient calculation for three-dimensional data was incorrect.

Let `G = g^Tg`, where `g` is the per-sample activation gradient of dimensions `T x d`, with `T` the number of tokens and `d` the dimension.

The per-sample gradient norm with respect to the bias is
`vec(G)^T vec(1)`, not the erroneous `vec(G)^T vec(G)` used before. This diff fixes it.
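
A minimal sketch of the corrected computation, assuming `g` is the `T x d` per-sample activation gradient and taking `G` to be the `T x T` Gram matrix `g g^T` of per-token gradients (illustrative shapes and names, not Opacus's actual implementation):

```python
import torch

B, T, d = 4, 5, 3                      # batch, tokens, feature dim (illustrative)
g = torch.randn(B, T, d)               # per-sample activation gradients

# The per-sample bias gradient is the sum over tokens; its squared norm
# is the reference value the clipping formula must reproduce.
true_sq_norm = g.sum(dim=1).pow(2).sum(dim=1)          # shape (B,)

G = torch.bmm(g, g.transpose(1, 2))                    # (B, T, T) Gram matrices

correct = G.sum(dim=(1, 2))        # vec(G)^T vec(1): sum of all entries of G
wrong = G.pow(2).sum(dim=(1, 2))   # vec(G)^T vec(G): squared Frobenius norm of G

print(torch.allclose(correct, true_sq_norm, atol=1e-5))   # True
print(torch.allclose(wrong, true_sq_norm, atol=1e-5))     # False in general
```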

Reviewed By: aparna-aketi, HuanyuZhang

Differential Revision: D70823094

fbshipit-source-id: c1fe1dd7f

1399 of 2885 relevant lines covered (48.49%)

0.48 hits per line

Source Files on job run-3 (14370790931.3): 66 files listed, 1 changed. Commit 7264cd73 on github.