
pytorch / opacus / build 12981869401 / job 3

Default branch: main (badge coverage: 80%)
Ran 27 Jan 2025 04:25AM UTC · Files: 65 · Run time: 1s
26 Jan 2025 01:01AM UTC · coverage: 48.819% (-0.1%) from 48.94%
Job 12981869401.3 · push · github · facebook-github-bot
Modifying DPLossFastGradientClipping to add support for generative tasks with ghost clipping (#722)

Summary:
Pull Request resolved: https://github.com/pytorch/opacus/pull/722

Generative NLP tasks output predictions of shape (B, T, C), i.e., (batch_size, sequence_length, vocab_size). To compute the cross-entropy loss in this case, the predictions are usually reshaped to (BxT, C) and the targets to (BxT). This creates an issue for Ghost Clipping's per-sample loss computation, because BxT is then treated as the batch size: the current implementation gives the loss_per_sample and coeff variables shapes of BxT and B, respectively, which causes a shape-mismatch error. This diff fixes that error by collapsing loss_per_sample to shape B, i.e., the loss is averaged (or summed) across the sequence_length dimension.
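The shape bookkeeping above can be sketched in a small standalone example. This is a hypothetical illustration of the described fix, not Opacus's actual code: it computes per-token cross-entropy on flattened (BxT, C) logits, then folds the T dimension back in and averages so the per-sample loss has shape (B,), matching the shape-B coeff variable.

```python
import numpy as np

def per_sample_loss(logits, targets):
    """Collapse per-token cross-entropy losses to one loss per sample.

    logits: (B, T, C) float array; targets: (B, T) int array.
    Sketch of the fix described in the commit message, not Opacus code.
    """
    B, T, C = logits.shape
    # Flatten as the usual generative-task loss does: (B*T, C) and (B*T,)
    flat_logits = logits.reshape(B * T, C)
    flat_targets = targets.reshape(B * T)
    # Numerically stable log-softmax per row
    shifted = flat_logits - flat_logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Per-token losses have shape (B*T,) -- the source of the mismatch,
    # since Ghost Clipping would see B*T as the batch size here.
    token_losses = -log_probs[np.arange(B * T), flat_targets]
    # The fix: restore the T dimension and average over it, giving shape (B,)
    return token_losses.reshape(B, T).mean(axis=1)

B, T, C = 4, 8, 10
rng = np.random.default_rng(0)
losses = per_sample_loss(rng.normal(size=(B, T, C)),
                         rng.integers(0, C, size=(B, T)))
print(losses.shape)  # (4,) -- one loss per sample, as Ghost Clipping expects
```

Whether to average or sum over the sequence dimension mirrors the usual cross-entropy `reduction` choice; either way the result has shape (B,), which is what matters for the per-sample clipping coefficients.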

Reviewed By: EnayatUllah

Differential Revision: D68047256

fbshipit-source-id: ad7614e2c

1385 of 2837 relevant lines covered (48.82%)

0.49 hits per line

Source files on job run-3 (12981869401.3): 65 listed, 1 changed, 1 with coverage changes
Commit c7d61443 on github