Files: 115
Run time: 3s
Trigger: push (github)
Modifying DPLossFastGradientClipping to add support for generative tasks with ghost clipping (#722)

Summary: Pull Request resolved: https://github.com/pytorch/opacus/pull/722

Generative tasks for NLP output predictions of shape (B, T, C), i.e., (batch_size, sequence_length, vocab_size). To compute the cross-entropy loss in this case, the predictions are usually reshaped to (B*T, C) and the targets to (B*T). This creates an issue for the Ghost Clipping per-sample loss computation, because B*T is then treated as the batch size. In particular, the current implementation of Ghost Clipping produces loss_per_sample and coeff variables of shapes B*T and B respectively, causing a shape-mismatch error. This diff fixes the error by collapsing loss_per_sample to shape B, i.e., the loss is averaged/summed across the sequence_length dimension.

Reviewed By: EnayatUllah

Differential Revision: D68047256

fbshipit-source-id: ad7614e2c
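The shape mismatch described above can be sketched in a few lines of PyTorch. This is an illustration of the fix's idea, not the actual Opacus code: the shapes B, T, C and the variable names are hypothetical, and the collapse uses a mean over the sequence dimension (the PR allows averaging or summing).

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes for illustration (not taken from the PR).
B, T, C = 4, 7, 10  # batch_size, sequence_length, vocab_size
logits = torch.randn(B, T, C)
targets = torch.randint(0, C, (B, T))

# The usual generative-task reshape: (B, T, C) -> (B*T, C), (B, T) -> (B*T,).
per_token_loss = F.cross_entropy(
    logits.reshape(B * T, C), targets.reshape(B * T), reduction="none"
)  # shape: (B*T,) -- B*T looks like the batch size here, which is the bug.

# Ghost Clipping needs one loss value per *sample*, so collapse the
# sequence dimension (mean here; a sum would also give shape (B,)):
loss_per_sample = per_token_loss.view(B, T).mean(dim=1)  # shape: (B,)
```

With the collapsed shape (B,), loss_per_sample lines up with the per-sample clipping coefficients of shape B, so the element-wise multiplication no longer fails.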
5005 of 5771 relevant lines covered (86.73%)
0.87 hits per line