Ran 3 jobs on 121 files | Run time: 1min | Event: push (GitHub)
How to initialize dataloader with DPDDP when poisson sampling is set to False (#745)

Summary: Pull Request resolved: https://github.com/pytorch/opacus/pull/745

Addresses https://github.com/pytorch/opacus/issues/444

When Poisson sampling is enabled, it acts as a distributed sampler by default, i.e., it splits the dataset equally across devices, and each device then independently selects each sample from its subset with probability `sample_rate`. When Poisson sampling is disabled, however, the user must explicitly include a distributed sampler in the dataloader and divide the batch size by the world size, as shown below:

```
dataloader = DataLoader(
    dataset,
    batch_size=batch_size // world_size,
    sampler=DistributedSampler(dataset),
)
```

This is expected behavior and therefore requires no fix; the documentation is updated to include this information.

Reviewed By: HuanyuZhang

Differential Revision: D71654351

fbshipit-source-id: d7ce3aeb7
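To make the sharding behavior concrete, here is a minimal pure-Python sketch of the round-robin index assignment that `DistributedSampler` performs, together with the per-device batch-size adjustment the commit message describes. The helper names (`shard_indices`, `per_device_batch_size`) are hypothetical and for illustration only; this is not Opacus or PyTorch code, and it simplifies by skipping shuffling and assuming the dataset size divides evenly by the world size.

```python
# Sketch (assumption, not library code): each rank sees a disjoint shard of
# the dataset, and each device loads batch_size // world_size samples per
# step so that the effective global batch size is preserved.

def shard_indices(dataset_len, rank, world_size):
    # DistributedSampler-style round-robin assignment of indices to ranks
    # (no shuffling; dataset_len assumed divisible by world_size).
    return list(range(rank, dataset_len, world_size))

def per_device_batch_size(global_batch_size, world_size):
    # Divide the global batch size evenly across devices.
    return global_batch_size // world_size

world_size = 4
dataset_len = 16
shards = [shard_indices(dataset_len, r, world_size) for r in range(world_size)]

# The shards are disjoint and together cover the dataset exactly once.
assert sorted(i for s in shards for i in s) == list(range(dataset_len))

print(shards[0])                     # indices seen by rank 0: [0, 4, 8, 12]
print(per_device_batch_size(64, 4))  # 16 samples per device per step
```

With Poisson sampling enabled, none of this is needed: the dataset is split across devices automatically, and each device draws samples from its shard with probability `sample_rate`.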
5310 of 6193 relevant lines covered (85.74%)
1.91 hits per line
| Lines | Coverage | ∆ | File |
|---|---|---|---|
| 9 | 87.5% | -11.25% | opacus/utils/tensor_utils.py |
| ID | Job ID | Files | Coverage | |
|---|---|---|---|---|
| 1 | run-3 - 14090700915.1 | 66 | 48.44% | GitHub Action Run |
| 2 | run-1 - 14090700915.2 | 120 | 85.56% | GitHub Action Run |
| 3 | run-2 - 14090700915.3 | 120 | 85.45% | GitHub Action Run |