This repository has been archived by the owner on Sep 28, 2022. It is now read-only.

Commit

Revert "Update ddp_spawn.py"
This reverts commit f172101.
shuyingsunshine21 committed Mar 24, 2021
1 parent 6c095b2 commit 8222dc9
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions pytorch_lightning/plugins/training_type/ddp_spawn.py
@@ -21,6 +21,7 @@
 import torch.multiprocessing as mp
 from torch.nn.parallel.distributed import DistributedDataParallel
 from torch.optim import Optimizer
+import numpy

 from pytorch_lightning.distributed.dist import LightningDistributed
 from pytorch_lightning.overrides import LightningDistributedModule
@@ -78,6 +79,7 @@ def distributed_sampler_kwargs(self):

     def setup(self, model):
         os.environ["MASTER_PORT"] = str(self.cluster_environment.master_port())
+        os.environ["MKL_SERVICE_FORCE_INTEL"] = "1"

         # pass in a state q
         smp = mp.get_context("spawn")
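For context on the two restored lines: setting `MKL_SERVICE_FORCE_INTEL=1` in the parent process before any MKL-backed library (such as numpy) is imported is a common workaround for an `mkl-service` threading-layer incompatibility that can surface in spawned subprocesses. The diff itself does not explain the motivation, so the ordering requirement described here is an assumption. A minimal standalone sketch:

```python
import os

# Assumption: mkl-service reads this variable at import time, so it must be
# set before numpy (or any other MKL-backed library) is first imported.
os.environ["MKL_SERVICE_FORCE_INTEL"] = "1"

try:
    import numpy  # noqa: E402  # deliberately imported after the env var is set
except ImportError:
    # numpy is not installed in this environment; the env-var ordering
    # shown above is the point of the sketch and still applies.
    numpy = None
```

In `ddp_spawn.py` the same ordering holds naturally, since `setup()` sets the variable in the parent before `mp.get_context("spawn")` launches workers that re-import the module tree.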
