Hi, I am very new to Torch.
From what I read in torch/cutorch#42, the multi-GPU solution is not true data-parallel computation, right? The forward and backward passes can, however, be run on different GPUs. I have read the imagenet-multiGPU example, but how did you copy data across GPUs in the trainBatch function?
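For reference, my current understanding is that a raw cross-GPU copy in cutorch looks something like the sketch below (the device ids and tensor shape are arbitrary placeholders, not taken from the example). Is this roughly what trainBatch does under the hood?

```lua
require 'cutorch'

-- Allocate a source tensor on GPU 1 (assumed device id)
cutorch.setDevice(1)
local src = torch.CudaTensor(4, 4):fill(1)

-- Allocate a destination tensor of the same shape on GPU 2
cutorch.setDevice(2)
local dst = torch.CudaTensor(4, 4)

-- :copy() transfers data between devices; cutorch handles the
-- cross-GPU transfer (peer-to-peer when available, otherwise
-- staged through host memory)
dst:copy(src)
```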