
MultiGPU and DataParallel in torch(without fbcunn) #449

Closed
lglhuada opened this issue Nov 9, 2015 · 2 comments

Comments


lglhuada commented Nov 9, 2015

Hi, I am very new to Torch.
(attached screenshot: torch_multigpu)
As I read in torch/cutorch#42, the multi-GPU solution is not real data-parallel computation, right? Can the forward and backward passes actually be run on different GPUs? I read the imagenet-multiGPU example; how did you copy data across GPUs in the trainBatch function?
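For context, the imagenet-multiGPU.torch example achieves data parallelism without fbcunn by using `nn.DataParallelTable` from the cunn package. A minimal sketch, assuming two visible CUDA devices and a `createModel` function of your own (hypothetical here), might look like:

```lua
require 'cunn'
require 'cutorch'

local nGPU = 2  -- assumption: two visible CUDA devices

-- DataParallelTable splits each minibatch along dimension 1,
-- runs a replica of the model on every GPU, and accumulates
-- the gradients back onto the first GPU.
local dpt = nn.DataParallelTable(1)
for i = 1, nGPU do
  cutorch.setDevice(i)
  dpt:add(createModel():cuda(), i)  -- createModel is a placeholder
end
cutorch.setDevice(1)

-- used like any other nn module:
-- local output = dpt:forward(input)      -- input is a CudaTensor batch
-- dpt:backward(input, gradOutput)
```

With this setup each GPU receives a slice of the batch, so forward and backward really do run on different GPUs in parallel.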

@lglhuada lglhuada closed this as completed Nov 9, 2015
@lglhuada lglhuada reopened this Nov 9, 2015
@lglhuada lglhuada closed this as completed Nov 9, 2015
soumith (Member) commented Nov 9, 2015

Please use our forums https://groups.google.com/forum/#!forum/torch7 for support.

lglhuada (Author) commented

Hi soumith, thanks for the information. Do you have any ideas about my question (fbcunn is not available)? Thanks in advance.
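Not an authoritative answer, but for anyone landing here: cutorch supports direct cross-GPU copies through the ordinary `:copy()` call (using peer-to-peer access or UVA when the hardware allows), which is essentially how data moves between devices inside a training loop like trainBatch. A hedged sketch:

```lua
require 'cutorch'

-- allocate a tensor on GPU 1
cutorch.setDevice(1)
local src = torch.CudaTensor(4, 4):fill(1)

-- allocate a tensor on GPU 2 and copy across devices;
-- cutorch performs the cross-GPU transfer inside :copy()
cutorch.setDevice(2)
local dst = torch.CudaTensor(4, 4)
dst:copy(src)

cutorch.synchronize()  -- wait for the asynchronous copy to finish
```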
