-
The VRAM doesn't combine, to the best of my knowledge, since a GPU needs to hold all the data for a single WAV file to process it as a whole (if I'm wrong, please someone correct me). That'd be why we need to play with the number of batches and … That said, I successfully used …
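For context on why it doesn't combine: under standard data-parallel training (PyTorch `DistributedDataParallel`, which launchers like `torchrun` or `accelerate` wrap), every GPU keeps a full replica of the model and processes its own slice of the batch, so each individual sample still has to fit on one card; extra GPUs only grow the effective batch size. A minimal sketch of that setup (the model and tensor shapes below are placeholders, not taken from any specific training script):

```python
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # torchrun sets RANK / LOCAL_RANK / WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Every rank holds a FULL copy of the weights, so the model plus the
    # activations for one sample must fit on a SINGLE GPU.
    model = torch.nn.Linear(80, 80).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # Each rank gets its own slice of the batch; one (long) clip is never
    # split across GPUs, which is why per-GPU VRAM stays the hard limit.
    x = torch.randn(8, 80, device=local_rank)  # placeholder "features"
    model(x).sum().backward()  # gradients are averaged across ranks

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched with e.g. `torchrun --nproc_per_node=2 ddp_sketch.py` (filename made up), doubling the GPUs doubles throughput per step, not the memory available to any single sample.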
-
Hi @martinambrus, how can I train the first stage with multiple GPUs? I tried renting 2 A6000s for testing, but training didn't start; it got stuck. When I use multiple GPUs, it freezes and doesn't continue. However, when I use just one GPU, it works fine. What am I doing wrong? I use the command …
and when it freezes, it says something about the kernel only working when I run it like this, but I have 2 A6000s.
Can you please tell me how you made it work?
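A general way to narrow down a hang like this (nothing specific to this repo): set `NCCL_DEBUG=INFO` and run a tiny all-reduce across both cards before the real training. If the sketch below also freezes, the problem is the machine's GPU/driver/NCCL setup rather than the training code; on some rented multi-GPU boxes, disabling peer-to-peer with `NCCL_P2P_DISABLE=1` is a common workaround.

```python
# nccl_check.py -- hypothetical filename; run with:
#   NCCL_DEBUG=INFO torchrun --nproc_per_node=2 nccl_check.py
import os

import torch
import torch.distributed as dist

dist.init_process_group(backend="nccl")
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

# A single all-reduce exercises GPU-to-GPU communication; the default op
# is SUM, so every rank should print the world size (2 for two A6000s).
t = torch.ones(1, device=local_rank)
dist.all_reduce(t)
print(f"rank {local_rank}: all_reduce -> {t.item()}")

dist.destroy_process_group()
```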
-
When renting GPUs, we have the option of multiple GPUs. Does the VRAM combine to make training with max_length 800 possible? Or is the VRAM limit per GPU?
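As the first reply notes, the limit is per GPU. A quick way to see which limit you would actually hit is to query each card's memory separately (plain PyTorch, nothing repo-specific); whether a given max_length fits is decided against one card's free VRAM, not the sum:

```python
import torch

# VRAM is per device: a single training sample must fit on ONE of these
# cards, not in their combined total.
for i in range(torch.cuda.device_count()):
    free, total = torch.cuda.mem_get_info(i)
    print(f"GPU {i}: {free / 2**30:.1f} GiB free of {total / 2**30:.1f} GiB")
```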