CUDA error when sorting big data #79
Comments
I have the same problem on an RTX 2070 GPU with CUDA 10.0 on Windows 10 and MATLAB 2019a.
Hi, I have no problem running more than 40 GB of data with a GTX 960 (2 GB memory), CUDA 10.0, Windows 10, MATLAB 2019a. Try changing the batch size.
I tried to change the batch size but still ran into an error. I changed the following line in the config file but am still getting an out-of-memory error, both from the GUI and from the master script. My file was about 50 GB, run on a Quadro K1200 (3.5 GB memory), CUDA 8, Windows 10, MATLAB 2017b: ops.NT = 32*1024 + ops.ntbuff; (I tried 1*1024 + ops.ntbuff as well!) Any idea how to fix this? @shoringvip @siaahmadi
Hi @Yliew6-gt, try to increase the batch size: Kilosort builds a matrix of size [nbatch x nbatch], so if your recording is long this matrix becomes too big.
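For reference, a minimal sketch of the batch-size settings in a Kilosort config file. The field names (ops.NT, ops.ntbuff) follow the standard configuration template; the specific sample counts below are illustrative assumptions, not values confirmed in this thread.

```matlab
% Batch-size settings in the Kilosort config (illustrative values).
% A larger ops.NT means fewer batches (nbatch), which shrinks the
% [nbatch x nbatch] matrix mentioned above for long recordings.
ops.ntbuff = 64;                    % extra buffer samples at the edges of each batch
ops.NT     = 64*1024 + ops.ntbuff;  % samples per batch; must be a multiple of 32 + ntbuff
% Rough estimate of the number of batches for a recording with Nsamples samples:
%   nbatch ~ ceil(Nsamples / (ops.NT - ops.ntbuff))
```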
Hi @shoringvip, did you fix this issue?
Closed due to lack of activity.
Hi!
First of all, thanks for the really useful piece of software you have built.
When sorting data smaller than 6 GB, Kilosort runs fine.
However, when sorting data of about 8 GB, Kilosort throws the following error:
The environment is as follows:
GPU: Tesla K40C
OS: Ubuntu 18.04
CUDA: 9.1
MATLAB: 2018b
Has anyone else seen anything like this before, or does anyone have ideas on how to troubleshoot it?
Thanks in advance!
Best,
Zhen
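As a general troubleshooting step (not specific to this thread), you can check how much GPU memory MATLAB actually sees before launching a sort; gpuDevice and reset are standard Parallel Computing Toolbox calls.

```matlab
% Query the current GPU from MATLAB and report free vs. total memory.
g = gpuDevice;
fprintf('GPU: %s, free %.2f GB of %.2f GB\n', ...
    g.Name, g.AvailableMemory/2^30, g.TotalMemory/2^30);
reset(g);   % optional: clear any leftover allocations from a previous run
```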