I've been running into this error while trying to run Kilosort2. It occurs in the isolated_peaks_new function, but I suspect the issue originates much earlier in the code. From some earlier entries (45 & 142) I gathered that it may have something to do with the number of batches; however, setting ops.NT to a lower value does not resolve it. The file I'm trying to process has an ops.sampsToRead value of 68046020. I could try downsampling, but I don't know if or when that would affect the quality of the outcome.
Any thoughts on how to mitigate this issue?
It probably has to do with empty batches. If you really have batches with no spikes, then you should probably increase ops.NT substantially (by a factor of 8 or more). If you switch to kilosort3, then this shouldn't happen any more.
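A minimal sketch of that suggested change, assuming the convention used in the default Kilosort2 config files, where ops.NT is a multiple of 1024 plus the batch buffer (your own config's baseline values may differ):

```matlab
% Sketch only: raise the batch size by a factor of 8 so each batch is long
% enough to contain spikes. The 64*1024 + ops.ntbuff baseline follows the
% standard Kilosort2 config; adapt it to your own configuration file.
ops.ntbuff = 64;                       % samples of overlap buffer between batches
ops.NT     = 8*64*1024 + ops.ntbuff;   % batch size in samples (8x the default)
```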
It turned out to be partly due to the fact that the GPU nodes on our cluster load the wrong CUDA version by default. I can now get my data through, but only with very lenient criteria, and the result is all noise. If I high-pass filter the data at 150 Hz and inspect the trace, I also can't identify any spikes. Given that one of the first steps in KS2 is threshold-based detection on z-scored data, I'm afraid I have to conclude that my data quality is simply too poor for spike detection.
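For anyone wanting to reproduce that sanity check outside Kilosort, here is a rough sketch of the idea described above (not KS2's actual code): high-pass filter a single-channel trace at ~150 Hz, z-score it, and count negative threshold crossings. The sample rate, filter order, and threshold below are illustrative assumptions.

```python
# Sketch of a data-quality check: high-pass, z-score, count threshold crossings.
# Not Kilosort2's implementation; fs, cutoff, and thresh are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt

def count_threshold_crossings(trace, fs=30000.0, cutoff=150.0, thresh=-6.0):
    """High-pass filter a trace, z-score it, and count negative crossings."""
    b, a = butter(3, cutoff / (fs / 2), btype="high")
    filtered = filtfilt(b, a, trace)
    z = (filtered - filtered.mean()) / filtered.std()
    # A crossing is a sample below threshold whose predecessor was above it.
    below = z < thresh
    return int(np.sum(below[1:] & ~below[:-1]))

# Synthetic example: Gaussian noise plus a few injected negative deflections.
rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 30000)
for t in (5000, 12000, 25000):
    trace[t:t + 20] -= 15.0
print(count_threshold_crossings(trace))
```

If this returns zero on real data even with a lenient threshold, that is consistent with the conclusion above that no spikes survive the noise floor.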