LMMS does not allow samples larger than 300MB and/or longer than 90 minutes #3205
Found a cached page here:
That's pretty unspecific. Maybe just beef up the size limit, or set it relative to available RAM? Can you make it crash if you remove the limit?
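A possible shape for that RAM-relative idea (a sketch only; Linux/glibc specific, and the one-quarter policy is an arbitrary placeholder):

```cpp
#include <unistd.h>
#include <cstdint>

// Query currently available physical memory. _SC_AVPHYS_PAGES is a
// glibc extension; Windows would need GlobalMemoryStatusEx() instead.
static int64_t availablePhysicalBytes()
{
	const long pages    = sysconf( _SC_AVPHYS_PAGES );
	const long pageSize = sysconf( _SC_PAGESIZE );
	return ( pages > 0 && pageSize > 0 )
		? static_cast<int64_t>( pages ) * pageSize
		: -1; // unknown
}

static int64_t sampleSizeLimit()
{
	const int64_t avail = availablePhysicalBytes();
	// Fall back to the old fixed ceiling when the query fails.
	if( avail < 0 ) return 100 * 1024 * 1024;
	// Allow at most a quarter of free RAM for one sample.
	return avail / 4;
}
```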
@Umcaruje we can bump it to 200MB if you like, but I just tested against a 900MB WAV and there are some usability issues with it.

Sample Track (900MB WAV)
Audio File Processor (900MB WAV)

Biggest observed problems
@zonkmachine good find. It inspired me to test this out, since there was no specific reasoning mentioned in the cached bug report for the 100MB ceiling.
@tresf How much of an increase in RAM usage is there with the 2GB spike? What's your idle usage?

Would it be possible to have a RAM limit setting in LMMS like Blender has for the VSE? Someone with 4GB of RAM might not want >2GB used by LMMS, but I wouldn't mind 10-12GB usage (if it's necessary for a big sample or speeds things up) as I have 16GB RAM.
If the ticket system on SourceForge is re-enabled, will the bug archive come back? Regarding the 100 MiB limit (which limits the file size, not the uncompressed audio), my idea was to use a file reader in those cases: a temporary uncompressed file, file seeks, etc.
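A minimal sketch of that file-reader idea, assuming libsndfile (which LMMS already uses); the class and method names here are hypothetical:

```cpp
#include <sndfile.h>
#include <vector>

// Keep the file handle open and decode windows of frames on demand with
// sf_seek()/sf_readf_float(), instead of holding the whole sample in RAM.
class StreamingSample
{
public:
	explicit StreamingSample( const char * path )
	{
		m_file = sf_open( path, SFM_READ, &m_info );
	}
	~StreamingSample() { if( m_file ) sf_close( m_file ); }

	// Read `count` frames starting at `start` into an interleaved buffer.
	std::vector<float> read( sf_count_t start, sf_count_t count )
	{
		std::vector<float> buf( static_cast<size_t>( count ) * m_info.channels );
		if( m_file && sf_seek( m_file, start, SEEK_SET ) == start )
		{
			const sf_count_t got = sf_readf_float( m_file, buf.data(), count );
			buf.resize( static_cast<size_t>( got ) * m_info.channels );
		}
		return buf;
	}

private:
	SNDFILE * m_file = nullptr;
	SF_INFO   m_info {};
};
```

Seeking only works on formats libsndfile can seek in; compressed sources would still need the temporary uncompressed file mentioned above.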
Yeah, I talked with @BaraMGB about this. The drawing routine is very inefficient when dealing with larger samples, as we're drawing this huge polyline for every sample point, which works very well with smaller samples but eats up the CPU for larger ones.
Or we actually make it unlimited and warn about instability, maybe? Databenders would probably appreciate this, and later on we could implement @jasp00's idea, or set the limit relative to RAM as @zonkmachine suggested.
I had a deeper look into the drawing routine in the meantime. It isn't as inefficient as I thought: it calculates the accuracy (frames per pixel) dynamically according to the zoom level. Perhaps a better way would be to calculate the drawing once and save it into a QPainterPath. But drawing paths costs CPU time too; I haven't tested this.
Ideally, it should be threaded off.
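A minimal Qt sketch combining both ideas: cache the waveform into a QPainterPath once, and build that path off the GUI thread with QtConcurrent. All class and member names are hypothetical, and it assumes the sample data can be read safely from a worker thread:

```cpp
#include <QtConcurrent/QtConcurrent>
#include <QFutureWatcher>
#include <QPainterPath>
#include <QPainter>
#include <QWidget>

class WaveformView : public QWidget
{
	// Rebuild the cached path in a worker thread whenever the zoom level
	// or the sample changes; paintEvent() stays cheap in the meantime.
	void rebuildPathAsync( int framesPerPixel )
	{
		auto * watcher = new QFutureWatcher<QPainterPath>( this );
		connect( watcher, &QFutureWatcher<QPainterPath>::finished, this,
			[this, watcher]()
			{
				m_path = watcher->result();
				watcher->deleteLater();
				update(); // schedule a repaint on the GUI thread
			} );
		watcher->setFuture( QtConcurrent::run(
			[this, framesPerPixel]() { return buildPath( framesPerPixel ); } ) );
	}

	void paintEvent( QPaintEvent * ) override
	{
		QPainter p( this );
		p.drawPath( m_path ); // one cached path instead of a huge polyline
	}

	QPainterPath buildPath( int framesPerPixel ) const; // walks the sample data
	QPainterPath m_path;
};
```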
Both. I think @Umcaruje should go ahead and bump up the limit to 200MB right now for RC2. It could considerably reduce the number of users who ever see this message. People who are on weaker machines, like I am, already expect things to slow down once in a while.
Or alleviate it completely, and open enhancement issues for the other items identified. >90min/900MB waves are quite rare when composing. In 2009, the average RAM was around 1GB. The crash should never occur; the rest is just very bad UX.
Just drop the file limit and join the new millennium.
OK, I'll drop the limit, but I think we should keep this issue open until the UX is fixed.
@Umcaruje, at the risk of being obtuse, who in the world is going to load a file that big in LMMS? The biggest WAV files I encounter are nowhere near that.
Per https://www.facebook.com/groups/GACToolTime/permalink/1696492363975805/, per the OP.
I just removed the check, and I get a crash every time I try to load a ~600MB WAV file:
At first I thought it was because I didn't have enough RAM available, but the crash happened even when I had around 1.5GB of RAM free. Edit: OS info: elementary OS Loki (Ubuntu 16.04 based), running on a laptop with an Intel Core i7 + 8GB of RAM.
The error message states that it tries to allocate 39855375 new chunks. That number of chunks is determined in the following line:

```cpp
int requiredChunks = size / MM_CHUNK_SIZE + ( size % MM_CHUNK_SIZE > 0 ? 1 : 0 );
```
I wonder whether the …
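For reference, that line is a plain ceiling division, but with a size_t input and an int result it can silently wrap for very large allocations. A sketch of a range-checked variant (the MM_CHUNK_SIZE value here is only a placeholder):

```cpp
#include <cstddef>
#include <limits>

constexpr size_t MM_CHUNK_SIZE = 64; // placeholder; the real value lives in the MemoryManager

// Same ceiling division as the quoted line, kept in size_t and checked
// before narrowing to int so a huge `size` cannot overflow.
static int requiredChunksChecked( size_t size )
{
	const size_t chunks = size / MM_CHUNK_SIZE + ( size % MM_CHUNK_SIZE > 0 ? 1 : 0 );
	if( chunks > static_cast<size_t>( std::numeric_limits<int>::max() ) )
	{
		return -1; // too large: let the caller fail gracefully
	}
	return static_cast<int>( chunks );
}
```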
Here's some dialog on the testing that was performed: #1088 (comment). And feedback from @jasp00: #1088 (comment).
For reference, MemoryManager::alloc() begins:

```cpp
void * MemoryManager::alloc( size_t size )
{
	if( !size )
	{
		return NULL;
	}
```

But it doesn't look like we test to see if the operation worked:

```cpp
void SampleBuffer::directFloatWrite( sample_t * & _fbuf, f_cnt_t _frames, int _channels )
{
	m_data = MM_ALLOC( sampleFrame, _frames );
	const int ch = ( _channels > 1 ) ? 1 : 0;
```

Suggestion to test m_data:

```cpp
bool SampleBuffer::directFloatWrite( sample_t * & _fbuf, f_cnt_t _frames, int _channels )
{
	m_data = MM_ALLOC( sampleFrame, _frames );
	if( !m_data )
	{
		return false;
	}
	else
	{
		const int ch = ( _channels > 1 ) ? 1 : 0;
		...
```
Freeze/crash on opening a ~93 MB file that is five hours of lo-fi audio. So we may have to test the actual length in samples too.
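A length check like that only needs the header, so it stays cheap even for huge files. A sketch using libsndfile's SF_INFO (the 90-minute figure mirrors the limit discussed here):

```cpp
#include <sndfile.h>

// Returns true when the file's decoded duration exceeds maxSeconds.
static bool exceedsDuration( const char * path, double maxSeconds )
{
	SF_INFO info {}; // must be zeroed before sf_open() in read mode
	SNDFILE * f = sf_open( path, SFM_READ, &info );
	if( !f ) return false; // let the normal open-error path handle it
	sf_close( f );
	return info.samplerate > 0 &&
		static_cast<double>( info.frames ) / info.samplerate > maxSeconds;
}

// e.g. exceedsDuration( path, 90 * 60 ) for a 90-minute ceiling
```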
Also: after refusing to open a file that is over 100 MB, it will crash when I delete the track that failed to load the file, or when LMMS closes.
I have a fix for both of the last crashes above, but I need to tidy it up "a bit".
@Umcaruje wrote:
What would the result be if there was no pattern drawing for files over a specified max size?
It's already drawing based on zoom level.
@musikBear if you just read the next comment by @BaraMGB:
So the drawing routine is not as bad as I thought. The real issue here is that LMMS crashes when I import a large file. And that's the main thing that needs to get fixed.
> So the drawing routine is not as bad as I thought.

I wonder if it's necessary in most cases to draw lines for both channels? The maximum or average should be quite enough, at least for the Song Editor. And then there's the small detail of having one …
@tresf well, this is still a problem though; LMMS should allow samples of any size. This is just saving that problem to be addressed later. I could probably edit this issue or make a new one along the lines of 'problems when operating with big sample files'.
@Umcaruje I agree. There are two issues that I see...
Since performance problems in general are already plaguing LMMS, the decision to file this as an individual bug is up to you. Regarding the original bug report, the investigation as well as a conscious decision (#3293) was made to keep it, so for the foreseeable future we won't be fixing this problem, which is why I've asked.
I agree, I think this deserves a new bug. I didn't intend to close the door on large files, but in case this gets to be a lengthy thing, those issues needed fixing. Also, I just stayed with 100MB and a proportional sample-length check. This was based on my computer's limits and the earlier values, as they agreed with each other. You could perhaps make this system-dependent, at least to some degree, so users with stronger machines can use them fully.
Edit: OT comment removed
QTractor is using libsndfile too. |
Sorry, I seem to have drawn some wrong conclusions about file sizes. I'm now running two 300MB files side by side with a 90-minute sample and it works just fine; a bit slow to load, of course. I suggest just bumping the limits to some new maximum, like at least 300 MB and 90 minutes.
I'm thinking of just increasing the limits a bit: 300 MB / 90 minutes?
Instead of just a message, we could have a warning over a certain size with an OK/Cancel button, so it's up to the user to proceed and test the limits of their own system.
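That warning could be something like this QMessageBox sketch (wording, helper name, and threshold are placeholders):

```cpp
#include <QMessageBox>
#include <QObject>

// Ask before loading a file above the soft limit; returns true when the
// user chooses to proceed anyway.
static bool confirmLargeSample( QWidget * parent, qint64 bytes, qint64 softLimit )
{
	if( bytes <= softLimit ) return true;
	const auto choice = QMessageBox::warning(
		parent,
		QObject::tr( "Large sample" ),
		QObject::tr( "This file is larger than %1 MB and may load slowly "
			"or use a lot of memory. Load it anyway?" )
			.arg( softLimit / ( 1024 * 1024 ) ),
		QMessageBox::Ok | QMessageBox::Cancel,
		QMessageBox::Cancel );
	return choice == QMessageBox::Ok;
}
```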
This issue popped up as I was browsing the Glitch Artists Collective: Tool Time Facebook group. One of the users said they were unable to load files that were >100MB, so I rendered out 2 hours of white noise in Audacity and tried to load it, and voilà, this message popped up in the terminal:

This error is generated in SampleBuffer.cpp, where the loading breaks if the file is too large (lmms/src/core/SampleBuffer.cpp, lines 204 to 208 in 919ca8b).

Now, this change was added by @tobydox back in '09 to fix a bug (#2458375), which I can't find; it's either forever gone or buried on SourceForge somewhere. I don't want to blindly remove that break without knowing the context of the bug, but 100MB is very limiting when dealing with uncompressed audio, not just for the purpose of databending, but also when remixing longer tracks and importing stems, so input on this is appreciated.