[rlgl] rlGetShaderBufferSize
seems to always return 0
#4154
Description
The `rlGetShaderBufferSize` function seems to always return 0, even after successfully uploading, processing, and downloading data with an SSBO. Looking at the code, the function is implemented using `glGetInteger64v`. After looking around a bit, `glGetBufferParameteri64v` appears to be the correct function to call in order to get the size of an SSBO. With that implementation, the actual size of the buffer is correctly returned. From what I can tell, `glGetInteger64v` queries global context state (such as an implementation limit), not the state of a specific buffer object. Is the current implementation the intended behaviour, or should it actually return the current size of the buffer?

Environment
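The difference between the two queries can be sketched as follows. This is not the verbatim rlgl code (the snippets from the issue were not preserved here), and it assumes an active OpenGL 4.3+ context with the function pointers already loaded:

```c
#include <GL/glcorearb.h>  // assumes a loader (glad/GLEW) provides the 4.3 entry points

// Hypothetical helper: return the size in bytes of the SSBO with the given id.
long long GetSSBOSize(unsigned int id)
{
    GLint64 size = 0;
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, id);

    // Wrong for this purpose: glGetInteger64v reads global context state
    // (e.g. an implementation-wide limit), never a specific buffer's size.
    // glGetInteger64v(GL_MAX_SHADER_STORAGE_BLOCK_SIZE, &size);

    // Correct: query the bound buffer object's own GL_BUFFER_SIZE parameter.
    glGetBufferParameteri64v(GL_SHADER_STORAGE_BUFFER, GL_BUFFER_SIZE, &size);

    return (long long)size;
}
```

`glGetBufferParameteri64v` operates on whatever buffer is bound to the given target, so the bind call before it is required.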
Windows 11, desktop:
Code Example
With the `glGetInteger64v` implementation, the code always outputs `target size: 100, actual size: 0`, but with the `glGetBufferParameteri64v` implementation it outputs `target size: 100, actual size: 100`.
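The original reproduction code was not preserved above; a minimal sketch of the same test, using rlgl's SSBO helpers and assuming raylib was built with `GRAPHICS_API_OPENGL_43` (SSBO support compiled in), might look like this:

```c
// Hypothetical reproduction: allocate a 100-byte SSBO and query its size back.
#include "raylib.h"
#include "rlgl.h"
#include <stdio.h>

int main(void)
{
    InitWindow(320, 240, "SSBO size test");   // creates the OpenGL context

    const unsigned int targetSize = 100;
    unsigned int ssbo = rlLoadShaderBuffer(targetSize, NULL, RL_DYNAMIC_COPY);

    // With the glGetInteger64v-based implementation this reports 0;
    // with glGetBufferParameteri64v it reports the allocated size.
    printf("target size: %u, actual size: %u\n",
           targetSize, rlGetShaderBufferSize(ssbo));

    rlUnloadShaderBuffer(ssbo);
    CloseWindow();
    return 0;
}
```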