String encoding and decoding #253
Comments
@wongoo we have met such problems before, as I remember. This problem is not easy to fix.
@zonghaishang please check this issue; I will also look into it sometime later.
ref: #252
The current chunked-string decoding algorithm is complex and hard to maintain. I will try to refactor it.
#254 does not actually fix this case. I've created a pull request. |
AlexStocks added a commit that referenced this issue on Jan 12, 2021: Fix #253: Acquire sufficient bytes for string encoding buffers
zhaoyunxing92 pushed a commit that referenced this issue on Sep 4, 2021: Fix #253: Acquire sufficient bytes for string encoding buffers
Original issue description:

There are two known issues in string encoding and decoding.
The first one is an out-of-range problem: the code at `dubbo-go-hessian2/string.go` line 164 (commit e2494da) does not handle edge cases well. The inner loop at line 176 of the same file can actually exit with `charCount > CHUNK_SIZE` (more precisely, `charCount == CHUNK_SIZE + 1`), so a single chunk can take up to `(CHUNK_SIZE + 1) * 3` bytes.
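As a rough illustration of why the count can overshoot, here is a minimal sketch, assuming a rune outside the BMP counts as two chars (as a UTF-16 surrogate pair would) and each char encodes to at most 3 bytes; this is not the library's actual code, and `CHUNK_SIZE = 0x8000` is an assumed value:

```go
package main

import "fmt"

// Assumed value matching the hessian chunk size; the real constant
// lives in the library's string.go.
const CHUNK_SIZE = 0x8000

func main() {
	charCount := 0
	charCount++ // one ordinary BMP char makes the running count odd
	for charCount < CHUNK_SIZE { // guard is checked before each rune
		charCount += 2 // a supplementary rune counts as two chars
	}
	fmt.Println(charCount) // CHUNK_SIZE + 1
	// At up to 3 bytes per char, this chunk needs (CHUNK_SIZE+1)*3
	// bytes, while a buffer sized CHUNK_SIZE*3 is 3 bytes too small.
}
```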
A simple reproducible test case:
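The original snippet was not preserved here, but a reproduction along the lines the issue describes might look like the following sketch, assuming the library's public `NewEncoder`/`NewDecoder` API; the exact shape of the triggering input is an assumption:

```go
package main

import (
	"fmt"
	"strings"

	hessian "github.com/apache/dubbo-go-hessian2"
)

func main() {
	const chunkSize = 0x8000 // assumed CHUNK_SIZE
	// One BMP char plus chunkSize/2 supplementary runes gives a char
	// count of chunkSize+1, the overshoot case described above.
	s := "a" + strings.Repeat("\U0001F600", chunkSize/2)

	e := hessian.NewEncoder()
	if err := e.Encode(s); err != nil {
		panic(err)
	}
	d := hessian.NewDecoder(e.Buffer())
	got, err := d.Decode() // expected to fail before the fix
	if err != nil {
		panic(err)
	}
	if got.(string) != s {
		panic("round-trip mismatch")
	}
	fmt.Println("round trip ok")
}
```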
After a quick fix (sketched below), I encountered the second issue with the same test case above.
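The original patch is not shown here; judging from the eventual fix commit ("Acquire sufficient bytes for string encoding buffers"), the quick fix was presumably along these lines. This fragment is hypothetical, not the author's actual change:

```go
// Hypothetical quick fix inside the chunk-decoding path: size the byte
// buffer for the CHUNK_SIZE+1 chars the inner loop can actually yield.
buf := make([]byte, (CHUNK_SIZE+1)*3) // was: make([]byte, CHUNK_SIZE*3)
```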
After bisection, I assume the second issue was introduced in dea1174, because the same test passes if I apply the quick fix on top of 8dcaa20, the parent of dea1174. I haven't dug into that commit yet, since it's a bit involved.