Don't understand codec tests for SInt64 / UInt64. #4

Open
ETiV opened this issue Aug 4, 2013 · 0 comments

Comments


ETiV commented Aug 4, 2013

https://github.com/pomelonode/pomelo-protobuf/blob/master/test/codecTest.js

The code for the uInt64 codec test is:

    var limit = 0x7fffffffffffffff;

Why multiply a 0~1 random number by 0x7FFF FFFF FFFF FFFF (63 bits) as the limit?
Why not use 0xFFFF FFFF FFFF FFFF (64 bits)?
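
For reference, here is a minimal sketch of the pattern I assume the test uses (a 0~1 random number multiplied by the limit); the variable names are illustrative, not the project's actual code:

    // Assumed test pattern: pick a random value in [0, limit).
    // Side note: JavaScript numbers are IEEE 754 doubles, so integers above
    // 2^53 - 1 (Number.MAX_SAFE_INTEGER) cannot be represented exactly.
    var limit = 0x7fffffffffffffff;                 // the 63-bit limit from the test
    var value = Math.floor(Math.random() * limit);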

And on the other hand, the code for the sInt64 codec test is:

    var limit = 0xfffffffffffff;

That's only 52 bits (13 hex digits).

But for a signed 64-bit int, since the value is multiplied by 2 inside the encode function, the limit should be 0x3FFF FFFF FFFF FFFF (62 bits: the top bit is for the sign, and the next bit is consumed by the *2).
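
For what it's worth, that *2 is what the standard protobuf "zigzag" mapping for signed varints does. A minimal sketch, assuming pomelo-protobuf's sInt64 encoder does something equivalent (this is not the project's actual code):

    // Standard protobuf zigzag mapping (sketch): non-negative n -> 2n,
    // negative n -> -2n - 1, so the sign lands in the lowest bit and the
    // magnitude is doubled.
    function zigzagEncode(n) {
      return n >= 0 ? 2 * n : -2 * n - 1;
    }

    zigzagEncode(3);   // 6
    zigzagEncode(-3);  // 5

    // Because the magnitude is doubled, a 64-bit container only has room
    // for inputs up to 0x3FFFFFFFFFFFFFFF in magnitude (62 bits).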

So, is there any documented limit for the UInt64 / SInt64 codecs?


I've tested my Objective-C codec: 0xFFFF... (64 bits) works fine for UInt64, and 0x3FFF... (62 bits) works fine for SInt64.
