
Problem with buffers after upgrade from v7.5.0 to v10.7.0 #21956

Closed
Somebi opened this issue Jul 24, 2018 · 10 comments
Labels
buffer Issues and PRs related to the buffer subsystem.

Comments

@Somebi

Somebi commented Jul 24, 2018

v10.7.0

// If the length doesn't fit in 2 bytes, set the 7-bit length field to 127 (0x7F) and write the actual length into the next 8 bytes, big-endian
          else if(data_length<=self.max_8_byte) { // self.max_8_byte = 0xFFFFFFFFFFFFFFFF
            frame[1]+=127;
            var len_buff=new Buffer(8);
            len_buff.writeUIntBE(data_length,0,8);
            var frame=Buffer.concat([frame,len_buff]);
          }

Throws error:

RangeError [ERR_OUT_OF_RANGE]: The value of "byteLength" is out of range. It must be >= 1 and <= 6. Received 8
    at boundsError (internal/buffer.js:55:9)
    at Buffer.writeUIntBE (internal/buffer.js:588:3)

Same code works fine in v7.5.0. Were there some big, breaking changes without backwards compatibility?
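For reference, one way the snippet above can avoid the byteLength cap on current Node is to split the 64-bit length into two 32-bit big-endian writes, which core supports everywhere. This is a sketch, exact only for lengths up to Number.MAX_SAFE_INTEGER; appendLength64 is a hypothetical helper, not a core API:

```javascript
// Split a 64-bit length into two 32-bit halves and append them to the
// frame, big-endian (network byte order). writeUInt32BE is supported on
// all Node versions, unlike writeUIntBE with byteLength 8.
function appendLength64(frame, dataLength) {
  const lenBuf = Buffer.alloc(8);
  lenBuf.writeUInt32BE(Math.floor(dataLength / 0x100000000), 0); // high 32 bits
  lenBuf.writeUInt32BE(dataLength % 0x100000000, 4);             // low 32 bits
  return Buffer.concat([frame, lenBuf]);
}
```
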

@Somebi
Author

Somebi commented Jul 24, 2018

Aha! It seems there is no core support for unsigned 64-bit big-endian integers.
For those who hit the same issue, here is a working package:
https://www.npmjs.com/package/int64-buffer

@mscdex
Contributor

mscdex commented Jul 24, 2018

The errors were introduced in node v10.0.0. For proper 64-bit value support, see #19691.
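On Node versions that shipped the BigInt buffer methods from that PR (writeBigUInt64BE / readBigUInt64BE), full 64-bit values round-trip without precision loss. A minimal sketch, assuming a Node build with those methods available:

```javascript
// Write and read back the largest unsigned 64-bit value using the
// BigInt-based buffer methods; no precision is lost.
const buf = Buffer.alloc(8);
buf.writeBigUInt64BE(0xFFFFFFFFFFFFFFFFn, 0); // largest uint64
console.log(buf.readBigUInt64BE(0));          // 18446744073709551615n
```
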

@ChALkeR
Member

ChALkeR commented Jul 24, 2018

That code indeed worked fine before v10.0.0, and apparently supported all safe uint values:

$ ~/tmp/nodejs/node-v6.14.1-linux-x64/bin/node 
> Number.MAX_SAFE_INTEGER
9007199254740991
> x = Buffer.alloc(8, 0x42); x
<Buffer 42 42 42 42 42 42 42 42>
> x.writeUIntBE(Number.MAX_SAFE_INTEGER, 0, 8); x
<Buffer 00 1f ff ff ff ff ff ff>
> x.readUIntBE(0, 8);
9007199254740991

The docs stated «Must satisfy: 0 < byteLength <= 6» though.

@vsemozhetbyt vsemozhetbyt added the buffer Issues and PRs related to the buffer subsystem. label Jul 24, 2018
@BridgeAR
Member

The former behavior was somewhat undefined and risky, since almost all values above Number.MAX_SAFE_INTEGER would have been wrong. Node.js will hopefully support BigInt soon; otherwise, I cannot see what could be improved here.

We could theoretically accept higher byteLength in case the passed in value is <= Number.MAX_SAFE_INTEGER. What do others think about that?
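What that could look like in userland is a reader that accepts 8 bytes but rejects anything a double cannot represent exactly. This is my own sketch (readUInt64BESafe is a made-up helper name, not a core API):

```javascript
// Read 8 big-endian bytes into a plain number, throwing if the value
// exceeds Number.MAX_SAFE_INTEGER (2**53 - 1).
function readUInt64BESafe(buf, offset = 0) {
  const hi = buf.readUInt32BE(offset);
  const lo = buf.readUInt32BE(offset + 4);
  // hi <= 0x1FFFFF guarantees hi * 2**32 + lo <= 2**53 - 1 exactly.
  if (hi > 0x1fffff) {
    throw new RangeError('value exceeds Number.MAX_SAFE_INTEGER');
  }
  return hi * 0x100000000 + lo;
}
```
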

@ChALkeR
Member

ChALkeR commented Aug 22, 2018

The former behavior was somewhat undefined and risky since almost all values above Number.MAX_SAFE_INTEGER would have been wrong.

Number.MAX_SAFE_INTEGER is 2**53 - 1, but the cap introduced in code in v10.0.0 is 2**48 - 1 (it was present in the documentation before that), which is about 32 times less.

Numbers between 2**48 and 2**53 were not risky, but (undocumented) support for writing them has been removed.

Note: I am not proposing to change that (yet); I have not thought about it much. I am just outlining things.

hkniberg added a commit to sveasmart/modbusmeter that referenced this issue Oct 14, 2018
…curs in node10.

Actually it was a problem from the beginning; node 10 is just pickier about it.
nodejs/node#21956
@apapirovski
Member

Given that this is documented, it seems reasonable to me to close this out. Feel free to reopen if you disagree, though I think that should be accompanied by a PR.

@leimao

leimao commented Dec 8, 2018

Is there any solution if I want to read binary 64-bit integers, e.g. readIntBE(0, 8)?

@mscdex
Contributor

mscdex commented Dec 8, 2018

@leimao You could do:

BigInt(`0x${buf.toString('hex', 0, 8)}`);

That's one of the fastest methods I found a while back. Just be aware of possible endianness issues.
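If the bytes are little-endian, one way to adapt the hex trick above is to reverse a copy of the buffer first. A sketch (toBigUInt64 is a made-up helper):

```javascript
// Convert 8 bytes of a buffer to a BigInt via its hex representation.
// For little-endian input, reverse a copy so the hex string reads
// most-significant byte first.
function toBigUInt64(buf, littleEndian = false) {
  const bytes = littleEndian ? Buffer.from(buf.subarray(0, 8)).reverse() : buf;
  return BigInt(`0x${bytes.toString('hex', 0, 8)}`);
}
```
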

@leimao

leimao commented Dec 8, 2018

Hello mscdex,

Thanks for the quick response. The binary 64-bit integers I am reading come from HBase. I also assume the integers I get from HBase will not exceed the integer precision JavaScript can represent safely. At first I thought I could do

buff.readIntBE(2, 6)

But later I realized that this might only work for non-negative integers. I then found a package, node-int64, which might be very useful for achieving my goal.

var Int64 = require('node-int64');
var int64 = new Int64(buff);
var num = int64.toNumber(true);

If you have any further suggestions or recommendations, please let me know.

PS: I think your solution is more compatible with future versions of Node.js, since it does not rely on external libraries and the value's precision is not limited by the node-int64 conversion.
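A dependency-free alternative for the signed case is also possible: read the high word signed and the low word unsigned, then combine. This is my own sketch (readInt64BE is a hypothetical helper), exact only while |value| < 2**53:

```javascript
// Read a signed 64-bit big-endian value as a plain number. The sign
// lives in the high 32 bits, so read that word signed; the low word
// is always unsigned.
function readInt64BE(buf, offset = 0) {
  const hi = buf.readInt32BE(offset);      // signed high 32 bits
  const lo = buf.readUInt32BE(offset + 4); // unsigned low 32 bits
  return hi * 0x100000000 + lo;
}
```
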

Best,

Lei

@loretoparisi

@leimao You could do:

BigInt(`0x${buf.toString('hex', 0, 8)}`);

That's one of the fastest methods I found a while back. Just be aware of possible endianness issues.

@mscdex Not sure how to apply your solution in my case:

    const extract = tar.extract();
    const gunzip = zlib.createGunzip();

    var chunks = [];
    extract.on('entry', function (header, stream, next) {
        stream.on('data', function (chunk) {
            chunks.push(chunk);
        });
        stream.on('end', function () {
            next();
        });
        stream.resume();
    });
    extract.on('finish', async function () {
        if (chunks.length) {
            var data = Buffer.concat(chunks);
            await writeFile(destPath, data);
            consoleLogger.info("wrote %s", destPath);
        }
    })
        .on('error', (error) => {
            consoleLogger.warn("gunzip error:%@", error.toString());
        })

    fs.createReadStream(tmpPath)
        .pipe(gunzip)
        .pipe(extract)

Here I get:

RangeError [ERR_OUT_OF_RANGE]: The value of "length" is out of range. It must be >= 0 && <= 2147483647. Received 3179560960

I assume it happens at data = Buffer.concat(chunks);
