
Issue related to server side deserialization in protobufJS #192

Closed
barbareek opened this issue Oct 5, 2014 · 6 comments

@barbareek

In my proto file I have an int32 message field. I encode the message and send it to a Java client, where I decode this value. Values are decoded properly up to 126, but beyond that, e.g. for a value of 127, the corresponding decoded value is 253.

@dcodeIO
Member

dcodeIO commented Oct 5, 2014

This is probably happening because the data is converted to a string somewhere on its way.

See: https://github.com/dcodeIO/ProtoBuf.js/wiki/How-to-read-binary-data-in-the-browser-or-under-node.js%3F
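A minimal sketch of this failure mode, assuming the raw bytes pass through a Latin-1 string that is later UTF-8 encoded (the byte values and variable names here are illustrative, not taken from the reporter's code):

```javascript
// Raw protobuf bytes: field 1, varint 300 (0x08 0xAC 0x02).
const original = Buffer.from([0x08, 0xac, 0x02]);

// Lossy round-trip: interpret the bytes as a string, then re-encode as UTF-8.
// Every byte above 0x7F becomes a two-byte UTF-8 sequence (0xAC -> 0xC2 0xAC).
const corrupted = Buffer.from(original.toString('latin1'), 'utf8');

console.log(original);  // <Buffer 08 ac 02>
console.log(corrupted); // <Buffer 08 c2 ac 02> - one byte longer, garbage varint
```

Bytes in the ASCII range survive such a round-trip unchanged, which is consistent with small values decoding correctly.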

@barbareek
Author

Well, I have now placed a Node.js server in between to decode the protobuf data sent from the browser (sending the value 300).
In the Node.js server I am able to get 300 (the correct value), but when I send the same stream to the Java client and retrieve the value with the corresponding getter method, I get 381.

Hex value sent from the browser:
08ac02200028003800

Value received at the Node.js server:
Buffer 08 ac 02 20 00 28 00 38 00

Value decoded in the Java TCP client:
381
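For reference, those bytes can be checked by hand: protobuf varints store 7 bits per byte, least-significant group first, with the high bit as a continuation flag. A minimal decoder sketch (an illustrative helper, not part of ProtoBuf.js):

```javascript
// Decode a protobuf varint starting at `offset`; returns the value and the
// offset of the next byte.
function readVarint(buf, offset) {
  let result = 0, shift = 0, pos = offset;
  for (;;) {
    const b = buf[pos++];
    result |= (b & 0x7f) << shift;   // low 7 bits carry data
    if ((b & 0x80) === 0) break;     // high bit clear: last byte
    shift += 7;
  }
  return { value: result, next: pos };
}

const buf = Buffer.from('08ac02200028003800', 'hex');
const tag = readVarint(buf, 0);           // 0x08 -> field 1, wire type 0 (varint)
const field1 = readVarint(buf, tag.next); // 0xAC 0x02 -> 44 + (2 << 7) = 300
console.log(field1.value); // 300
```

So the bytes leaving the browser and arriving at Node.js really do encode 300; the corruption must happen later in the pipeline.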

@dcodeIO
Member

dcodeIO commented Oct 5, 2014

Could you put some code examples together for me that contain the following?

  • The full process from encoding the message and sending it to a socket on the JavaScript side
  • The full process from receiving the message and decoding it on the Java side

As all values up to 127 seem to work properly, I assume there is some sort of string conversion in between (a common cause of such issues) that corrupts non-ASCII values (> 127). Hence the reference to the binary FAQ. It's essential to prevent unchecked string conversions from occurring or, if they can't be prevented, to use base64 encoding, for example.
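As a sketch of the base64 option (variable names assumed): base64 output is pure ASCII, so it survives any string handling between the endpoints.

```javascript
// The raw message bytes, e.g. as produced by message.encode().toBuffer().
const payload = Buffer.from([0x08, 0xac, 0x02]);

// ASCII-safe representation for transport through form fields, JSON, etc.
const wire = payload.toString('base64');
console.log(wire); // "CKwC"

// Receiver: decode back to the original bytes before parsing the message.
const restored = Buffer.from(wire, 'base64');
console.log(restored.equals(payload)); // true
```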

@barbareek
Author

var builder = dcodeIO.ProtoBuf.loadJsonFile('json/test.json');
var protocol = builder.build("protocol");
var MessageType = protocol.messageName;
var wrapper = new MessageType(test);

// ajax request to server
$.ajax({
    url: 'api/GPBSend',
    type: 'POST',
    data: {
        selectedType: 'Wrapper',
        jsonData: JSON.stringify(test),
        amg_serial: container.find('.amg_serial')[0].value,
        buffer: wrapper.encodeHex() // hex-encode the message for transport
    },
    success: function(data, textStatus, jqXHR) {
        Util.showMessage('UI_Container', "GPB Sent successfully", 'alert-success');
    },
    error: function(jqXHR, textStatus, errorThrown) {
        Util.showMessage('UI_Container', jqXHR.responseJSON.response, 'alert-danger');
    }
});

// The above request is processed by node.js:

...
queryObj = queryString.parse(body),
...
buffer1 = queryObj.buffer;
var protoBufferSent = new Buffer(buffer1, 'hex'); // Is this the issue? Please suggest.
...
// Send the request to the Java TCP client.
// The 4-byte length of the protobuf message is prepended as well.
arrayOfBytes.push(getByteArray(protoBufferSent.length, 4));
arrayOfBytes.push(protoBufferSent);
console.log("connection established ......", Buffer.concat(arrayOfBytes));
srvSocket.write(Buffer.concat(arrayOfBytes));
srvSocket.end();
// On the Java side:

// ... get the protobuf byte[] chunk (protobyte) from the stream

Message v = Message.parseFrom(protobyte);
System.out.println("message id that should be 300: " + v.getMessageId()); // but 381 is printed
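The length-prefix step above can also be sketched with Node's own Buffer API, assuming the Java side reads the length as a big-endian 4-byte integer (as DataInputStream.readInt() would); getByteArray is the reporter's helper and is replaced here by writeUInt32BE:

```javascript
// Prepend a 4-byte big-endian length header to the protobuf payload.
function frame(payload) {
  const header = Buffer.alloc(4);
  header.writeUInt32BE(payload.length, 0);
  return Buffer.concat([header, payload]);
}

const payload = Buffer.from('08ac02200028003800', 'hex');
const framed = frame(payload);
console.log(framed.length); // 13: 4-byte header + 9 payload bytes
// srvSocket.write(framed);
```

If a custom helper emits the length in a different byte order than the Java side expects, the frame boundary is mis-read, which is another place such corruption can creep in.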

@dcodeIO
Member

dcodeIO commented Oct 5, 2014

Well, this looks OK so far, but unfortunately I have no simple solution for you. It looks like something else is going wrong, and the only way to figure this out for sure is to inspect the contents of the actual byte arrays between the several conversion steps. Since node.js decodes it properly, I'd assume the cause is located somewhere near the Java side of things.

See: How to validate / decode / reverse engineer a protobuf buffer by hand?

@barbareek
Author

Got it! I think the best way to send the data over TCP is message.encode().toBuffer(). Now I am getting the values appropriately.
