
Runtime parsing speed #26

Open
madasebrof opened this issue Oct 29, 2021 · 0 comments

Comments

@madasebrof

Hi! Per #14, I'm creating a new issue.

Here's a quick example.

import protobuf
import std/streams
import std/monotimes
import std/strformat
import std/random

# Define our protobuf specification and generate Nim code to use it
const protoSpec = """
syntax = "proto3";

message ExampleMessage {
  int32 number = 1;
  string text = 2;
  bytes pixels = 3;
}
"""
parseProto(protoSpec)

# Create our message
var 
  msg = new ExampleMessage
  pixels: seq[uint8]
msg.number = 10
msg.text = "Hello world"
let bufferSize = 512 * 512
for i in 0..<bufferSize:
  pixels.add((uint8)rand(0..255)) # B channel (0)
  pixels.add((uint8)rand(0..255)) # G channel (1)
  pixels.add((uint8)rand(0..255)) # R channel (2)
  pixels.add((uint8)rand(0..255)) # A channel (3)

msg.pixels = pixels

# Write it to a stream
var stream = newStringStream()
stream.write msg

# Read the message from the stream and output the data, if it's all present
stream.setPosition(0)

# start timer
let t0 = getMonoTime().ticks()

var readMsg = stream.readExampleMessage()
if readMsg.has(number, text, pixels):
  echo readMsg.number
  echo readMsg.text
  echo readMsg.pixels.len()

let t1 = (float64)(getMonoTime().ticks() - t0) * 0.000001

echo &"{t1:.2f}ms - time taken to unpack ExampleMessage"

The output is:

10
Hello world
1048576
6.95ms - time taken to unpack ExampleMessage

I ended up switching to a pipeline of Zstandard compression -> base64 encoding -> JSON. It takes about 0.9ms to decode on the client, but I switched mainly because the other end of the protobuf link (in C++) was so twitchy, and it's much easier to inspect the output as text than to debug a binary format with a hex editor!
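The pipeline described above can be sketched like this (shown in Python for brevity; zlib stands in for Zstandard, since real zstd needs a third-party binding, and all field names here are illustrative):

```python
import base64
import json
import zlib  # stand-in for Zstandard compression

pixels = bytes(range(256)) * 16  # placeholder for the raw pixel buffer

# Encode: compress -> base64 -> wrap in a JSON message
payload = json.dumps({
    "number": 10,
    "text": "Hello world",
    "pixels": base64.b64encode(zlib.compress(pixels)).decode("ascii"),
})

# Decode on the client: parse JSON -> base64-decode -> decompress
msg = json.loads(payload)
decoded = zlib.decompress(base64.b64decode(msg["pixels"]))
assert decoded == pixels
```

The JSON wrapper costs some size over raw protobuf, but the payload stays greppable and printable end to end.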

Also, the other issue was that when sending protobuf you have to roll your own network framing (because it's a binary format): take the byte count of the message, cast it to a big-endian uint32, prepend it to the message, then on the client unpack the length and parse the message. With JSON you can just append a \r\L to the string and use recvLine on the client.
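The length-prefix framing described above can be sketched like this (shown in Python for brevity; the `frame`/`unframe` helpers are illustrative, not part of any library):

```python
import struct

def frame(msg: bytes) -> bytes:
    # Prepend the payload length as a big-endian uint32.
    return struct.pack(">I", len(msg)) + msg

def unframe(buf: bytes) -> bytes:
    # Read the 4-byte big-endian length, then slice out the payload.
    (length,) = struct.unpack(">I", buf[:4])
    return buf[4 : 4 + length]

payload = b"\x08\x0a\x12\x0bHello world"  # some serialized message bytes
assert unframe(frame(payload)) == payload
```

In practice the receiver also has to loop on the socket until all `length` bytes have arrived, which is exactly the bookkeeping that recvLine on a JSON string avoids.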

Feel free to close this if it's behaving as expected. I was unable to reproduce the much longer times I posted earlier, and, as I said, I've moved on to JSON, so I'm not going to dig much deeper.

Thanks!

P.S. Amazing use of Nim macros!

@madasebrof madasebrof changed the title Parsing speed Runtime parsing speed Oct 29, 2021