What is a good practice for benchmarking calls to a Tarantool stored procedure? #7506
-
I am new to Tarantool. I benchmarked a Rust project that uses Actix Web, cloning the code from this repo: Link.
The results vary between ~5k req/s and ~40k req/s, which is such a large difference that I can't tell which part the problem comes from. Note: besides Rust, I also tested with Python and got less than ~5k req/s.
Replies: 1 comment 1 reply
-
It seems that the benchmark code just waits for a response most of the time: client.call_fn() sends a request to the network, and .await? waits for the response. That wait is where almost all of the time goes, and during it the benchmark does nothing. (Since I'm not familiar with the programming language, the framework, and the connector you're using, I may completely misunderstand the code.)
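For illustration only, the pattern being described looks roughly like the following minimal Rust sketch (not the benchmark's actual code). It assumes a Tokio runtime, and call_stored_procedure is a hypothetical stand-in for the real connector call (the client.call_fn(...) mentioned above):

```rust
// Hypothetical placeholder for one call to the Tarantool stored procedure;
// the real code would do something like `client.call_fn(...).await` here.
async fn call_stored_procedure(i: usize) -> Result<usize, Box<dyn std::error::Error>> {
    Ok(i)
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    for i in 0..100_000 {
        // Only one request is ever in flight: we wait for each response
        // before the next request is sent, so the task mostly sits idle.
        call_stored_procedure(i).await?;
    }
    Ok(())
}
```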
I can't suggest exactly how you should reorganize the code to use the power of asynchronous interaction with the network, but I can sketch an approach that should give better numbers: send BATCH_SIZE (say, 1000) requests asynchronously and save the future/promise objects; let's name this batch batchN. Then wait for all the responses in batchN before sending the next batch.
It will not utilize the network and CPU ideally, but it is simple and should be much better than the synchronous approach (I mean, waiting for each response individually without any CPU utilization in between). (Maybe the Tokio framework offers something that allows better balancing of network/CPU utilization and more elegant code; I just don't know, sorry.) There is a StackOverflow question where I talk about synchronous and asynchronous processing; it may be useful for understanding those concepts. The idea is simple and mostly the same as pipelining in other computer systems with slow and fast operations.
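A minimal sketch of that batched approach, under the same assumptions as above (a Tokio runtime plus the futures crate; call_stored_procedure, BATCH_SIZE, and TOTAL_REQUESTS are placeholder names rather than the benchmark's real API or parameters):

```rust
use std::time::Instant;

use futures::future::join_all;

const BATCH_SIZE: usize = 1000;        // requests kept in flight per batch
const TOTAL_REQUESTS: usize = 100_000; // total requests for the run

// Hypothetical placeholder for one call to the Tarantool stored procedure.
async fn call_stored_procedure(i: usize) -> Result<usize, Box<dyn std::error::Error>> {
    Ok(i)
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let started = Instant::now();
    let mut sent = 0;

    while sent < TOTAL_REQUESTS {
        let batch_len = BATCH_SIZE.min(TOTAL_REQUESTS - sent);

        // Start BATCH_SIZE requests without awaiting them one by one and
        // collect the futures: this is the "batchN" from the description.
        let batch_n: Vec<_> = (sent..sent + batch_len)
            .map(call_stored_procedure)
            .collect();

        // Await the whole batch at once, so the requests overlap in flight
        // instead of being processed strictly one after another.
        for result in join_all(batch_n).await {
            result?;
        }

        sent += batch_len;
    }

    let elapsed = started.elapsed().as_secs_f64();
    println!("{} requests in {:.2}s (~{:.0} req/s)", sent, elapsed, sent as f64 / elapsed);
    Ok(())
}
```

The key difference from the synchronous loop is that .await now happens once per batch (via join_all) rather than once per request, so up to BATCH_SIZE requests are in flight at the same time; this is the pipelining idea described above.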