bench: instrument hash counts for verifier #19
Conversation
Merge branch 'main' into feat/main_trace_parts
* feat: add backend readme
* fix: link
* fix: link
* fix: latex
* chore: fix typo
* chore: add code references

Co-authored-by: Yi Sun <yi-sun@users.noreply.github.com>
No need for tracing on non-benchmark tests
* fix: `eval_permutation_constraints` must handle partitioned_main
* feat: integration test with cached trace
* fix: forgot to add test files
* chore: rename folder to `partitioned_sum_air`
* chore: add test utils module
* feat: add indexless cached lookup test for partitioned Air with interactions
* chore: fix typo
* feat: `add_main_matrix(width)` placeholder with mat width
* chore: switch from RefCell to Arc Mutex
Some preliminary stats:
To confirm, in case 2, is the logUp still done?
Force-pushed from 4467ed8 to 8163e1c
In both cases, the logUp is exactly the same: same 2 virtual columns, same constraints. The logUp treats the main trace as one big matrix, and doesn't care about the partition.
Great, I think that's the correct relevant comparison.
Uploaded instrumented verifier csv to https://docs.google.com/spreadsheets/d/1AEwbUvsge07qq9ASi0BL4Or0bbuwq8Iy9cy5cNxiLj4/edit?usp=sharing
* fix: p3-maybe-rayon `parallel` feature was not on (feature is `parallel` not `rayon`)
* chore: simplify code
Moved `run_simple_test` into `config::baby_bear_poseidon2` because it depends on the engine
@OsamaAlkhodairy FYI I'm going to merge this one. I added some stuff to test-utils/src/config to start some tooling. I added a StarkEngine trait with the idea we can make a different engine for each field x hash combo. I moved your `run_simple_test`.
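To illustrate the "one engine per field x hash combo" idea, here is a minimal sketch of what such a trait could look like. The names `StarkEngine`, `BabyBearPoseidon2Engine`, and the associated `Config` type are hypothetical stand-ins for illustration, not the repo's actual API:

```rust
// Hypothetical sketch: an engine trait bundling the STARK config for one
// (field, hash) combination, so test helpers can be generic over engines.
trait StarkEngine {
    // Placeholder for the concrete STARK config type this engine provides.
    type Config;
    fn config(&self) -> &Self::Config;
    fn name(&self) -> &'static str;
}

// One engine per field x hash combo, e.g. BabyBear with Poseidon2.
struct BabyBearPoseidon2Engine {
    config: (), // real engine would hold the actual PCS/hash config
}

impl StarkEngine for BabyBearPoseidon2Engine {
    type Config = ();
    fn config(&self) -> &Self::Config {
        &self.config
    }
    fn name(&self) -> &'static str {
        "BabyBear x Poseidon2"
    }
}

fn main() {
    let engine = BabyBearPoseidon2Engine { config: () };
    println!("engine: {}", engine.name());
}
```

A helper like `run_simple_test` could then take `&impl StarkEngine` instead of being tied to one config module.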
* wip: refactor to use commit pointers
* wip: permutation trace from partitioned main
* wip: fix previous rebase
* wip: show new interface
* chore: rename ChipsetProver to MultiTraceStarkProver
* feat: quotient refactor done
* feat: prover done
* chore: split out prove function into general post-trace part
* feat: finished verifier
* feat: keygen builder
* chore: move fib triple test into same file
* feat: preliminary docs on STARK backend scope/support (#16)
* feat: add backend readme
* fix: link
* fix: link
* fix: latex
* chore: fix typo
* chore: add code references
* chore: make DebugBuilder `after_challenge` consistent with other builders
* chore: clean up if statement
* chore: fix fmt
* chore: fix previous merge and remove tracing (no need for tracing on non-benchmark tests)
* chore: fix lint
* feat: Partitioned Air Builder (#17)
* fix: `eval_permutation_constraints` must handle partitioned_main
* feat: integration test with cached trace
* fix: forgot to add test files
* chore: rename folder to `partitioned_sum_air`
* chore: add test utils module
* feat: add indexless cached lookup test for partitioned Air with interactions
* chore: fix typo
* feat: `add_main_matrix(width)` placeholder with mat width
* chore: switch from RefCell to Arc Mutex
* chore: address review comment
* chore: rename `MatrixCommitmentPointers`
* feat: add instrumented hash/compress
* chore: return instrument counters since fields are private
* feat: prelim verifier instrumentation
* feat: full benchmark writing to csv
* chore: turn off instrumenting during proving
* fix: instrument not properly on
* fix: flush writer
* chore: clean up config module
* chore(chips): remove old test config module
* feat: add `StarkEngine` for convenience
* feat: add cached_lookup prover benchmark
* fix: p3-maybe-rayon `parallel` feature was not on (#28) (feature is `parallel` not `rayon`)
* chore: simplify code

Co-authored-by: Yi Sun <yi-sun@users.noreply.github.com>
I was able to find a way to do this without forking Plonky3. The two traits that need to be implemented for PCS hashes are `PseudoCompressionFunction` and `CryptographicHasher`. I make a wrapper struct `Instrumented` that does the normal operation and also updates the counter. For logging / debugging purposes, I store the lengths of inputs by the input type using `type_name`. To translate into actual counts, some specific logic about the absorb rate, etc. of the specific hash (currently Poseidon2) is used.

Notes:

* `CryptographicHasher` is for flat hashing long arrays of inputs. I think this is mostly used on the Merkle leaves.
* `PseudoCompressionFunction` is what is used on the other layers of the Merkle tree, since it's designed so 2-to-1 hashes have the same input and output shape.

What is currently benchmarked: given a matrix with `height` rows and `field_width + 1` columns, we make two AIRs that look up the `field_width` data columns with the column 0 value as the count. The `height`-by-`field_width` matrix is a cached trace, so it is committed to separately. The count matrix that is `height`-by-`1` is a separate matrix that is committed. We run both of these and log the hashes used, where we count based on the number of Poseidon2 permutations done.