chore(nargo)!: use faster hash function for checking preprocessed keys #1094

Merged · 3 commits · Apr 5, 2023

Conversation

TomAFrench
Member

Related issue(s)

Resolves #870

Description

Summary of changes

We now use the faster checksum_constraint_system function over hash_constraint_system when preprocessing programs to generate their keys.

One important thing to note here is that this means we'll be using a different hash function when saving programs to disk than the one we'll be using for contract functions in the A3 context. We'll need to ensure that we keep these separate. As part of that, I have updated any references to "hashes" to "checksums".
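As a purely illustrative sketch (not the actual nargo code, and with stand-in names), the "checksum instead of cryptographic hash" idea amounts to something like this:

```rust
// Minimal sketch: store a cheap checksum of the serialized circuit alongside
// the preprocessed keys, then recompute and compare before trusting keys on
// disk. FNV-1a here is only a stand-in for whatever `checksum_constraint_system`
// actually computes.
fn checksum(bytes: &[u8]) -> u64 {
    // FNV-1a: fast and non-cryptographic -- fine for "has this file changed?",
    // unsuitable anywhere collision resistance matters.
    let mut hash: u64 = 0xcbf2_9ce4_8422_2325;
    for &b in bytes {
        hash ^= u64::from(b);
        hash = hash.wrapping_mul(0x0000_0100_0000_01b3);
    }
    hash
}

fn keys_are_fresh(serialized_circuit: &[u8], stored_checksum: u64) -> bool {
    checksum(serialized_circuit) == stored_checksum
}

fn main() {
    let circuit = b"serialized ACIR for the compiled program";
    let stored = checksum(circuit); // written to disk at preprocessing time
    assert!(keys_are_fresh(circuit, stored));
    assert!(!keys_are_fresh(b"a different circuit", stored));
}
```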

Dependency additions / changes

Test additions / changes

Checklist

  • I have tested the changes locally.
  • I have formatted the changes with Prettier and/or cargo fmt with default settings.
  • I have linked this PR to the issue(s) that it resolves.
  • I have reviewed the changes on GitHub, line by line.
  • I have ensured all changes are covered in the description.

Documentation needs

  • This PR requires documentation updates when merged.

Additional context

@TomAFrench TomAFrench changed the title chore: use faster hash function for checking keys chore: use faster hash function for checking preprocessed keys Apr 4, 2023
kevaundray previously approved these changes Apr 4, 2023
@TomAFrench
Member Author

Let's merge this after #1056

@kevaundray
Contributor

@TomAFrench Isn't this a breaking change?

@TomAFrench
Member Author

Maybe? Depends on your definition of breaking. If someone had existing keys and updated to use this then nargo would stop using those keys and preprocess new keys on the fly.

@kevaundray
Contributor

Maybe? Depends on your definition of breaking. If someone had existing keys and updated to use this then nargo would stop using those keys and preprocess new keys on the fly.

Hmm, good point, let's leave it as non-breaking since it's still compatible with older nargo versions

@TomAFrench
Member Author

Actually, I tell a lie. Nargo will throw an error if the hashes don't match. It's only if you don't pass a circuit name that it will generate ephemeral keys.
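A rough sketch of the control flow being described here, using hypothetical names rather than nargo's actual types:

```rust
// Hypothetical stand-ins for nargo internals, only to illustrate the branching.
fn checksum(bytes: &[u8]) -> u64 {
    bytes.iter().map(|&b| u64::from(b)).sum()
}
fn preprocess(circuit: &[u8]) -> Vec<u8> {
    circuit.to_vec()
}
fn read_keys_from_disk(_name: &str) -> Vec<u8> {
    Vec::new()
}

enum Keys {
    FromDisk(Vec<u8>),
    Ephemeral(Vec<u8>),
}

fn load_or_generate_keys(
    circuit_name: Option<&str>,
    circuit: &[u8],
    stored_checksum: Option<u64>,
) -> Result<Keys, String> {
    match circuit_name {
        // No circuit name given: keys are generated on the fly and discarded.
        None => Ok(Keys::Ephemeral(preprocess(circuit))),
        // A circuit name was given: the stored keys must match the current
        // circuit, otherwise we error instead of silently regenerating them.
        Some(name) => match stored_checksum {
            Some(expected) if checksum(circuit) == expected => {
                Ok(Keys::FromDisk(read_keys_from_disk(name)))
            }
            Some(_) => Err(format!("stale preprocessed keys for `{name}`")),
            None => Err(format!("no preprocessed keys found for `{name}`")),
        },
    }
}

fn main() {
    let circuit = b"serialized ACIR";
    assert!(matches!(
        load_or_generate_keys(None, circuit, None),
        Ok(Keys::Ephemeral(_))
    ));
    assert!(load_or_generate_keys(Some("main"), circuit, Some(0)).is_err());
}
```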

@TomAFrench TomAFrench changed the title chore: use faster hash function for checking preprocessed keys chore!: use faster hash function for checking preprocessed keys Apr 4, 2023
@TomAFrench
Member Author

Making this PR breaking as the next release is a breaking release anyway.

@TomAFrench TomAFrench changed the title chore!: use faster hash function for checking preprocessed keys chore(nargo)!: use faster hash function for checking preprocessed keys Apr 4, 2023
@TomAFrench
Member Author

Hmm, I'm thinking that in combination with #1056, this may be more trouble than it's worth. If we're compiling a contract, we'll want to compute a sha256 hash to insert into the contract artifact, so it doesn't really make sense to compute another one to save to disk as a checksum.

We would then only benefit from this faster hash function for Noir programs, and we'd need a parallel preprocessing flow just for this particular case (we can't reuse the PreprocessedData struct that we currently have).

I think a better route for us would be to push more heavily for the main proving/verifying flow to compile Noir into a single artifact file containing the ABI, circuit, pk, and vk, and to read from this file as much as possible.
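A hypothetical layout for such a file, just to make the shape concrete (none of these field names come from nargo):

```rust
// Hypothetical single "source of truth" artifact; proving and verifying would
// both read from this one file instead of juggling separately stored pieces.
struct ProgramArtifact {
    abi: String,              // parameter layout used to encode prover inputs
    circuit: Vec<u8>,         // serialized ACIR
    proving_key: Vec<u8>,
    verification_key: Vec<u8>,
    checksum: u64,            // checksum of `circuit`, to detect recompiles
}

fn main() {
    let _artifact = ProgramArtifact {
        abi: String::from("{\"parameters\": []}"),
        circuit: Vec::new(),
        proving_key: Vec::new(),
        verification_key: Vec::new(),
        checksum: 0,
    };
}
```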

@TomAFrench
Member Author

Scratch that, I thought we were including the ACIR hash in the contract artifact.

* master:
  feat: Changes serialization for contract functions (#1056)
  chore: move `allow(dead_code)` to only cover `CyclicDependenciesError` (#1093)
@TomAFrench
Member Author

That said, I do think it's worth us pushing for a greater focus on having a single artifact as source of truth for programs as we do for contracts.

@kevaundray
Contributor

That said, I do think it's worth us pushing for a greater focus on having a single artifact as source of truth for programs as we do for contracts.

Fair point -- Can you open an issue for this? The prover key can get very large, so I don't know if we want to embed it in the ABI like we do for contracts (possibly?)

@kevaundray
Contributor

For the ACIR hash, just for reference, it's used only to check if the user code has changed -- we could have chosen a hash of the Noir source code or the AST, but ACIR was sort of the most convenient object to use.

@TomAFrench
Member Author

TomAFrench commented Apr 5, 2023

For the ACIR hash, just for reference, it's used only to check if the user code has changed.

Currently yes, but I'm thinking about how we're going to be embedding the ACIR hash into the contract tree in A3. This means that in some situations we're not free to switch out the hash function for a different one (or at least it won't make sense to, as we'll be using sha256 as well anyway).

Can you open an issue for this? The prover key can get very large, so I don't know if we want to embed it in the ABI like we do for contracts (possibly?)

Yep, can do. My point is that we should aim for a generic artifact with "all the stuff" and then have simple tools to break out application-specific subsets of it, e.g. Nargo shouldn't target A3's contract artifact format directly, but it should include enough information in its contract output to easily generate the A3 contract artifacts (by selecting a subset of fields or computing values based on the artifact fields). This helps avoid situations like the one above with hashes, which have a special meaning outside of Noir.
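A small sketch of that "subset of a generic artifact" idea; every type name here is hypothetical and the sha256 call is stubbed out:

```rust
// Hypothetical types: a generic artifact produced by the compiler, and an
// application-specific (A3-style) artifact derived from it by selecting
// fields and computing derived values.
struct GenericContractArtifact {
    name: String,
    abi: String,
    bytecode: Vec<u8>,          // serialized ACIR
    verification_key: Vec<u8>,
}

struct A3ContractArtifact {
    name: String,
    abi: String,
    bytecode: Vec<u8>,
    acir_hash: [u8; 32],        // sha256 of `bytecode`, computed on demand
}

// Stand-in; a real implementation would call an actual sha256 routine.
fn sha256(_bytes: &[u8]) -> [u8; 32] {
    [0u8; 32]
}

fn to_a3_artifact(generic: &GenericContractArtifact) -> A3ContractArtifact {
    A3ContractArtifact {
        name: generic.name.clone(),
        abi: generic.abi.clone(),
        bytecode: generic.bytecode.clone(),
        acir_hash: sha256(&generic.bytecode),
    }
}

fn main() {
    let generic = GenericContractArtifact {
        name: "Token".into(),
        abi: String::new(),
        bytecode: vec![1, 2, 3],
        verification_key: Vec::new(),
    };
    let _a3 = to_a3_artifact(&generic);
}
```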

Small terminology note: I don't think that we should refer to the structure added in #1056 as the "contract ABI" but rather the "contract artifact". This matches up better with Solidity (an ABI doesn't include bytecode, etc.) and things can get confusing otherwise.

@kevaundray
Contributor

Currently yes, but I'm thinking about how we're going to be embedding the ACIR hash into the contract tree in A3. This means that in some situations we're not free to switch out the hash function for a different one (or at least it won't make sense to, as we'll be using sha256 as well anyway).

Hmm, they are for two different purposes -- one is for checksumming the program, so the compiler can see whether we need to recompile, and the other is for usage in an application.

Small terminology note: I don't think that we should refer to the structure added in #1056 as the "contract ABI" but rather the "contract artifact". This matches up better with Solidity (an ABI doesn't include bytecode, etc.) and things can get confusing otherwise.

Thanks for the correction good ser :)

@TomAFrench
Member Author

Added an issue here #1102

one is for checksumming the program, so the compiler can see whether we need to recompile, and the other is for usage in an application

This is kind of my point in saying that we need a generic artifact. We shouldn't need to be concerned with what hash function Aztec 3 uses, but it's not unlikely that the A3 artifact will contain the ACIR hash (sha256), which means we'll end up with multiple different hashes of the same data floating around for no reason.

@TomAFrench
Member Author

@kevaundray This PR should be gtg now

@kevaundray kevaundray added this pull request to the merge queue Apr 5, 2023
Merged via the queue into master with commit a69758c Apr 5, 2023
@kevaundray kevaundray deleted the fast-checksum branch April 5, 2023 19:05