[DRAFT FOR COMMENTS] move: compile deps with prior compiler binary versions #15245
Conversation
let (deps_for_current_compiler, deps_for_prior_compiler) =
    partition_deps_by_toolchain(deps_package_paths.clone())?;

let flags = resolution_graph.build_options.compiler_flags();
// invoke the compiler
- let mut paths = deps_package_paths.clone();
+ let mut paths = deps_for_current_compiler;
paths.push(sources_package_paths.clone());

- let compiler = Compiler::from_package_paths(paths, vec![])
+ let compiler = Compiler::from_package_paths(paths, deps_for_prior_compiler)
Request for input: this is the "main" change. In effect, we compile deps whose `Move.lock` specifies a prior compiler version other than the one we're on. For those deps, we pass their package paths in the second argument to `from_package_paths`, which is one route I found that appears to work well and completes the build.
One change I observe in the `build` directory output, though: prior to this change, the `build` directory would include dependency `.mv` files in `build/dependencies`. With this change, `.mv` files of dependencies compiled with a prior compiler do not end up in `build/dependencies` (and I'm not sure they need to?).
Is it possible to test this change? That is, add a new Move project that uses Move 2024, but relies on a Move Legacy version of code that is already published? This would allow us to ensure the change supports on-chain verification while building correctly.
Also, I think moving the `.mv` files to `build/dependencies/` is a good and worthwhile goal if we can. It will allow us to cache dependencies by-version, etc., in the future.
> Is it possible to test this change? That is, add a new Move project that uses Move 2024, but relies on a Move Legacy version of code that is already published? This would allow us to ensure the change supports on-chain verification while building correctly.
I tested locally to get a successful build, and will check on-chain verification with a proper PR. This PR was meant to get a sense of how to do the build, which I think is ~mostly clear to me now.
> Also, I think moving the `.mv` files to `build/dependencies/` is a good and worthwhile goal if we can. It will allow us to cache dependencies by-version, etc., in the future.
Right, so this is the kind of thing I was a bit unsure about re: how the internals work when `build/dependencies` exists. I.e., if we copy/move `.mv` files to `build/dependencies`, would caching "just work"?
let dest_binary_os = OsStr::new(dest_binary.as_path());

if !dest_binary.exists() {
    // Download if binary does not exist.
I'm just using `curl`/`tar` to test this end-to-end; it can and will probably be replaced with native Rust `reqwest`, etc.
    OsStr::new(root.as_path()),
])
.output()
.expect("failed to build package");
`.expect` is just a shortcut for end-to-end testing; will format proper errors after comments.
if !lock_file.exists() {
    // Q: Behavior to pick when package has no lock file. Right now we choose the current compiler (we could use the "legacy" one instead).
    deps_for_current_compiler.push(dep);
    continue;
}
Implementation detail: which compiler version makes sense to pick when there is no lock file (which may be the case for existing packages). My intuition is we go with the latest / current one.
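To make the fallback discussed here concrete, a minimal sketch of the decision (function name and signature are hypothetical; the real code parses the lock file's `[move.toolchain-version]` section, which this sketch takes as an already-parsed `Option`):

```rust
use std::path::Path;

/// Hypothetical sketch: decide whether a dep should be compiled with the
/// current compiler. A missing lock file, or a lock file without
/// toolchain-version info, falls back to the current (latest) compiler.
fn compile_with_current(
    lock_file: &Path,
    recorded_version: Option<&str>,
    current_version: &str,
) -> bool {
    if !lock_file.exists() {
        // No lock file: default to the current compiler.
        return true;
    }
    match recorded_version {
        // No [move.toolchain-version] info: same fallback.
        None => true,
        // Otherwise, only use the current compiler on an exact match.
        Some(v) => v == current_version,
    }
}
```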
I think that makes sense? Though I don't fully understand under what situations we don't have a lock file
I think any time we don't have a lock file, we should generate one? And if that is the case, yes, we should use the latest.
Yep, we generate lock files if they don't exist.
> situations we don't have a lock file

People might not check it in to a repo.
None => {
    // Q: Behavior to pick when package has no [move.toolchain-version] info. Right now we choose the current compiler (we could use the "legacy" one instead).
    deps_for_current_compiler.push(dep)
}
Implementation detail: ditto for when there's no toolchain version info.
Can we pin this to whichever lock-file version added this feature (which I assume is this one)?
}
Some(ToolchainVersion {
    compiler_version, ..
}) if compiler_version == env!("CARGO_PKG_VERSION") => {
Do we want to parse or format these at all? I suppose it is not critical, but long-term we might need to if the format changes here.
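As one way to make that comparison less format-sensitive, a stdlib-only sketch (hypothetical helper, not in the PR) that parses a `major.minor.patch` string into a comparable tuple instead of relying on exact string equality:

```rust
/// Hypothetical sketch: parse "major.minor.patch" into a tuple that can be
/// compared numerically, rather than comparing raw version strings.
fn parse_version(v: &str) -> Option<(u64, u64, u64)> {
    let mut parts = v.split('.');
    let major = parts.next()?.parse().ok()?;
    let minor = parts.next()?.parse().ok()?;
    let patch = parts.next()?.parse().ok()?;
    // Reject extra components like "1.2.3.4".
    if parts.next().is_some() {
        return None;
    }
    Some((major, minor, patch))
}
```

Tuple comparison then orders versions numerically (`1.2.x` before `1.10.x`), which plain string comparison gets wrong.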
// Compile and mark that we compiled this dep with a prior compiler.
download_and_compile(root, toolchain_version)?;
deps_for_prior_compiler.push(dep)
Should we make this a map? Like collect all deps for version X and compile them all at once.
Or is it not worth it?
Echoing this suggestion: assemble each dep by what version it wants, as `Map<Version, Vec<Dep>>` or similar. While it likely isn't necessary for this logic to work, it may allow future extensions (blacklisting versions, allowing users to filter versions, etc.)
Noted! I think it makes sense to keep a `Map` that groups by version, but for V0 of the implementation, in the interest of speed, I'll probably leave "compile all at once by version" on the table for a second iteration.
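The grouping being discussed is straightforward to sketch (types are illustrative stand-ins for the PR's actual dep/version representations):

```rust
use std::collections::BTreeMap;

/// Hypothetical sketch of the Map<Version, Vec<Dep>> suggestion: group
/// dependency names by the compiler version recorded in their lock files,
/// so each version's deps could later be compiled in one invocation.
fn group_by_version(deps: Vec<(String, String)>) -> BTreeMap<String, Vec<String>> {
    let mut by_version: BTreeMap<String, Vec<String>> = BTreeMap::new();
    for (dep, version) in deps {
        by_version.entry(version).or_default().push(dep);
    }
    by_version
}
```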
}
Some(ToolchainVersion {
    compiler_version, ..
}) if compiler_version == env!("CARGO_PKG_VERSION") => {
Nit: bind this constant somewhere.
    flavor,
}: ToolchainVersion,
) -> Result<()> {
    let binaries_path = std::env::var("TOOLCHAIN_BINARIES")?; // E.g., ~/.move/binaries
Nit: bind this constant somewhere.
if !dest_binary.exists() {
    // Download if binary does not exist.
    let url = format!("https://github.com/MystenLabs/sui/releases/download/mainnet-v{}/sui-mainnet-v{}-{}.tgz", compiler_version, compiler_version, platform);
We should probably extract this and the path information out into one place.
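One possible shape for that extraction, sketched with the URL pattern from the diff above (the template/helper names are hypothetical):

```rust
/// Hypothetical sketch: bind the release-URL format in one named place
/// instead of an inline format! call. Placeholders {V} and {PLATFORM} are
/// substituted below.
const RELEASE_URL_TEMPLATE: &str =
    "https://github.com/MystenLabs/sui/releases/download/mainnet-v{V}/sui-mainnet-v{V}-{PLATFORM}.tgz";

fn release_url(compiler_version: &str, platform: &str) -> String {
    RELEASE_URL_TEMPLATE
        .replace("{V}", compiler_version)
        .replace("{PLATFORM}", platform)
}
```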
}

println!(
    "[+] sui move build --default-move-edition {} --default-move-flavor {} -p {}",
I worry about these flags here just in case we accidentally break them. I mean... we shouldn't but we might. Thoughts?
IMO flag shifting is a real possibility. Perhaps a file that defines invocations per-version?
We'll need a (different?) way to build if we can't rely on flags; I'm not aware of another way right now if we're invoking the binaries. I'd love it if we can have a high degree of confidence about flags, at least for the medium to long term (6+ months?). That way, if they do change, one way to approach this is to add version-detection logic in new releases that cases out and uses the appropriate flags. Basically, build it into this logic.

> Perhaps a file that defines invocations per-version?

Proposal for what this would look like? (Will this mean another file users need to check in, and do we really want that?)
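For the sake of discussion, one entirely hypothetical shape such a per-version invocation file could take (not part of this PR; all keys and values are made up for illustration):

```toml
# Hypothetical per-version invocation table: maps a released compiler
# version to the CLI flags its `sui move build` understands, with
# placeholders filled in at invocation time.
["1.15.0"]
build = ["move", "build", "--default-move-edition", "{edition}", "--default-move-flavor", "{flavor}", "-p", "{path}"]

["1.9.0"]
build = ["move", "build", "-p", "{path}"]
```

Whether this ships with the tool (so users never check it in) or lives in the repo is exactly the open question above.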
@@ -756,3 +765,123 @@ pub(crate) fn make_source_and_deps_for_compiler(
    };
    Ok((source_package_paths, deps_package_paths))
}

/// partitions `deps` by whether we need to compile dependent packages with a
Can you explain if we handle this case?

- `A` requires `B` at version `0.1.0`
- `A` requires `C` at version `0.2.0`
- `B` requires `C` at version `0.1.0`

If not, I do think we need a map of versions to modules to compile, so any individual module can find its dependencies at the right version.
Based on how we currently resolve dependencies, I'm not sure we can / want to / need to support this case. If packages in the dependency tree require `C` at different versions (more plainly, if different `C` dependencies are specified in `Move.toml`s based on package name), we detect a conflict. So my understanding, based on how we do resolution (which is not entirely complete :P), is that if `C` exists in the dependency graph it can only be one version.
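To illustrate the conflict case being described, a sketch of the two manifests involved (package names, URL, and revisions are all hypothetical):

```toml
# A/Move.toml -- A wants C at v0.2.0
[dependencies]
C = { git = "https://example.com/c.git", rev = "v0.2.0" }

# B/Move.toml -- B wants C at v0.1.0; resolution reports this as a
# conflict rather than compiling two versions of C into one graph.
[dependencies]
C = { git = "https://example.com/c.git", rev = "v0.1.0" }
```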
This seems like a good initial draft, modulo some concerns about dependency compilation in the limit and generality.
I'd really like to see an actual test on the diff.
Description

We want to enable compiling packages with dependencies that specify prior compiler versions in their `Move.lock` file (with respective `edition` and `flavor` flags). The main thing this does is allow checking byte-for-byte equality of a package build / its dependents against published on-chain code.

Example of the info we store in `Move.lock` (implemented in #15166):

This PR implements an end-to-end package build that will pick up prior `compiler-version`s for dependencies, download a prior Sui binary, compile them, and "add" them to the build. The (most viable?) approach that seems to enable that is adding deps to the logic in `external-crates/move/crates/move-package/src/compilation/compiled_package.rs` after compiling with prior versions.

This is the main question for reviewers: does this approach check out? Does it violate internal assumptions or impact other builds / applications that I may not be aware of?

See inline comments for more.