Upgrade the World #358
base: main
Conversation
Force-pushed 2968699 to 840a385.
Force-pushed 840a385 to 10b193e.
VSS CI failure is unrelated and should be fixed by #357.
Force-pushed 10b193e to 912e657.
…ock must succeed

3ae9ecb test: fix off-by-one in `checkpoint_insert_existing` (valued mammal)
e6aeaea doc(core): document panic for `CheckPoint::insert` (valued mammal)
2970b83 fix(core): calling `CheckPoint::insert` with existing block must succeed (志宇)

Pull request description:

### Description

Previously, we were panicking when the caller tried to insert a block at height 0. However, this should succeed if the block hash does not change.

### Notes to the reviewers

For context:

* lightningdevkit/ldk-node#358
* https://discord.com/channels/753336465005608961/978744259693916230/1283050849429356629

### Changelog notice

* Fix `CheckPoint::insert` to not panic when inserting the existing genesis block (same block height and hash).

### Checklists

#### All Submissions:

* [x] I've signed all my commits
* [x] I followed the [contribution guidelines](https://github.com/bitcoindevkit/bdk/blob/master/CONTRIBUTING.md)
* [x] I ran `cargo fmt` and `cargo clippy` before committing

#### Bugfixes:

~~* [ ] This pull request breaks the existing API~~
* [x] I've added tests to reproduce the issue which are now passing

ACKs for top commit:
  ValuedMammal: Self ACK 3ae9ecb
  notmandatory: ACK 3ae9ecb

Tree-SHA512: 638d8aacac59ad18b5ee369d08f39bb12cc37fa9b3f936404b60dbf44938ef850ca341d00566840b5a77909d31c9291ab56299d986d832005ca96cd91a0b0e55
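The fixed semantics can be illustrated with a small self-contained sketch (toy types, not the actual `bdk_chain` implementation): re-inserting an already-present block succeeds as a no-op when the hash matches, and only a *conflicting* hash at the genesis height is a hard error.

```rust
// Toy model of the `CheckPoint::insert` behavior described above.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct BlockId {
    height: u32,
    hash: [u8; 32],
}

struct CheckPoint {
    blocks: Vec<BlockId>, // sorted by height, genesis first
}

impl CheckPoint {
    fn insert(&mut self, block: BlockId) -> Result<(), String> {
        if let Some(existing) = self.blocks.iter().find(|b| b.height == block.height) {
            if existing.hash == block.hash {
                // Same block already present: succeed (the fixed behavior,
                // which previously panicked for height 0).
                return Ok(());
            }
            if block.height == 0 {
                // A conflicting genesis hash can never be reorged away.
                return Err("cannot replace the genesis block".to_string());
            }
        }
        // Otherwise replace any block at that height and keep order.
        self.blocks.retain(|b| b.height != block.height);
        self.blocks.push(block);
        self.blocks.sort_by_key(|b| b.height);
        Ok(())
    }
}

fn main() {
    let genesis = BlockId { height: 0, hash: [0u8; 32] };
    let mut cp = CheckPoint { blocks: vec![genesis] };
    // Re-inserting the identical genesis block must succeed rather than panic.
    assert!(cp.insert(genesis).is_ok());
    // A conflicting genesis hash is still rejected.
    assert!(cp.insert(BlockId { height: 0, hash: [1u8; 32] }).is_err());
}
```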
Almost through the LDK upgrade commits.
/// transaction fee) this value will be zero. For channels created prior to LDK Node 0.4
/// the channel is always treated as outbound (and thus this value is never zero).
the channel is always treated as outbound (and thus this value is never zero).
Not sure what is meant by this. Is it because LDK Node only allowed outbound channels? If so, why do we need to say this?
No, it has nothing to do with LDK Node per se. This too is directly copied from the LDK docs to account for the behavior change:
Note that if this channel is inbound (and thus our counterparty pays the commitment transaction fee) this value will be zero. For ChannelMonitors created prior to LDK 0.0.124, the channel is always treated as outbound (and thus this value is never zero).
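In other words, only the channel funder pays the commitment transaction fee, so the fee amount is subtracted from the balance only for outbound channels. A toy illustration of that rule (not LDK's actual accounting code):

```rust
// Why the doc says this value is zero for inbound channels: the counterparty
// (the funder) pays the commitment transaction fee, not us.
fn pending_commitment_fee_msat(is_outbound: bool, commit_fee_msat: u64) -> u64 {
    if is_outbound {
        commit_fee_msat // we funded the channel, so the fee comes out of our balance
    } else {
        0 // inbound channel: counterparty pays, nothing is deducted from us
    }
}

fn main() {
    assert_eq!(pending_commitment_fee_msat(true, 2_500), 2_500);
    assert_eq!(pending_commitment_fee_msat(false, 2_500), 0);
}
```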
/// The amount available to claim, in satoshis, excluding the on-chain fees which will be
/// required to do so.
Should this mention other amounts no longer included here?
Mh, I left the docs to mirror LDK's docs, and there they seem unchanged?
@@ -164,7 +166,8 @@ where
 	ConfirmationTarget::OnchainPayment => 5000,
 	ConfirmationTarget::ChannelFunding => 1000,
 	ConfirmationTarget::Lightning(ldk_target) => match ldk_target {
-		LdkConfirmationTarget::OnChainSweep => 5000,
+		LdkConfirmationTarget::MaximumFeeEstimate => 8000,
What went into picking this number out of curiosity?
Mhh, not much tbh. For the maximum I mostly put a number that's higher relative to the prior confirmation target, as I'm not super sure how best to estimate these fallbacks. That said, note that a) we currently should never use these numbers in the first place, as we enforce/block on updating fee rate estimations on startup (we could even consider putting a `debug_assert` here for the time being), and b) we should potentially re-evaluate all fee-rate related magic numbers in a follow-up eventually to make sure they still reflect the current fee market. Although, the latter should probably go hand-in-hand with updates to LDK to re-enable interop with other implementations on regtest/signet/testnet, which currently seems more or less broken for many users (cf. lightningdevkit/rust-lightning#2925).
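The fallback table discussed above can be sketched as follows (the enum here is a simplified stand-in for the real LDK/LDK Node confirmation-target types; the numbers are copied from the diff). The point is only that `MaximumFeeEstimate` is deliberately set above the other fallbacks, since it acts as an upper bound rather than a bid:

```rust
// Stand-in for the fallback fee estimates (sats per 1000 weight units) used
// when no live fee estimate has been fetched yet.
#[derive(Debug, Clone, Copy)]
enum Target {
    OnchainPayment,
    ChannelFunding,
    MaximumFeeEstimate,
}

fn fallback_sats_per_kwu(target: Target) -> u32 {
    match target {
        Target::OnchainPayment => 5000,
        Target::ChannelFunding => 1000,
        // Chosen to sit above the other fallbacks: it's a ceiling, not a bid.
        Target::MaximumFeeEstimate => 8000,
    }
}

fn main() {
    // The maximum estimate dominates every other fallback.
    assert!(fallback_sats_per_kwu(Target::MaximumFeeEstimate)
        > fallback_sats_per_kwu(Target::OnchainPayment));
    assert!(fallback_sats_per_kwu(Target::MaximumFeeEstimate)
        > fallback_sats_per_kwu(Target::ChannelFunding));
}
```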
We will be adding some wallet persistence/serialization related types in a separate module down the line, so here we prepare for it by already moving the wallet code to a module directory.
Force-pushed 912e657 to 9db3046.
Rebased to resolve minor conflict.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Reviewed up to but not including the first persistence commit.
... we update LDK, lightning-liquidity, BDK, rust-bitcoin, rust-esplora-client, rust-electrum-client, etc.
Force-pushed a0e7224 to 98fc9b5.
.. and drop the balance cache.
Force-pushed 98fc9b5 to 97784c5.
Made a few more changes (in the respective fixup commits) to make …

diff --git a/src/builder.rs b/src/builder.rs
index 97c34f0f..123f9a0d 100644
--- a/src/builder.rs
+++ b/src/builder.rs
@@ -343,4 +343,6 @@ impl NodeBuilder {
#[cfg(any(vss, vss_test))]
pub fn build_with_vss_store(&self, url: String, store_id: String) -> Result<Node, BuildError> {
+ use bitcoin::key::Secp256k1;
+
let logger = setup_logger(&self.config)?;
@@ -352,12 +354,11 @@ impl NodeBuilder {
let config = Arc::new(self.config.clone());
- let xprv = bitcoin::bip32::ExtendedPrivKey::new_master(config.network.into(), &seed_bytes)
- .map_err(|e| {
- log_error!(logger, "Failed to derive master secret: {}", e);
- BuildError::InvalidSeedBytes
- })?;
+ let xprv = bitcoin::bip32::Xpriv::new_master(config.network, &seed_bytes).map_err(|e| {
+ log_error!(logger, "Failed to derive master secret: {}", e);
+ BuildError::InvalidSeedBytes
+ })?;
let vss_xprv = xprv
- .ckd_priv(&Secp256k1::new(), ChildNumber::Hardened { index: 877 })
+ .derive_priv(&Secp256k1::new(), &[ChildNumber::Hardened { index: 877 }])
.map_err(|e| {
log_error!(logger, "Failed to derive VSS secret: {}", e);
diff --git a/src/io/vss_store.rs b/src/io/vss_store.rs
index d7308ae6..474f7dbc 100644
--- a/src/io/vss_store.rs
+++ b/src/io/vss_store.rs
@@ -138,5 +138,12 @@ impl KVStore for VssStore {
// unwrap safety: resp.value must be always present for a non-erroneous VSS response, otherwise
// it is an API-violation which is converted to [`VssError::InternalServerError`] in [`VssClient`]
- let storable = Storable::decode(&resp.value.unwrap().value[..])?;
+ let storable = Storable::decode(&resp.value.unwrap().value[..]).map_err(|e| {
+ let msg = format!(
+ "Failed to decode data read from key {}/{}/{}: {}",
+ primary_namespace, secondary_namespace, key, e
+ );
+ Error::new(ErrorKind::Other, msg)
+ })?;
+
Ok(self.storable_builder.deconstruct(storable)?.0)
}
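The `vss_store.rs` hunk above follows a common pattern: instead of letting a bare decode error bubble out of the `KVStore` read path, it attaches the namespace and key so a log line identifies *which* entry failed to decode. A self-contained sketch of the same pattern (the `decode` function here is a hypothetical placeholder for `Storable::decode`):

```rust
use std::io::{Error, ErrorKind};

// Hypothetical stand-in for `Storable::decode`: fails on empty input.
fn decode(bytes: &[u8]) -> Result<Vec<u8>, String> {
    if bytes.is_empty() {
        Err("unexpected EOF".to_string())
    } else {
        Ok(bytes.to_vec())
    }
}

// Map the decode error into an `io::Error` carrying the namespace/key context,
// mirroring the diff above.
fn read_decoded(
    primary_namespace: &str, secondary_namespace: &str, key: &str, raw: &[u8],
) -> Result<Vec<u8>, Error> {
    decode(raw).map_err(|e| {
        let msg = format!(
            "Failed to decode data read from key {}/{}/{}: {}",
            primary_namespace, secondary_namespace, key, e
        );
        Error::new(ErrorKind::Other, msg)
    })
}

fn main() {
    // The error message now names the offending entry.
    let err = read_decoded("ns", "sub", "monitor", &[]).unwrap_err();
    assert!(err.to_string().contains("ns/sub/monitor"));
    assert!(read_decoded("ns", "sub", "monitor", b"ok").is_ok());
}
```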
src/wallet/ser.rs
Outdated
let txs_len = ChangeSetSerWrapper(&self.0.txs).serialized_length() as u64;
let txouts_len = self.0.txouts.serialized_length() as u64;
let anchors_len = ChangeSetSerWrapper(&self.0.anchors).serialized_length() as u64;
let last_seen_len = self.0.last_seen.serialized_length() as u64;
let total_len = BigSize(txs_len + txouts_len + anchors_len + last_seen_len);
total_len.write(writer)?;

ChangeSetSerWrapper(&self.0.txs).write(writer)?;
self.0.txouts.write(writer)?;
ChangeSetSerWrapper(&self.0.anchors).write(writer)?;
self.0.last_seen.write(writer)?;
Should we use the TLV macros instead? Or at least write a length for each field? That way we can accommodate removed fields.
Should we use the TLV macros instead?
I don't think we can, at least not simply `impl_writeable_tlv_based`, if I'm not mistaken? I don't want to write even more boilerplate newtypes for all the individual types that the `ChangeSet`s are composed of (many of which we don't use and hence don't have serialization logic for in LDK), but we're also unable to implement `Writeable`/`Readable` directly for them. So it's more convenient to implement `Readable`/`Writeable` manually and use the `De/SerWrapper`s where needed?
Or at least write a length for each field? That way we can accommodate removed fields.
Hmm, yeah, we could. I now added a fixup that replaces the total length with per-field lengths using some new read/write macros to DRY it up a bit. Note that all this might be over the top, as BDK committed to make only additive, backwards-compatible changes to the `ChangeSet` types in the form of `Option`s. But I'd rather be safe than sorry here.
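The benefit of per-field lengths is forward compatibility: a reader that doesn't understand a field can still skip over it. A simplified, self-contained sketch of that framing (plain big-endian `u64` lengths stand in for LDK's variable-length `BigSize` integers):

```rust
// Each field is written as (length, bytes), so unknown fields can be skipped.
fn write_field(out: &mut Vec<u8>, field: &[u8]) {
    out.extend_from_slice(&(field.len() as u64).to_be_bytes());
    out.extend_from_slice(field);
}

// Reads one length-prefixed field, advancing the cursor past it.
fn read_field<'a>(buf: &mut &'a [u8]) -> Option<&'a [u8]> {
    let s: &'a [u8] = *buf;
    if s.len() < 8 {
        return None;
    }
    let (len_bytes, rest) = s.split_at(8);
    let len = u64::from_be_bytes(len_bytes.try_into().unwrap()) as usize;
    if rest.len() < len {
        return None;
    }
    let (field, rest) = rest.split_at(len);
    *buf = rest;
    Some(field)
}

fn main() {
    let mut out = Vec::new();
    write_field(&mut out, b"txs");
    write_field(&mut out, b"some-future-field");
    let mut cursor: &[u8] = &out;
    assert_eq!(read_field(&mut cursor).unwrap(), &b"txs"[..]);
    // An old reader can skip a field it doesn't understand and keep going.
    assert!(read_field(&mut cursor).is_some());
    assert!(cursor.is_empty());
}
```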
Mh, nevermind the "I don't think we can" part. Went down the rabbit hole of cleaning up the serialization based on macros, until I realized that everything I wrote was already available in LDK. While I still don't think we can just use the high-level macros like `impl_writeable_tlv_based`, I now switched to use `encode_tlv_stream`/`decode_tlv_stream`.
// We require a descriptor and return `None` to signal creation of a new wallet otherwise.
if let Some(descriptor) =
	read_bdk_wallet_descriptor(Arc::clone(&kv_store), Arc::clone(&logger))?
{
	change_set.descriptor = Some(descriptor);
} else {
	return Ok(None);
}

// We require a change_descriptor and return `None` to signal creation of a new wallet otherwise.
if let Some(change_descriptor) =
	read_bdk_wallet_change_descriptor(Arc::clone(&kv_store), Arc::clone(&logger))?
{
	change_set.change_descriptor = Some(change_descriptor);
} else {
	return Ok(None);
}

// We require a network and return `None` to signal creation of a new wallet otherwise.
if let Some(network) = read_bdk_wallet_network(Arc::clone(&kv_store), Arc::clone(&logger))? {
	change_set.network = Some(network);
} else {
	return Ok(None);
}
So this is saying if we ever failed to write one of these, then we will start with a fresh wallet? Presumably this is safe because the data below would never have been written?
Presumably this is safe because the data below would never have been written?

Right, `descriptor`/`change_descriptor`/`network` are mandatory and we'll always write them before any of the other data. I now added a `debug_assert` to check this invariant in `persist`.
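The invariant can be sketched with toy types (not the real `ChangeSet`): loading returns `None` to signal "create a fresh wallet" unless all three mandatory fields are present, and persisting debug-asserts that they are set before anything else is written.

```rust
// Toy model of the mandatory-fields invariant described above.
#[derive(Default)]
struct ChangeSet {
    descriptor: Option<String>,
    change_descriptor: Option<String>,
    network: Option<String>,
}

// Any missing mandatory field means no complete wallet was ever persisted,
// so we signal creation of a fresh wallet by returning `None`.
fn load(stored: ChangeSet) -> Option<ChangeSet> {
    stored.descriptor.as_ref()?;
    stored.change_descriptor.as_ref()?;
    stored.network.as_ref()?;
    Some(stored)
}

// The mandatory fields are always written first, so they must be present in
// memory before any other data hits the store.
fn persist(cs: &ChangeSet) {
    debug_assert!(cs.descriptor.is_some());
    debug_assert!(cs.change_descriptor.is_some());
    debug_assert!(cs.network.is_some());
    // ... write the fields ...
}

fn main() {
    // Nothing persisted yet: signal a fresh wallet.
    assert!(load(ChangeSet::default()).is_none());
    let full = ChangeSet {
        descriptor: Some("descriptor".into()),
        change_descriptor: Some("change_descriptor".into()),
        network: Some("signet".into()),
    };
    persist(&full);
    assert!(load(full).is_some());
}
```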
Force-pushed 97784c5 to 4687ee2.
.. this checks that these fields are indeed mandatory and always present in memory before we go ahead writing other fields.
.. when necessary. Or at least when we *think* it may be necessary, i.e., every time we take `mut locked_wallet`.
Force-pushed 4687ee2 to 0fcd231.
src/wallet/ser.rs
Outdated
 let len = BigSize(self.0.len() as u64);
 len.write(writer)?;
 for (time, txid) in self.0.iter() {
-	ChangeSetSerWrapper(time).write(writer)?;
-	txid.write(writer)?;
+	write_len_prefixed_field!(writer, ChangeSetSerWrapper(time))?;
+	write_len_prefixed_field!(writer, txid)?;
 }
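A hypothetical sketch of what such a `write_len_prefixed_field!`-style macro could look like (the trait, macro, and types here are simplified stand-ins, and a plain `u64` replaces LDK's `BigSize`): serialize the field into a scratch buffer, then emit the buffer's length followed by its bytes.

```rust
use std::io::{self, Write};

// Simplified stand-in for LDK's `Writeable` trait.
trait Writeable {
    fn write<W: Write>(&self, w: &mut W) -> io::Result<()>;
}

impl Writeable for u32 {
    fn write<W: Write>(&self, w: &mut W) -> io::Result<()> {
        w.write_all(&self.to_be_bytes())
    }
}

// Serialize to a scratch buffer first so we know the length to prefix.
macro_rules! write_len_prefixed_field {
    ($writer:expr, $field:expr) => {{
        let mut buf = Vec::new();
        $field.write(&mut buf)?;
        $writer.write_all(&(buf.len() as u64).to_be_bytes())?;
        $writer.write_all(&buf)
    }};
}

// Encode one (time, txid) pair as two length-prefixed fields.
fn encode(time: u32, txid: u32) -> io::Result<Vec<u8>> {
    let mut out = Vec::new();
    write_len_prefixed_field!(&mut out, time)?;
    write_len_prefixed_field!(&mut out, txid)?;
    Ok(out)
}

fn main() {
    let bytes = encode(1_700_000_000, 42).unwrap();
    // Each u32 field: 8-byte length prefix + 4 payload bytes.
    assert_eq!(bytes.len(), 24);
}
```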
Maybe we could use `WithoutLength` and `Iterable` using `encoding` in `encode_tlv_stream`? `Iterable` is `pub(crate)` unfortunately, so we would need to copy the code. Likewise elsewhere.
Closes #195.
... we update LDK, lightning-liquidity, BDK, rust-bitcoin, rust-esplora-client,
rust-electrum-client, etc.
Notably, we mostly maintain the existing functionality here; any feature upgrades and refactors (e.g., adding additional chain data sources, which we now can finally do) will happen in follow-up PRs.
Should be ready-for-review. The last commit can be dropped if BDK ships their next release during the review process, otherwise we'll (partially) revert it afterwards (see #359).