
failing link: internal error with error: remote error #8315

Closed
SunnySarahNode opened this issue Dec 25, 2023 · 4 comments
Labels
bug Unintended code behaviour

Comments

@SunnySarahNode

SunnySarahNode commented Dec 25, 2023

Me: LND 0.17.3, bitcoind 25 (fully indexed, mempool = 3 GB), Ubuntu, clearnet only
My peer: LND 0.17.3, bitcoind 25, Linux, hybrid (clearnet + Tor)

Immediately after the connection with my peer is restored, my HSWC fails the link.

The log from my side is quite short:

 [ERR] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): failing link: ChannelPoint(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): received error from peer: chan_id=3e5b08f59ea851889964fbb5d4d6e6a722c4b488d68f2162f82eca9d6177e875, err=internal error with error: remote error
 [ERR] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): link failed, exiting htlcManager

The log from my peer's side (update: there are more logs below):

 [DBG] PEER: Peer(03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a): Received ChannelReestablish(chan_id=3e5b08f59ea851889964fbb5d4d6e6a722c4b488d68f2162f82eca9d6177e875, next_local_height=298654, remote_tail_height=299282) from 03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735
 [DBG] PEER: Peer(03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a): Received NodeAnnouncement(node=03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a, update_time=2023-12-25 20:07:36 +0000 UTC) from 03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735
 [DBG] CRTR: Waiting for dependent on job=lnwire.NodeAnnouncement, pub=03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a
 [DBG] DISC: Processing NodeAnnouncement: peer=03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735, timestamp=2023-12-25 20:07:36 +0000 UTC, node=03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a
 [DBG] PEER: Peer(03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a): Sending UpdateFailHTLC(chan_id=3e5b08f59ea851889964fbb5d4d6e6a722c4b488d68f2162f82eca9d6177e875, id=66407, reason=[very long string with numbers and letters]) to 03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735
 [DBG] PEER: Peer(03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a): Sending Error(chan_id=3e5b08f59ea851889964fbb5d4d6e6a722c4b488d68f2162f82eca9d6177e875, err=internal error) to 03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735
 [DBG] DISC: Processed NodeAnnouncement: peer=03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735, timestamp=2023-12-25 20:07:36 +0000 UTC, node=03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a
 [DBG] PEER: Peer(03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a): Received Error(chan_id=3e5b08f59ea851889964fbb5d4d6e6a722c4b488d68f2162f82eca9d6177e875, err=remote error) from 03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735

There are some pending HTLCs in our channel:


"pending_htlcs": [
                {
                    "incoming": false,
                    "amount": "21392",
                    "hash_lock": "90...52",
                    "expiration_height": 823363,
                    "htlc_index": "66405",
                    "forwarding_channel": "828096082988040192",
                    "forwarding_htlc_index": "91351"
                },
                {
                    "incoming": false,
                    "amount": "50699",
                    "hash_lock": "15...7e",
                    "expiration_height": 823842,
                    "htlc_index": "66406",
                    "forwarding_channel": "0",
                    "forwarding_htlc_index": "0"
                },
                {
                    "incoming": false,
                    "amount": "50000",
                    "hash_lock": "d6...05",
                    "expiration_height": 823388,
                    "htlc_index": "66407",
                    "forwarding_channel": "837643142481641473",
                    "forwarding_htlc_index": "1219794"
                },
                {
                    "incoming": false,
                    "amount": "77067",
                    "hash_lock": "8c...d4",
                    "expiration_height": 823394,
                    "htlc_index": "66408",
                    "forwarding_channel": "0",
                    "forwarding_htlc_index": "0"
                },
                {
                    "incoming": false,
                    "amount": "9265",
                    "hash_lock": "ff...c1",
                    "expiration_height": 823468,
                    "htlc_index": "66409",
                    "forwarding_channel": "889657739176509441",
                    "forwarding_htlc_index": "73580"
                },
                {
                    "incoming": true,
                    "amount": "17782",
                    "hash_lock": "4b...17",
                    "expiration_height": 823386,
                    "htlc_index": "89415",
                    "forwarding_channel": "0",
                    "forwarding_htlc_index": "0"
                }
            ]

Please note the HTLC with "htlc_index": "66407", because it appears in the log:

UpdateFailHTLC(chan_id=3e5b08f59ea851889964fbb5d4d6e6a722c4b488d68f2162f82eca9d6177e875, id=66407

The state of the channel (from my side):

            "capacity": "5500000",
            "local_balance": "5171962",
            "remote_balance": "58063",
            "commit_fee": "43110",
            "commit_weight": "2148",
            "fee_per_kw": "19994",
            "unsettled_balance": "226205"
            "csv_delay": 660,
            "chan_status_flags": "ChanStatusDefault",
            "local_chan_reserve_sat": "55000",
            "remote_chan_reserve_sat": "55000",
            "static_remote_key": false,
            "commitment_type": "ANCHORS",
            "close_address": "",
            "push_amount_sat": "0",
            "thaw_height": 0,
            "zero_conf": false,
            "zero_conf_confirmed_scid": "0",

Restarting my peer's node doesn't help; the channel remains disabled.

This channel will be force-closed soon because of an HTLC expiration.
Can I (or my peer) do something to prevent it?

We are ready to provide any necessary information.

@SunnySarahNode SunnySarahNode added bug Unintended code behaviour needs triage labels Dec 25, 2023
@SunnySarahNode
Author

SunnySarahNode commented Dec 25, 2023

I received additional information from my peer:

2023-12-25 20:50:41.068 [INF] SRVR: Established connection to: 03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735
2023-12-25 20:50:41.068 [INF] SRVR: Finalizing connection to 03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a@77.72.85.173:9735, inbound=false
2023-12-25 20:50:41.352 [INF] PEER: Peer(03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a): Loading ChannelPoint(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1), isPending=false
2023-12-25 20:50:41.352 [INF] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): starting
2023-12-25 20:50:41.352 [INF] CNCT: Attempting to update ContractSignals for ChannelPoint(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1)
2023-12-25 20:50:41.352 [INF] PEER: Peer(03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a): Negotiated chan series queries
2023-12-25 20:50:41.352 [INF] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): HTLC manager started, bandwidth=0 mSAT
2023-12-25 20:50:41.352 [INF] DISC: Creating new GossipSyncer for peer=03423790614f023e3c0cdaa654a3578e919947e4c3a14bf5044e7c787ebd11af1a
2023-12-25 20:50:41.352 [INF] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): Attempting to re-synchronize channel: SCID=804644:917:1, status=ChanStatusDefault, initiator=true, pending=false, local commitment has height=299282, local_htlc_index=89415, local_log_index=156127, remote_htlc_index=66413, remote_log_index=155828, remote commitment has height=298653, local_htlc_index=89416, local_log_index=156128, remote_htlc_index=66410, remote_log_index=155825
2023-12-25 20:50:41.353 [INF] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): received re-establishment message from remote side
2023-12-25 20:50:41.433 [ERR] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): failing link: unable to update commitment: commitment transaction dips peer below chan reserve: our balance below chan reserve with error: internal error
2023-12-25 20:50:41.433 [ERR] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): link failed, exiting htlcManager
2023-12-25 20:50:41.433 [INF] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): exited
2023-12-25 20:50:41.433 [INF] HSWC: ChannelLink(74e877619dca2ef862218fd688b4c422a7e6d6d4b5fb64998851a89ef5085b3e:1): stopping

commitment transaction dips peer below chan reserve: our balance below chan reserve with error: internal error

On my advice, my peer added this line:

max-channel-fee-allocation=1

...to his lnd.conf file and restarted the node.

Unfortunately, it didn't help.

@yyforyongyu
Member

The remote node has a balance of 58063 and a reserve requirement of 55000, plus an outgoing HTLC of 50000, which caused this line to be executed:

case ourBalance < ourInitialBalance && ourBalance < ourReserve:

Unfortunately, I don't think there's an easy fix, as the channel has already dipped below the channel reserve. However, this should not happen again once #8096 is merged.
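A simplified sketch of that check with the numbers from this issue (satoshis only; lnd's real accounting works in millisatoshis and also factors in fees, so treat this as an illustration rather than the actual implementation):

package main

import "fmt"

func main() {
	// Figures from this issue, in satoshis.
	ourBalance := int64(58063) // remote node's channel balance
	ourReserve := int64(55000) // its channel reserve
	htlcAmt := int64(50000)    // the retransmitted outgoing HTLC (id=66407)

	// Subtracting the HTLC leaves 8063 sat, below the 55000 sat reserve,
	// so the commitment update is rejected and the link fails.
	if ourBalance-htlcAmt < ourReserve {
		fmt.Printf("dips below reserve: %d < %d\n",
			ourBalance-htlcAmt, ourReserve)
	}
}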

@jvxis

jvxis commented Dec 31, 2023

Hey @yyforyongyu, can you guys release a hotfix for these cases so the channel can come back online? Then we could send an HTLC to rebalance the channel manually. I'm now on my fifth channel lost to this bug.

@Roasbeef
Member

Roasbeef commented Jan 2, 2024

Unfortunately, today nodes will just retransmit exactly what they sent before the last connection. In this case, it'll keep sending the HTLC that violates the reserve constraints. This is a known edge case in the spec. The PR linked above should prevent it, but since there's no way to revert a commitment in the protocol today, the channel will need to be force-closed in order to continue.

On my advice, my peer added this line:
max-channel-fee-allocation=1

That config will allow 100% of your balance to go to fees, which I don't think is what you actually want.
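For reference, a hedged lnd.conf sketch (0.5 is the shipped default for this option, so simply removing the override has the same effect):

[Application Options]
; Fraction of the channel balance that may be allocated to commitment
; fees. 1 allows 100% of the balance to go to fees; the default is 0.5.
max-channel-fee-allocation=0.5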

What's happening is that you (or your peer) don't have enough balance in the channel to pay for the HTLC.

Closing for now as we have #8096 which will work around this quirk in the spec to prevent this from happening in the future.

This might be happening more frequently due to persistently higher on-chain fees. One way to work around it would be to increase the reserve of the channels you open from now on. This gives you more buffer room to pay for HTLC fees. Ultimately, there's a circular requirement here: you need fees to be able to pay for HTLCs, even when sending.
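A rough worked example of that circularity, using the fee_per_kw from the channel state above and BOLT #3's 172 WU of weight per HTLC output (everything else is simplified, so this is an illustration rather than lnd's actual fee accounting):

package main

import "fmt"

func main() {
	const htlcOutputWeight = 172 // weight one extra HTLC output adds (BOLT #3)

	feePerKw := int64(19994) // fee_per_kw from the channel state above
	reserve := int64(55000)  // channel reserve, sat
	balance := int64(58063)  // sending side's balance, sat
	htlcAmt := int64(50000)  // HTLC to add, sat

	// Extra commitment fee the fee-paying side owes for one more HTLC output.
	extraFee := feePerKw * htlcOutputWeight / 1000 // 3438 sat

	remaining := balance - htlcAmt - extraFee
	fmt.Printf("remaining=%d vs reserve=%d -> ok=%v\n",
		remaining, reserve, remaining >= reserve)
	// remaining=4625, below the reserve: the HTLC cannot be added.
}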

We can continue in a discussion if y'all would like.

@lightningnetwork lightningnetwork locked and limited conversation to collaborators Jan 2, 2024
@Roasbeef Roasbeef converted this issue into discussion #8332 Jan 2, 2024
