
backports: lotus-provider: fixes caught in rc1-testing #11496

Closed
rjan90 wants to merge 13 commits into release/v1.25.2 from phi-lp-backports

Conversation

@rjan90 (Contributor) commented Dec 7, 2023

Proposed Changes

This PR backports lotus-provider fixes for bugs caught in testing of v1.25.1-rc1 into the release/v1.25.1 branch. The backports are:

Additional Info

Checklist

Before you mark the PR ready for review, please make sure that:

  • Commits have a clear commit message.
  • PR title is in the form of <PR type>: <area>: <change being made>
    • example: fix: mempool: Introduce a cache for valid signatures
    • PR type: fix, feat, build, chore, ci, docs, perf, refactor, revert, style, test
    • area, e.g. api, chain, state, market, mempool, multisig, networking, paych, proving, sealing, wallet, deps
  • If the PR affects users (e.g., new feature, bug fix, system requirements change), update the CHANGELOG.md and add details to the UNRELEASED section.
  • New features have usage guidelines and / or documentation updates
  • Tests exist for new functionality or change in behavior
  • CI is green

@rjan90 rjan90 changed the title backports: lotus-provider: backport fixes caught in rc1-testing backports: lotus-provider: fixes caught in rc1-testing Dec 8, 2023
@rjan90 rjan90 changed the base branch from release/v1.25.1 to release/v1.25.2 December 11, 2023 11:07
Stebalien and others added 8 commits December 11, 2023 15:18
Also explicitly limit how many bytes we're willing to read in one go
such that we're capable of reading a worst-case tipset (like, really,
never going to happen worst-case). Previously, this wasn't an issue.
However, we've bumped the max number of messages from 8,192 to 150,000
and need to limit allocations somewhere else.
It's not a const for the testground build, and needs to be an int 99%
of the time. So we might as well just cast here.
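The read-capping change described in the commit message above can be illustrated with a short Go sketch. This is a hedged illustration, not the actual Lotus code: the cap value and the identifiers (maxTipsetBytes, maxMessages, readCapped, withinLimit) are assumptions made for the example, and Lotus's real limits and decoding path differ.

```go
package main

import (
	"fmt"
	"io"
	"strings"
)

// maxTipsetBytes is a hypothetical upper bound on how many bytes we are
// willing to read in one go; it is NOT the constant Lotus uses. It is sized
// for a "never going to happen" worst-case tipset.
const maxTipsetBytes = 4 << 20 // 4 MiB, illustrative only

// maxMessages stands in for a build parameter that is a var (not a const)
// in some builds; casting at the use site keeps call sites uniform.
var maxMessages uint64 = 150000 // illustrative name and value

// readCapped reads from r but never allocates more than maxTipsetBytes,
// so raising the message-count limit elsewhere cannot blow up memory here.
func readCapped(r io.Reader) ([]byte, error) {
	return io.ReadAll(io.LimitReader(r, maxTipsetBytes))
}

// withinLimit shows the cast-at-use-site pattern from the second commit note.
func withinLimit(n int) bool {
	return n <= int(maxMessages)
}

func main() {
	data, err := readCapped(strings.NewReader("example payload"))
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}
	fmt.Printf("read %d bytes (within limit: %v)\n", len(data), withinLimit(1))
}
```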
@rjan90 (Contributor, Author) commented Dec 11, 2023

Closing due to the bump from v1.25.1-rc1 --> v1.25.2-rc2, to make room for the hot-fix. Will open a new PR based off v1.25.2-rc1.

@rjan90 rjan90 closed this Dec 11, 2023
@rjan90 rjan90 deleted the phi-lp-backports branch December 11, 2023 14:54