
Remove removed past PathHashes #7080

Closed
wants to merge 2 commits into from

Conversation

@benaadams (Member) commented May 26, 2024

Changes

  • When memory pruning an item, remove its entry from the LruCache (as it has just been pruned)
  • Shrink the default size of the LruCache from 2.1M items to 1M items

Otherwise, after a couple of prunes the cache quickly grows to the maximum 2.1M items, some of which will never be used again.
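The first change can be sketched as follows. This is a minimal Python illustration of the idea only; the `PastPathHashCache` class and its method names are hypothetical stand-ins for Nethermind's actual C# `LruCache`:

```python
from collections import OrderedDict

class PastPathHashCache:
    """Illustrative LRU sketch (not Nethermind's actual LruCache)."""
    def __init__(self, max_items: int = 1_000_000):  # shrunk default, was ~2.1M
        self.max_items = max_items
        self._items: OrderedDict = OrderedDict()

    def set(self, key, value):
        if key in self._items:
            self._items.move_to_end(key)
        self._items[key] = value
        if len(self._items) > self.max_items:
            self._items.popitem(last=False)  # evict least recently used

    def delete(self, key):
        # The fix: once a node is pruned from memory, its past-path entry
        # can never be used again, so drop it immediately instead of
        # letting it linger until LRU eviction pushes it out.
        self._items.pop(key, None)

cache = PastPathHashCache(max_items=2)
cache.set("a", 1)
cache.set("b", 2)
cache.delete("a")   # pruned node: remove its entry right away
cache.set("c", 3)   # no eviction needed; "a" already freed its slot
```

Without the `delete` call, inserting "c" would have evicted "b" while the dead "a" entry kept occupying a slot.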


Types of changes

What types of changes does your code introduce?

  • Optimization

Testing

Requires testing

  • Yes
  • No

If yes, did you write tests?

  • Yes
  • No

Notes on testing

Optional. Remove if not applicable.

Documentation

Requires documentation update

  • Yes
  • No

If yes, link the PR to the docs update or the issue with the details labeled docs. Remove if not applicable.

Requires explanation in Release Notes

  • Yes
  • No

If yes, fill in the details here. Remove if not applicable.

Remarks

Optional. Remove if not applicable.

@@ -104,11 +104,15 @@ public Task Execute(CancellationToken cancellationToken)
if (_logger.IsWarn) _logger.Warn($"Detected {pruningConfig.CacheMb}MB of pruning cache config. Pruning cache more than 2000MB is not recommended as it may cause long memory pruning time which affect attestation.");
}

// 8 bytes Node pointer, (LinkedListNode 8 pointer + 8 pointer) + (HashAndTinyPath: 32 bytes for value + 8 bytes for path) + 32 bytes ValueHash256
int KeyEntryMemorySize = 8 + (8 + 8) + (32 + 8) + 32; // 96 bytes
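The arithmetic in that comment checks out, and it also shows where the 2.1M figure comes from; a quick check (the `max_entries` helper is a hypothetical illustration, not code from the PR):

```python
# Back-of-envelope check of the per-entry size from the diff comment:
# 8 (Node pointer) + (8 + 8) (LinkedListNode prev/next pointers)
# + (32 + 8) (HashAndTinyPath: 32-byte value + 8-byte path)
# + 32 (ValueHash256)
key_entry_bytes = 8 + (8 + 8) + (32 + 8) + 32
assert key_entry_bytes == 96

# Hypothetical sizing: how many entries fit in a given cache budget.
def max_entries(cache_mb: int, entry_bytes: int = key_entry_bytes) -> int:
    return (cache_mb * 1024 * 1024) // entry_bytes

print(max_entries(192))  # → 2097152, i.e. ~2.1M entries at 96 bytes each
```

At 96 bytes per entry, a ~192 MB budget yields exactly 2^21 = 2,097,152 entries, matching the 2.1M maximum mentioned above.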
Contributor

Can you also double the default config?

Member Author

Does it need 2.1M hashes?

Member Author

GC time is a function of the number of objects (rather than their size). I can redesign an LRU so it uses a giant array and indexes rather than pointers, if you want to go this high.
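An index-based LRU of the kind described might look like the sketch below: the doubly linked list is stored as parallel integer arrays, so the GC sees a handful of large arrays instead of millions of node objects. This is a rough illustration of the technique, not the design that #7100 actually landed:

```python
class ArrayLru:
    """LRU with an intrusive linked list held in int arrays (no node objects)."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.keys = [None] * capacity
        self.values = [None] * capacity
        self.prev = [-1] * capacity   # slot index of previous entry, -1 = none
        self.next = [-1] * capacity   # slot index of next entry, -1 = none
        self.index = {}               # key -> slot
        self.head = -1                # most recently used slot
        self.tail = -1                # least recently used slot
        self.count = 0

    def _unlink(self, i):
        p, n = self.prev[i], self.next[i]
        if p != -1: self.next[p] = n
        else: self.head = n
        if n != -1: self.prev[n] = p
        else: self.tail = p

    def _push_front(self, i):
        self.prev[i] = -1
        self.next[i] = self.head
        if self.head != -1: self.prev[self.head] = i
        self.head = i
        if self.tail == -1: self.tail = i

    def get(self, key):
        i = self.index.get(key)
        if i is None: return None
        self._unlink(i); self._push_front(i)   # mark as most recently used
        return self.values[i]

    def set(self, key, value):
        i = self.index.get(key)
        if i is not None:                       # update existing entry
            self.values[i] = value
            self._unlink(i); self._push_front(i)
            return
        if self.count < self.capacity:          # fill a fresh slot
            i = self.count; self.count += 1
        else:                                   # evict least recently used
            i = self.tail
            self._unlink(i)
            del self.index[self.keys[i]]
        self.keys[i], self.values[i] = key, value
        self.index[key] = i
        self._push_front(i)
```

Eviction reuses the tail's slot in place, so the structure allocates nothing after warm-up, which is exactly the property that keeps GC pause time flat as the cache grows.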

Contributor

Yes, I've specifically tuned this amount.

Member Author

PR to eradicate the objects #7100

{
TreePath fullPath = key.path.ToTreePath(); // Micro op to reduce double convert
if (CanRemove(key.addr, key.path, fullPath, prevHash, keyValuePair.Value))
{
Metrics.RemovedNodeCount++;
writeBatch.Remove(key.addr, fullPath, prevHash);
_pastPathHash.Delete(key);
Contributor

Will this make memory pruning slower (since it does more work) or faster (since the cache becomes smaller)?

@benaadams closed this May 28, 2024
2 participants