feat(store-sync): sync to RECS #1197
Merged
Changes from all commits (13 commits):
- a9032fc feat(store-sync): add syncToRecs strategy and utils (holic)
- 3e359ef remove done TODO (holic)
- 75552db remove other todo (holic)
- b6cb5e4 Create great-cooks-dream.md (holic)
- 797b878 Update great-cooks-dream.md (holic)
- 9c6623f Merge remote-tracking branch 'origin/main' into holic/store-sync-recs (holic)
- 9f94d84 rework waitForTransaction (holic)
- 6ba5a5f refactor a bit (holic)
- 509380d move types around (holic)
- b85e56c only update sync progress every 1000 records hydrated (holic)
- 2e7da90 update roughly once per percent (holic)
- 84c54cd move off of behavior subject (holic)
- 543334c Merge remote-tracking branch 'origin/main' into holic/store-sync-recs (holic)
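Two of the commit messages describe throttling hydration progress updates ("only update sync progress every 1000 records hydrated", then "update roughly once per percent"). A minimal sketch of once-per-percent reporting, with illustrative names that are not the package's API:

```typescript
// Hedged sketch: emit a progress update only when the integer percent
// changes, rather than once per hydrated record.
function createProgressReporter(total: number, onUpdate: (pct: number) => void): () => void {
  let lastPct = -1;
  let done = 0;
  return () => {
    done++;
    const pct = Math.floor((done / total) * 100);
    if (pct !== lastPct) {
      // only fires ~101 times (0..100) no matter how many records there are
      lastPct = pct;
      onUpdate(pct);
    }
  };
}

const updates: number[] = [];
const tick = createProgressReporter(1000, (pct) => updates.push(pct));
for (let i = 0; i < 1000; i++) tick();
console.log(updates.length); // 101 updates (0% through 100%)
```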
`.changeset/great-cooks-dream.md` (new file):

---
"@latticexyz/store-sync": patch
---

Add RECS sync strategy and corresponding utils

```ts
import { createPublicClient, http } from 'viem';
import { syncToRecs } from '@latticexyz/store-sync';
import storeConfig from 'contracts/mud.config';
import { defineContractComponents } from './defineContractComponents';

const publicClient = createPublicClient({
  chain,
  transport: http(),
  pollingInterval: 1000,
});

const { components, singletonEntity, latestBlock$, blockStorageOperations$, waitForTransaction } = await syncToRecs({
  world,
  config: storeConfig,
  address: '0x...',
  publicClient,
  components: defineContractComponents(...),
});
```
`common.ts` (new file):

```ts
import { KeySchema, ValueSchema } from "@latticexyz/store";

export type StoreComponentMetadata = {
  keySchema: KeySchema;
  valueSchema: ValueSchema;
};

export enum SyncStep {
  INITIALIZE = "initialize",
  SNAPSHOT = "snapshot",
  RPC = "rpc",
  LIVE = "live",
}
```
`debug.ts` (new file):

```ts
import { debug as parentDebug } from "../debug";

export const debug = parentDebug.extend("recs");
```
`decodeEntity.ts` (new file):

```ts
import { Entity } from "@latticexyz/recs";
import { StaticAbiType } from "@latticexyz/schema-type";
import { Hex, decodeAbiParameters } from "viem";
import { SchemaToPrimitives } from "@latticexyz/store";
import { entityToHexKeyTuple } from "./entityToHexKeyTuple";

export function decodeEntity<TKeySchema extends Record<string, StaticAbiType>>(
  keySchema: TKeySchema,
  entity: Entity
): SchemaToPrimitives<TKeySchema> {
  const hexKeyTuple = entityToHexKeyTuple(entity);
  if (hexKeyTuple.length !== Object.keys(keySchema).length) {
    throw new Error(
      `entity key tuple length ${hexKeyTuple.length} does not match key schema length ${Object.keys(keySchema).length}`
    );
  }
  return Object.fromEntries(
    Object.entries(keySchema).map(([key, type], index) => [
      key,
      decodeAbiParameters([{ type }], hexKeyTuple[index] as Hex)[0],
    ])
  ) as SchemaToPrimitives<TKeySchema>;
}
```
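`decodeEntity` splits the entity into 32-byte words and ABI-decodes one word per key in the schema. A hypothetical, dependency-free sketch for two common static types (the real code delegates to viem's `decodeAbiParameters` for the general case):

```typescript
// Illustrative decoding of one 32-byte word per key:
// uint256 reads the word as a big-endian number; address keeps the last 20 bytes.
type Word = string; // a 32-byte, "0x"-prefixed hex word

function decodeWord(type: "uint256" | "address", word: Word): bigint | string {
  return type === "uint256" ? BigInt(word) : "0x" + word.slice(-40);
}

const keySchema = { id: "uint256", owner: "address" } as const;
const words: Word[] = [
  "0x" + "0".repeat(63) + "7",             // uint256 value 7
  "0x" + "0".repeat(24) + "ab".repeat(20), // address, left-padded to 32 bytes
];
// mirror decodeEntity's schema-driven mapping from words to named keys
const decoded = Object.fromEntries(
  Object.entries(keySchema).map(([key, type], i) => [key, decodeWord(type, words[i])] as [string, bigint | string])
);
console.log(decoded.id, decoded.owner); // 7n "0xabab…ab"
```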
`defineInternalComponents.ts` (new file):

```ts
import { World, defineComponent, Type } from "@latticexyz/recs";
import { Table } from "../common";
import { StoreComponentMetadata } from "./common";

// eslint-disable-next-line @typescript-eslint/explicit-function-return-type
export function defineInternalComponents(world: World) {
  return {
    TableMetadata: defineComponent<{ table: Type.T }, StoreComponentMetadata, Table>(
      world,
      { table: Type.T },
      { metadata: { keySchema: {}, valueSchema: {} } }
    ),
    SyncProgress: defineComponent(
      world,
      {
        step: Type.String,
        message: Type.String,
        percentage: Type.Number,
      },
      {
        metadata: { keySchema: {}, valueSchema: {} },
      }
    ),
  };
}
```
`encodeEntity.ts` (new file):

```ts
import { Entity } from "@latticexyz/recs";
import { StaticAbiType } from "@latticexyz/schema-type";
import { encodeAbiParameters } from "viem";
import { SchemaToPrimitives } from "@latticexyz/store";
import { hexKeyTupleToEntity } from "./hexKeyTupleToEntity";

export function encodeEntity<TKeySchema extends Record<string, StaticAbiType>>(
  keySchema: TKeySchema,
  key: SchemaToPrimitives<TKeySchema>
): Entity {
  if (Object.keys(keySchema).length !== Object.keys(key).length) {
    throw new Error(
      `key length ${Object.keys(key).length} does not match key schema length ${Object.keys(keySchema).length}`
    );
  }
  return hexKeyTupleToEntity(
    Object.entries(keySchema).map(([keyName, type]) => encodeAbiParameters([{ type }], [key[keyName]]))
  );
}
```
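`encodeEntity` goes the other way: each key is ABI-encoded into a 32-byte word and the words are concatenated into one entity string. A minimal sketch for uint256 keys, assuming viem's `encodeAbiParameters` left-pads static values to 32 bytes:

```typescript
// Illustrative stand-in for ABI-encoding a uint256 as one 32-byte word.
function encodeUint256(value: bigint): string {
  return "0x" + value.toString(16).padStart(64, "0");
}

const keySchema = { id: "uint256", index: "uint256" } as const;
const key = { id: 7n, index: 255n };
// one word per schema entry, concatenated into a single entity string
const entity =
  "0x" +
  Object.keys(keySchema)
    .map((name) => encodeUint256(key[name as keyof typeof key]).slice(2))
    .join("");
console.log(entity.length); // 130 characters: "0x" + 2 * 64 hex digits
```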
`entityToHexKeyTuple.ts` (new file):

```ts
import { Entity } from "@latticexyz/recs";
import { Hex, sliceHex, size, isHex } from "viem";

export function entityToHexKeyTuple(entity: Entity): readonly Hex[] {
  if (!isHex(entity)) {
    throw new Error(`entity ${entity} is not a hex string`);
  }
  const length = size(entity);
  if (length % 32 !== 0) {
    throw new Error(`entity length ${length} is not a multiple of 32 bytes`);
  }
  return new Array(length / 32).fill(0).map((_, index) => sliceHex(entity, index * 32, (index + 1) * 32));
}
```
`getTableKey.ts` (new file):

```ts
import { Address, getAddress } from "viem";
import { Table, TableName, TableNamespace } from "../common";

export type TableKey = `${Address}:${TableNamespace}:${TableName}`;

export function getTableKey(table: Pick<Table, "address" | "namespace" | "name">): TableKey {
  return `${getAddress(table.address)}:${table.namespace}:${table.name}`;
}
```
`hexKeyTupleToEntity.ts` (new file):

```ts
import { Entity } from "@latticexyz/recs";
import { Hex, concatHex } from "viem";

export function hexKeyTupleToEntity(hexKeyTuple: readonly Hex[]): Entity {
  return concatHex(hexKeyTuple as Hex[]) as Entity;
}
```
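Taken together, `hexKeyTupleToEntity` and `entityToHexKeyTuple` are inverse operations: an entity is just the 32-byte key words concatenated into one hex string. A dependency-free sketch of the round trip (viem's `concatHex` and `sliceHex` are assumed here to behave like these plain string operations):

```typescript
// Illustrative stand-ins for concatHex / 32-byte sliceHex on "0x" strings.
function concatWords(words: string[]): string {
  return "0x" + words.map((w) => w.slice(2)).join("");
}

function splitWords(entity: string): string[] {
  const body = entity.slice(2);
  const out: string[] = [];
  for (let i = 0; i < body.length; i += 64) out.push("0x" + body.slice(i, i + 64)); // 64 hex chars = 32 bytes
  return out;
}

const tuple = ["0x" + "11".repeat(32), "0x" + "22".repeat(32)];
const entity = concatWords(tuple);
console.log(splitWords(entity).every((w, i) => w === tuple[i])); // true
```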
`index.ts` (new file):

```ts
export * from "./common";
export * from "./decodeEntity";
export * from "./encodeEntity";
export * from "./entityToHexKeyTuple";
export * from "./hexKeyTupleToEntity";
export * from "./recsStorage";
export * from "./syncToRecs";
```
`recsStorage.ts` (new file):

```ts
import { BlockLogsToStorageOptions } from "../blockLogsToStorage";
import { StoreConfig } from "@latticexyz/store";
import { debug } from "./debug";
import {
  ComponentValue,
  Entity,
  Component as RecsComponent,
  Schema as RecsSchema,
  getComponentValue,
  removeComponent,
  setComponent,
  updateComponent,
} from "@latticexyz/recs";
import { isDefined } from "@latticexyz/common/utils";
import { TableId } from "@latticexyz/common";
import { schemaToDefaults } from "../schemaToDefaults";
import { hexKeyTupleToEntity } from "./hexKeyTupleToEntity";
import { defineInternalComponents } from "./defineInternalComponents";
import { getTableKey } from "./getTableKey";
import { StoreComponentMetadata } from "./common";

// TODO: should we create components here from config rather than passing them in?

export function recsStorage<TConfig extends StoreConfig = StoreConfig>({
  components,
}: {
  components: ReturnType<typeof defineInternalComponents> &
    Record<string, RecsComponent<RecsSchema, StoreComponentMetadata>>;
  config?: TConfig;
}): BlockLogsToStorageOptions<TConfig> {
  // TODO: do we need to store block number?

  const componentsByTableId = Object.fromEntries(
    Object.entries(components).map(([id, component]) => [component.id, component])
  );

  return {
    async registerTables({ tables }) {
      for (const table of tables) {
        // TODO: check if table exists already and skip/warn?
        setComponent(components.TableMetadata, getTableKey(table) as Entity, { table });
      }
    },
    async getTables({ tables }) {
      // TODO: fetch schema from RPC if table not found?
      return tables
        .map((table) => getComponentValue(components.TableMetadata, getTableKey(table) as Entity)?.table)
        .filter(isDefined);
    },
    async storeOperations({ operations }) {
      for (const operation of operations) {
        const table = getComponentValue(
          components.TableMetadata,
          getTableKey({
            address: operation.log.address,
            namespace: operation.namespace,
            name: operation.name,
          }) as Entity
        )?.table;
        if (!table) {
          debug(
            `skipping update for unknown table: ${operation.namespace}:${operation.name} at ${operation.log.address}`
          );
          continue;
        }

        const tableId = new TableId(operation.namespace, operation.name).toString();
        const component = componentsByTableId[operation.log.args.table];
        if (!component) {
          debug(`skipping update for unknown component: ${tableId}. Available components: ${Object.keys(components)}`);
          continue;
        }

        const entity = hexKeyTupleToEntity(operation.log.args.key);

        if (operation.type === "SetRecord") {
          debug("setting component", tableId, entity, operation.value);
          setComponent(component, entity, operation.value as ComponentValue);
        } else if (operation.type === "SetField") {
          debug("updating component", tableId, entity, {
            [operation.fieldName]: operation.fieldValue,
          });
          updateComponent(
            component,
            entity,
            { [operation.fieldName]: operation.fieldValue } as ComponentValue,
            schemaToDefaults(table.valueSchema) as ComponentValue
          );
        } else if (operation.type === "DeleteRecord") {
          debug("deleting component", tableId, entity);
          removeComponent(component, entity);
        }
      }
    },
  } as BlockLogsToStorageOptions<TConfig>;
}
```
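The `storeOperations` handler above dispatches on three operation types: SetRecord replaces a whole record, SetField merges a single field, and DeleteRecord removes it. A dependency-free sketch of the same dispatch using a plain Map in place of RECS components (the types and names here are illustrative, not the package's API):

```typescript
// Minimal stand-in for the operation shapes storeOperations handles.
type Operation =
  | { type: "SetRecord"; entity: string; value: Record<string, unknown> }
  | { type: "SetField"; entity: string; fieldName: string; fieldValue: unknown }
  | { type: "DeleteRecord"; entity: string };

function applyOperations(store: Map<string, Record<string, unknown>>, ops: Operation[]): void {
  for (const op of ops) {
    if (op.type === "SetRecord") {
      store.set(op.entity, op.value); // replace the whole record
    } else if (op.type === "SetField") {
      // merge one field into the existing record (schema defaults elided here)
      store.set(op.entity, { ...store.get(op.entity), [op.fieldName]: op.fieldValue });
    } else if (op.type === "DeleteRecord") {
      store.delete(op.entity);
    }
  }
}

const store = new Map<string, Record<string, unknown>>();
applyOperations(store, [
  { type: "SetRecord", entity: "0x01", value: { x: 1, y: 2 } },
  { type: "SetField", entity: "0x01", fieldName: "y", fieldValue: 5 },
]);
console.log(store.get("0x01")); // { x: 1, y: 5 }
```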
Review comment:

I was gonna move these into RECS but it's a bigger refactor and we have to figure out where to draw lines in terms of type definitions (where should `KeySchema` and `ValueSchema` live and, if not centralized, is it okay to duplicate them in a few packages?). Will follow up and move these around later, but keeping them here for simplicity for now.
Follow-up comment:

After thinking more about this, I think it makes sense to keep these utils in here instead of in `recs`. In `recs` land the entity is just a string, while the encoding and decoding based on key schema is a concept on top of it.