- kinesis-dms-record-generator
- Dependencies
- Running as bin
- Pattern on filenames
- CLI options
- Running Tests
- Contributing
This is a simple CLI that helps developers simulate the records Amazon DMS sends to a Kinesis stream when running migration tasks.
When working with DMS + Kinesis, it is very difficult to simulate the stream locally, which hurts developer productivity. This tool aims to improve the developer experience and productivity.
Using a tool such as DataGrip, the developer can right-click a table and export its content to a JSON file. This CLI reads that JSON file, automatically generates the Kinesis records, and uses the aws-cli to put them in the stream.
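Under the hood the records are delivered through the aws-cli. As a rough, hypothetical sketch (assuming aws-cli v2 and a record.json file holding one generated record; this is not the tool's exact code), the call looks roughly like:

```sh
# Hypothetical sketch of the kind of aws-cli v2 call the tool issues per record;
# record.json is an assumed file containing one generated Kinesis record
aws --endpoint-url=http://localhost:4566 kinesis put-record \
  --cli-binary-format raw-in-base64-out \
  --stream-name my-stream-name \
  --partition-key 1 \
  --data file://record.json
```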
Before using this tool, you must have the following configured on your machine:
- aws-cli
- LocalStack (with Kinesis enabled)
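For example, a minimal local setup could look like the sketch below (it assumes Docker and the default LocalStack port; the stream name is a placeholder):

```sh
# Start LocalStack with only Kinesis enabled (example setup, adjust as needed)
docker run --rm -d -p 4566:4566 -e SERVICES=kinesis localstack/localstack

# Create the target stream on the LocalStack endpoint
aws --endpoint-url=http://localhost:4566 kinesis create-stream \
  --stream-name my-stream-name --shard-count 1
```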
All files inside the folder must follow this pattern:
loadOrder.schema.table.json
For example:
1.OT.CUSTOMER.json
2.OT.PRODUCTS.json
The higher the load order, the earlier the file is loaded, following the same pattern as the Amazon DMS docs.
For the example files, the order will be:
2.OT.PRODUCTS.json
1.OT.CUSTOMER.json
The JSON file can contain a single record or an array of records, for example:
```json
[
  {
    "ID": 1,
    "NAME": "Joselito Silva"
  }
]
```
This JSON file will generate the following Kinesis record:
```json
{
  "data": {
    "ID": 1,
    "NAME": "Joselito Silva"
  },
  "metadata": {
    "timestamp": "2021-02-21T23:02:36.5680Z",
    "record-type": "data",
    "operation": "load",
    "partition-key-type": "primary-key",
    "schema-name": "OT",
    "table-name": "CUSTOMER"
  }
}
```
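Conceptually, the schema-name and table-name come from the file name and each exported row becomes the data payload. The following TypeScript sketch illustrates that mapping (it is not the package's actual source; the names are illustrative):

```ts
// Illustrative sketch only: how a row from 1.OT.CUSTOMER.json maps to the
// DMS-style record shown above (not the package's actual implementation).
interface DmsKinesisRecord {
  data: Record<string, unknown>;
  metadata: {
    timestamp: string;
    "record-type": "data";
    operation: string;
    "partition-key-type": "primary-key";
    "schema-name": string;
    "table-name": string;
  };
}

function toDmsKinesisRecord(
  row: Record<string, unknown>,
  fileName: string, // follows loadOrder.schema.table.json, e.g. "1.OT.CUSTOMER.json"
  operation = "load"
): DmsKinesisRecord {
  const [, schemaName, tableName] = fileName.split(".");
  return {
    data: row,
    metadata: {
      timestamp: new Date().toISOString(),
      "record-type": "data",
      operation,
      "partition-key-type": "primary-key",
      "schema-name": schemaName,
      "table-name": tableName,
    },
  };
}
```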
| Option | Description | Required | Default |
|---|---|---|---|
| -d, --directory | The folder where the JSON files are located | Yes | |
| -s, --stream-name | The name of the Kinesis stream | Yes | |
| -p, --partition-key | The partition key | No | 1 |
| -e, --localstack-endpoint | The LocalStack endpoint | No | http://localhost:4566 |
| -o, --operation | The operation* you want to simulate | No | LOAD |
| -b, --batch-size | The batch size for batch processing | No | 1 |
- *Valid operations: LOAD, INSERT, UPDATE and DELETE
- *Valid batch sizes: 1 to 500 (AWS limits)
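For example, a hypothetical invocation that sets every option explicitly (the directory and stream name are placeholders):

```sh
npx dms-kinesis-gen \
  -d ./cdc-files \
  -s my-stream-name \
  -p 1 \
  -e http://localhost:4566 \
  -o INSERT \
  -b 100
```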
The CLI will read all the files inside the given folder and load them into the Kinesis stream. By default it loads the records one by one, but you can also pass --batch-size to put multiple records per batch.
```sh
npx dms-kinesis-gen -d C:/Users/my-user/Documents/cdc-files -s my-stream-name
```
or install it globally with
```sh
npm i -g dms-kinesis-gen
```
and then use the command:
```sh
dms-kinesis-gen -d C:/Users/my-user/Documents/cdc-files -s my-stream-name
```
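To double-check that the records actually landed in LocalStack, you can read them back with the aws-cli (a sketch assuming a single-shard stream named my-stream-name):

```sh
# Fetch a shard iterator for the (assumed) single shard, then read the records
SHARD_ITERATOR=$(aws --endpoint-url=http://localhost:4566 kinesis get-shard-iterator \
  --stream-name my-stream-name \
  --shard-id shardId-000000000000 \
  --shard-iterator-type TRIM_HORIZON \
  --query 'ShardIterator' --output text)

aws --endpoint-url=http://localhost:4566 kinesis get-records --shard-iterator "$SHARD_ITERATOR"
```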
```sh
yarn test

yarn lint
# or with fix
yarn lint:fix
```
Check the contributing.md file for more information.