Load pino logs into Elasticsearch.
To install pino-elasticsearch globally:

```
npm install pino-elasticsearch -g
```

To send pino logs to Elasticsearch:
```
cat log | pino-elasticsearch --host 192.168.1.42
```
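You can also pipe a running Node process straight into the CLI instead of reading from a file. In this sketch, `app.js` is a placeholder for any script that logs with pino to stdout:

```
# app.js is a hypothetical script using pino; any pino-formatted stdout works
node app.js | pino-elasticsearch --host 192.168.1.42
```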
If using AWS Elasticsearch:
```
cat log | pino-elasticsearch --host https://your-url.us-east-1.es.amazonaws.com --port 443 -c ./aws_config.json
```
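The exact contents of `aws_config.json` depend on your setup; assuming the standard AWS SDK for Node.js JSON credentials format, a minimal sketch (all values are placeholders) looks like:

```
{
  "accessKeyId": "YOUR_ACCESS_KEY_ID",
  "secretAccessKey": "YOUR_SECRET_ACCESS_KEY",
  "region": "us-east-1"
}
```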
Flags:

```
-h  | --help              display help
-v  | --version           display version
-H  | --host              the IP address of Elasticsearch; default: 127.0.0.1
-p  | --port              the port of Elasticsearch; default: 9200
-i  | --index             the name of the index to use; default: pino
-t  | --type              the name of the type to use; default: log
-b  | --size              the number of documents for each bulk insert
-l  | --trace-level       trace level for the Elasticsearch client; default: error (also: info, debug, trace)
-c  | --aws-credentials   path to aws_config.json (if using AWS Elasticsearch)
```
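Flags can be combined. For example, this hypothetical invocation writes to a custom index (`my-logs` is an arbitrary name) and batches 500 documents per bulk insert:

```
cat log | pino-elasticsearch --host 192.168.1.42 --index my-logs --size 500
```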
You can then use Kibana to browse and visualize your logs.
Setting up pino-elasticsearch is easy, and you can use the bundled `docker-compose.yml` file to bring up both Elasticsearch and Kibana. You will need `docker` and `docker-compose`; then, in this project folder, launch `docker-compose up`.
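If you are curious what such a compose file typically contains, here is a minimal sketch of a single-node Elasticsearch plus Kibana setup; the bundled `docker-compose.yml` is the source of truth, and the image versions below are assumptions:

```
version: '3'
services:
  elasticsearch:
    # version is an assumption; match it to the bundled file
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.23
    environment:
      - discovery.type=single-node   # single-node cluster for local testing
    ports:
      - '9200:9200'
  kibana:
    image: docker.elastic.co/kibana/kibana:6.8.23
    ports:
      - '5601:5601'
    depends_on:
      - elasticsearch
```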
You can test it by launching `node example | pino-elasticsearch` in this project folder. You will need to have pino-elasticsearch installed globally.
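Once some logs have been sent, you can check that documents landed in the default `pino` index with a quick Elasticsearch query (adjust host, port, or index if you changed the defaults):

```
curl 'http://localhost:9200/pino/_search?pretty'
```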
This project was kindly sponsored by nearForm.
Licensed under MIT.