[Ingest Processors] [JSON] Add max_depth option to JSON processor #69134
Labels
- `:Data Management/Ingest Node` (Execution or management of Ingest Pipelines including GeoIP)
- `>enhancement`
- `Team:Data Management` (Meta label for data/management team)
In a setup where multiple applications go through an ingest pipeline and produce many unique nested fields several levels deep, it would be helpful to have an option that limits how deep the JSON processor parses. This would be similar to the `max_depth` option of the Filebeat `decode_json_fields` processor.

For example, a log line that had already been split by a dissect processor could be sent through a JSON processor configured with a depth limit. This would produce a JSON document to be indexed in which the nested "order" details are left as an unparsed string rather than expanded into fields.
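A hypothetical processor definition might look like the following. Note that `max_depth` does not exist on the JSON processor today; the name simply mirrors the Filebeat option, and `message`/`parsed` are illustrative field names:

```json
{
  "json": {
    "field": "message",
    "target_field": "parsed",
    "max_depth": 1
  }
}
```

With `"max_depth": 1`, only the top level of `message` would be expanded into fields, and any nested object such as "order" would remain a single unparsed string value.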
There is an existing option to limit mapping depth via the `index.mapping.depth.limit` index setting, but exceeding that limit rejects the document from being indexed entirely rather than indexing it with the deeper levels unparsed. This is similar to the logstash-plugins/logstash-filter-json#43 request for Logstash.
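To make the difference concrete, the requested behavior (parse only down to a given depth and keep anything deeper as a string, instead of rejecting the document) could be sketched in Python. `parse_with_max_depth` is a hypothetical helper for illustration, not part of Elasticsearch or Filebeat:

```python
import json

def parse_with_max_depth(raw: str, max_depth: int):
    """Parse a JSON string, but leave any object nested deeper than
    max_depth as an unparsed JSON string (illustrative semantics only)."""
    def limit(value, depth):
        if isinstance(value, dict):
            if depth >= max_depth:
                # Too deep: keep the whole subtree as a raw JSON string.
                return json.dumps(value)
            return {k: limit(v, depth + 1) for k, v in value.items()}
        return value
    return limit(json.loads(raw), 0)

doc = '{"status": "ok", "order": {"id": 7, "items": {"sku": "a1"}}}'
flat = parse_with_max_depth(doc, 1)
# "status" and "order" become fields; the contents of "order" stay a string
```

Unlike `index.mapping.depth.limit`, this never fails: the document is always indexed, just with the deep structure kept opaque.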