
Feature Request: Copy Processor #7987

Closed

anandsinghkunwar opened this issue Aug 16, 2018 · 5 comments

Comments

@anandsinghkunwar

anandsinghkunwar commented Aug 16, 2018

I found the rename processor for filebeat, but I couldn't find anything related to a copy-field processor. My use case is that I want to copy some fields produced by the kubernetes processor to a root field while keeping the original fields intact. I have a similar use case with journalbeat.
I tried using the fields and fields_under_root options, but instead of resolving the field reference they use the reference string itself. For example,

fields:
  test: "%{[kubernetes][namespace]}"

creates a field test whose value is the literal string %{[kubernetes][namespace]} instead of the value of the actual namespace. Requesting this feature. Also, would you be willing to accept a PR for it?
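For completeness, the full configuration I tried looked roughly like this (sketch):

fields:
  test: "%{[kubernetes][namespace]}"
fields_under_root: true

Even with fields_under_root: true, the top-level test field ends up holding the literal reference string rather than the namespace value.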

Discussion link here

@ruflin
Member

ruflin commented Aug 17, 2018

I wonder if the upcoming field alias feature (elastic/elasticsearch#32172) could solve your problem without having to duplicate the data. Another approach would be using copy_to in the template: https://www.elastic.co/guide/en/elasticsearch/reference/current/copy-to.html

I'm not against a copy processor, but I would like to understand where the other two proposals fall short for your use case.
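For reference, a copy_to mapping in the index template could look roughly like this (a sketch against a 6.x-style template; log_routing_key is a hypothetical target field):

PUT _template/filebeat-copy-demo
{
  "index_patterns": ["filebeat-*"],
  "mappings": {
    "doc": {
      "properties": {
        "kubernetes": {
          "properties": {
            "namespace": { "type": "keyword", "copy_to": "log_routing_key" }
          }
        },
        "log_routing_key": { "type": "keyword" }
      }
    }
  }
}

Note that copy_to copies values at index time inside Elasticsearch, so the copied field is searchable but is not added to the event's _source.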

@anandsinghkunwar
Author

anandsinghkunwar commented Aug 17, 2018

The use case this doesn't cover is when I have a Kafka bus sitting between filebeat and ES. Logstash consumes from this Kafka bus and pushes the data to ES. I need a common field on all my logs, derived from other fields; Logstash reads this common field and decides what to do with the log: which index to push it to, whether to ignore it, and so on. I want this field to be created by the producer, i.e. filebeat and journalbeat in my case, instead of on the Logstash side. I also want my Kafka topic name to be derived from this common field, hence the need for a copy/derived-field processor. My feature request asks for a copy processor rather than a derived-field processor (baby steps), as I felt it would be easier to implement since there is already a rename processor.
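To illustrate the producer-side routing, the Kafka output I have in mind looks roughly like this (sketch; log_class is a hypothetical common field added under fields):

output.kafka:
  hosts: ["kafka:9092"]
  # The topic is derived from a field on the event, so the field has to exist
  # before the event leaves filebeat/journalbeat.
  topic: "%{[fields.log_class]}"

This only works if the common field is already present on the event when it reaches the output, which is why I want it created in the producer rather than in Logstash.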

@webmat
Contributor

webmat commented Aug 17, 2018

One of my initial attempts at building the Suricata module with the ECS schema assumed there was a copy processor in Beats. I agree that it could be useful in some scenarios; sending to a different transport is another good example.

@urso

urso commented Aug 24, 2018

@anandsinghkunwar +1 for copy processor. A PR would be very welcome. Make sure you have signed the CLAs when contributing.

@kvch
Contributor

kvch commented Apr 11, 2019

Added in: #11303
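For anyone landing here later: assuming the processor added there is copy_fields, a minimal configuration for the original use case would look roughly like this (sketch):

processors:
  - copy_fields:
      fields:
        - from: kubernetes.namespace
          to: namespace
      fail_on_error: false
      ignore_missing: true

The original kubernetes.namespace field stays intact; its value is copied to the top-level namespace field.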

@kvch kvch closed this as completed Apr 11, 2019