Filebeat S3 input support for AWS WAF logs (support application/octet-stream) #25296
Pinging @elastic/integrations (Team:Integrations)
Can you provide more details? Are you saying the WAF logs aren't plain-text files? Can you provide sample logs? Looking at this, https://www.wafcharm.com/en/blog/aws-waf-full-log-s3-output/, the S3 bucket should contain JSON logs?
AWS WAF logs are in JSON format, but when Kinesis Data Firehose puts them in the S3 bucket it sets the object's "Content-Type" metadata to "application/octet-stream" instead of "application/json". So when the Filebeat S3 input reads that data, it sees the content type as "application/octet-stream" and is unable to expand the JSON so the fields are read properly into Elasticsearch. The whole JSON log ends up inside the "message" field.
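One possible workaround (not mentioned in this thread, so treat it as an assumption) is to decode the JSON out of the `message` field with Filebeat's `decode_json_fields` processor, since the whole log line lands there as a string; a minimal sketch:

```yaml
# filebeat.yml (fragment) - decode WAF JSON stuck in the "message" field
processors:
  - decode_json_fields:
      fields: ["message"]   # the field holding the raw JSON string
      target: ""            # decode into the event root
      process_array: false
      max_depth: 2
```

This sidesteps the Content-Type metadata entirely, at the cost of running a processor on every event.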
I think we can close this given we have a solution for the Content-Type. There's a separate request in #28121 to add AWS WAF to the Filebeat AWS module. |
Fixed by #25772 |
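Based on the fix referenced above, the aws-s3 input can be told which content type to assume, overriding the object metadata written by Firehose; a minimal sketch, where the queue URL is a placeholder and the exact option name should be checked against the linked PR:

```yaml
# filebeat.yml (fragment) - aws-s3 input with an explicit content type
filebeat.inputs:
  - type: aws-s3
    # placeholder SQS queue receiving the bucket's event notifications
    queue_url: https://sqs.us-east-1.amazonaws.com/123456789012/waf-logs-queue
    # treat objects as JSON even though Firehose stored application/octet-stream
    content_type: application/json
```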
Thank you! That is really helpful. I will try it and post my observations.
As of now, the Filebeat S3 input doesn't support AWS WAF logs. AWS WAF logs are delivered to S3 via Kinesis Data Firehose, which sets the object's "Content-Type" metadata to "application/octet-stream". Because of this, the logs don't get expanded into fields in Elasticsearch.
If that support can be added, it will resolve a big problem; many users are looking forward to getting AWS WAF logs into the ELK stack for analysis.
Thanks in advance.