fix checkpoint @timestamp already exists #20567
Conversation
Since this is a community submitted pull request, a Jenkins build has not been kicked off automatically. Can an Elastic organization member please verify the contents of this patch and then kick off a build manually?
❕ Build Aborted
@Bernhard-Fluehmann Could you resolve the conflicting changelog? It should be doable through the GitHub web UI in the worst case. Also, please add a more descriptive entry in the changelog, like:
Pinging @elastic/siem (Team:SIEM)
@P1llus Done
Jenkins test this please
LGTM, though it needs one more review
@Bernhard-Fluehmann Please do not remove the timestamp rename. The script compilation is an issue on multiple modules on older versions and won't be a problem after 7.9-7.10; the rename processor itself is not the cause of the issue there.
@P1llus Done
LGTM
jenkins run tests
Hi @marc-gr, can you please let me know if it is OK now, or what other steps it takes to get this done?
It is in the right place, but it seems to be conflicting with master. Please merge/rebase your branch and it should be good to go after 👍
jenkins run tests
@Bernhard-Fluehmann could you fix the merge conflict so we can get this in?
Will fix it again
jenkins run tests
Force-pushed 2dc211b to a03787f
@marc-gr Could you let me know why the builds break? I suppose the test suite fails on the checkpoint tests. Unfortunately I cannot run the test suite locally, since it is somehow broken. If you give me a hint on how to test locally, I could sort the problem out.
Hello @Bernhard-Fluehmann! I pulled your branch to see what was wrong and found a typo in the pipeline, plus a few tips to improve the pipeline changes, which I added in the review. Also, a merge or a rebase is required (mind that the CHANGELOG entry goes at the end, since merges usually end up messing with them).
Regarding how to add the new test case: it seems the test suite only checks the first 100 events in the example log file. For now you can add your log in a separate file as I suggest, since changing the tests to check all events has some implications that we might need to deal with in a new issue/PR if we decide to change it. Once you have your new file created and your log line is in it, you can go into the x-pack/filebeat folder and run TESTING_FILEBEAT_MODULES=checkpoint TESTING_FILEBEAT_FILESETS=firewall GENERATE=true MODULES_PATH=module mage -v pythonIntegTest. This will regenerate the expected test files with your new pipeline, and then you can commit the changes if they are okay. To run the module tests locally, just omit GENERATE=true.
Please let us know if you need something else, or if you want us to take over from this point and do the required commits to your branch. Thanks a lot for taking the time to do this PR! :)
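A sketch of those two invocations as they would be run from a Beats checkout (assuming mage and the module test dependencies are already set up locally):

```sh
cd x-pack/filebeat

# Regenerate the expected output files for the checkpoint/firewall fileset
TESTING_FILEBEAT_MODULES=checkpoint TESTING_FILEBEAT_FILESETS=firewall \
  GENERATE=true MODULES_PATH=module mage -v pythonIntegTest

# Run the same module tests without regenerating: simply omit GENERATE=true
TESTING_FILEBEAT_MODULES=checkpoint TESTING_FILEBEAT_FILESETS=firewall \
  MODULES_PATH=module mage -v pythonIntegTest
```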
- rename:
    field": "@timestamp"
    target_field: "event.created"
    if: "ctx.@timestamp != null"
if: "ctx.@timestamp != null" | |
ignore_missing: true |
if: "ctx.@timestamp != null" | ||
- date: | ||
field: "syslog5424_ts" | ||
formats: ["ISO8601"] |
formats: ["ISO8601"] | |
formats: ["ISO8601", "UNIX"] |
not sure if this can come as unix at some point also
    value: '{{syslog5424_ts}}'
    if: ctx.checkpoint?.time == null
- rename:
    field": "@timestamp"
field": "@timestamp" | |
field: "@timestamp" |
- date:
    field: "checkpoint.time"
    formats: ["ISO8601", "UNIX"]
    if: "ctx.checkpoint?.time != null"
- rename:
I would delete this rename block, since the previous date one will take care of this.
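To spell out why the rename becomes unnecessary, here is a sketch with target_field written out explicitly, even though @timestamp is already the date processor's default target:

```yaml
# Parses checkpoint.time (ISO8601 or UNIX epoch) and writes the result
# straight into @timestamp, so nothing needs to be renamed afterwards.
- date:
    field: "checkpoint.time"
    target_field: "@timestamp"
    formats: ["ISO8601", "UNIX"]
    if: "ctx.checkpoint?.time != null"
```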
@@ -10037,3 +10037,4 @@
<134>1 2020-03-30T07:19:22Z gw-da58d3 CheckPoint 8363 - [action:"Accept"; flags:"411908"; ifdir:"inbound"; ifname:"eth0"; logid:"0"; loguid:"{0x5e819d7a,0x0,0x353707c7,0xee78a1dc}"; origin:"192.168.1.100"; originsicname:"cn=cp_mgmt,o=gw-da58d3..tmn8s8"; sequencenum:"1"; version:"5"; __policy_id_tag:"product=VPN-1 & FireWall-1[db_tag={880771B0-FD92-2C4F-82FC-B96FC3DE5A07};mgmt=gw-da58d3;date=1585502566;policy_name=Standard\]"; dst:"192.168.1.255"; inzone:"External"; layer_name:"Network"; layer_uuid:"63b7fe60-76d2-4287-bca5-21af87337b0a"; match_id:"1"; parent_rule:"0"; rule_action:"Accept"; rule_uid:"1fde807b-6300-4b1a-914f-f1c1f3e2e7d2"; outzone:"Local"; product:"VPN-1 & FireWall-1"; proto:"17"; s_port:"50024"; service:"137"; service_id:"nbname"; src:"192.168.1.196"]
<134>1 2020-03-30T07:20:33Z gw-da58d3 CheckPoint 8363 - [action:"Accept"; flags:"411908"; ifdir:"inbound"; ifname:"eth0"; logid:"0"; loguid:"{0x5e819dc1,0x0,0x353707c7,0xee78a1dc}"; origin:"192.168.1.100"; originsicname:"cn=cp_mgmt,o=gw-da58d3..tmn8s8"; sequencenum:"1"; version:"5"; __policy_id_tag:"product=VPN-1 & FireWall-1[db_tag={880771B0-FD92-2C4F-82FC-B96FC3DE5A07};mgmt=gw-da58d3;date=1585502566;policy_name=Standard\]"; dst:"192.168.1.100"; inzone:"External"; layer_name:"Network"; layer_uuid:"63b7fe60-76d2-4287-bca5-21af87337b0a"; match_id:"1"; parent_rule:"0"; rule_action:"Accept"; rule_uid:"1fde807b-6300-4b1a-914f-f1c1f3e2e7d2"; outzone:"Local"; product:"VPN-1 & FireWall-1"; proto:"6"; s_port:"60226"; service:"22"; service_id:"ssh"; src:"192.168.1.205"]
<134>1 2020-03-30T07:20:35Z gw-da58d3 CheckPoint 8363 - [action:"Accept"; flags:"444676"; ifdir:"outbound"; ifname:"eth0"; logid:"0"; loguid:"{0x5e819dc3,0x0,0x353707c7,0xee78a1dc}"; origin:"192.168.1.100"; originsicname:"cn=cp_mgmt,o=gw-da58d3..tmn8s8"; sequencenum:"1"; version:"5"; __policy_id_tag:"product=VPN-1 & FireWall-1[db_tag={880771B0-FD92-2C4F-82FC-B96FC3DE5A07};mgmt=gw-da58d3;date=1585502566;policy_name=Standard\]"; dst:"192.168.1.153"; inzone:"Local"; layer_name:"Network"; layer_uuid:"63b7fe60-76d2-4287-bca5-21af87337b0a"; match_id:"1"; parent_rule:"0"; rule_action:"Accept"; rule_uid:"1fde807b-6300-4b1a-914f-f1c1f3e2e7d2"; outzone:"External"; product:"VPN-1 & FireWall-1"; proto:"17"; s_port:"43103"; service:"514"; service_id:"syslog"; src:"192.168.1.100"]
+<134>1 2020-03-30T07:20:35Z gw-da58d3 CheckPoint 7776 - [action:"Accept"; flags:"444676"; ifdir:"outbound"; ifname:"eth0"; logid:"0"; loguid:"{0x5e819dc3,0x0,0x353707c7,0xee78a1dc}"; origin:"192.168.1.100"; originsicname:"cn=cp_mgmt,o=gw-da58d3..tmn8s8"; sequencenum:"1"; time:"1594646954"; version:"5"; __policy_id_tag:"product=VPN-1 & FireWall-1[db_tag={880771B0-FD92-2C4F-82FC-B96FC3DE5A07};mgmt=gw-da58d3;date=1585502566;policy_name=Standard\]"; dst:"192.168.1.153"; inzone:"Local"; layer_name:"Network"; layer_uuid:"63b7fe60-76d2-4287-bca5-21af87337b0a"; match_id:"1"; parent_rule:"0"; rule_action:"Accept"; rule_uid:"1fde807b-6300-4b1a-914f-f1c1f3e2e7d2"; outzone:"External"; product:"VPN-1 & FireWall-1"; proto:"17"; s_port:"43103"; service:"514"; service_id:"syslog"; src:"192.168.1.100"]
Tests only check the first 100 events; to get this tested, you need to put this log line in a separate file, e.g. checkpoint_with_time.log
Hi Marc,
Thank you for your answer and helpful information.
I am on holiday this week, so if you get a chance to take over I would appreciate it.
If not I will proceed next week.
To make sure I got your comment on the date processor right: as far as I understand you, the date processor is safe against multiple uses and updates existing fields instead of throwing errors, i.e. the same behavior as the set processor. Is this correct?
Regards
Bernhard
Sure, I will push the changes myself, but I need you to give me push permission to the PR 👍
In this case, we were using a … Once again, thanks a lot for the contribution and your time ❤️
LGTM
LGTM
@Bernhard-Fluehmann hope your holidays were nice! Do you still want me to do the changes or are you going to? Thanks!
Force-pushed e42fd14 to b4c7a93
Hi Marc, I have tried to proceed with my work and managed to screw up the branch due to updating. I am very sorry about that. Do you have an idea how to fix this and proceed? Regards
💔 Build Failed
No worries, I will open a new PR from yours so you do not lose authorship.
What does this PR do?
Resolves the conflicting setting of the @timestamp field.
Why is it important?
Some Checkpoint logs contain a time field, and the pipeline contained a processor for it. As far as I can see, the current implementation causes two errors. First, the presence of a time field breaks the pipeline, since the logs received from Filebeat already contain a timestamp. Second, the format of the time field is UNIX time and thus needs to be converted into an ISO timestamp first.
The reason the problem was not detected by the test suite seems to be that none of the logs in the checkpoint.log file contains the time field. I have now added such a log line at the end.
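As a rough sketch, the resulting approach (based on the pipeline diff in the review above, with the reviewer's suggestions applied; not necessarily the exact committed version) looks like this:

```yaml
# Move the timestamp Filebeat assigned at read time out of the way first,
# so it is preserved in event.created instead of colliding later on.
- rename:
    field: "@timestamp"
    target_field: "event.created"
    ignore_missing: true
# Default to the syslog header timestamp...
- date:
    field: "syslog5424_ts"
    formats: ["ISO8601", "UNIX"]
# ...and, later in the pipeline, prefer the Check Point time field
# (a UNIX epoch such as time:"1594646954") whenever the line carries one.
- date:
    field: "checkpoint.time"
    formats: ["ISO8601", "UNIX"]
    if: "ctx.checkpoint?.time != null"
```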
Checklist
- [ ] I have made corresponding changes to the documentation
- [ ] I have made corresponding changes to the default configuration files
- [ ] I have added tests that prove my fix is effective or that my feature works
- [ ] I have added an entry in CHANGELOG.next.asciidoc or CHANGELOG-developer.next.asciidoc
Related issues