This is my first experience with Elastic. I am trying to set up Kibana and build dashboards for my Checkpoint firewall. The built-in Checkpoint module does not parse the logs from my firewall, and every attempt I have made at creating custom patterns has failed.
Jan 04 13:18:12--6:00 10.1.0.1 Action="drop" inzone="External" outzone="Local" service_id="Any_TCP" src="81.91.190.60" dst="1.2.3.4" proto="6" user="" src_user_name="" src_machine_name="" src_user_dn="" snid="" dst_user_name="" dst_machine_name="" dst_user_dn="" UP_match_table="TABLE_START" ROW_START="0" match_id="7" layer_uuid="9fced3b3-5da9-494d-b7f1-3242694d99f8" layer_name="internal" rule_uid="0000077F-0000-0000-0000-000000000000" rule_name="Incoming/Internal Default Policy" ROW_END="0" UP_match_table="TABLE_END" UP_action_table="TABLE_START" ROW_START="0" action="0" ROW_END="0" UP_action_table="TABLE_END" ProductName="VPN-1 & FireWall-1" svc="1367" sport_svc="57408" ProductFamily=""
The line above is a sample. I don't need half of it; all I want is Action, inzone, outzone, source IP, dst IP, and dst port.
What should I be doing here? Grok pattern or dissect?
This is the output from Elastic for the CheckPoint Filebeat module:
Provided Grok expressions do not match field value: [<85>Jan 04 15:02:58--6:00 10.1.0.1 Action="drop" inzone="External" outzone="Local" service_id="Any_TCP" src="74.120.14.64" dst="1.2.3.4" proto="6" user="" src_user_name="" src_machine_name="" src_user_dn="" snid="" dst_user_name="" dst_machine_name="" dst_user_dn="" UP_match_table="TABLE_START" ROW_START="0" match_id="7" layer_uuid="9fced3b3-5da9-494d-b7f1-3242694d99f8" layer_name="internal" rule_uid="0000077F-0000-0000-0000-000000000000" rule_name="Incoming/Internal Default Policy" ROW_END="0" UP_match_table="TABLE_END" UP_action_table="TABLE_START" ROW_START="0" action="0" ROW_END="0" UP_action_table="TABLE_END" ProductName="VPN-1 & FireWall-1" svc="1387" sport_svc="25082" ProductFamily="" \n]
I tried to be as detailed as possible; please let me know if any more information is needed.
Badger
January 4, 2021, 9:56pm
I think this is a good place to use grok (which, in general, is massively overused).
grok {
    # Apply every pattern in the list, not just the first one that matches
    break_on_match => false
    match => {
        "message" => [
            ' Action="%{WORD:action}"',
            ' inzone="%{WORD:inzone}"',
            ' outzone="%{WORD:outzone}"',
            ' src="%{IPV4:src}"',
            ' dst="%{IPV4:dst}"'
        ]
    }
}
Not sure where to get "dst port" from.
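Judging by the sample, svc and sport_svc look like they could be the destination and source ports; that is only a guess from the values, and dst_port / src_port below are field names I made up. If the guess is right, two more patterns (in their own grok, or appended to the array above) would pick them up:
grok {
    break_on_match => false
    match => {
        "message" => [
            # Assumption: svc is the destination service port and
            # sport_svc is the source port in this log format
            ' svc="%{NUMBER:dst_port}"',
            ' sport_svc="%{NUMBER:src_port}"'
        ]
    }
}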
Thanks. I'll give that a shot.
{
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "Provided Grok expressions do not match field value: [<85>Jan 04 16:28:50--6:00 10.1.0.1 Action="drop" inzone="External" outzone="Local" service_id="Any_TCP" src="89.248.165.71" dst="1.12.3.4" proto="6" user="" src_user_name="" src_machine_name="" src_user_dn="" snid="" dst_user_name="" dst_machine_name="" dst_user_dn="" UP_match_table="TABLE_START" ROW_START="0" match_id="7" layer_uuid="9fced3b3-5da9-494d-b7f1-3242694d99f8" layer_name="internal" rule_uid="0000077F-0000-0000-0000-000000000000" rule_name="Incoming/Internal Default Policy" ROW_END="0" UP_match_table="TABLE_END" UP_action_table="TABLE_START" ROW_START="0" action="0" ROW_END="0" UP_action_table="TABLE_END" ProductName="VPN-1 & FireWall-1" svc="6477" sport_svc="42260" ProductFamily="" \n]"
}
],
"type": "illegal_argument_exception",
"reason": "Provided Grok expressions do not match field value: [<85>Jan 04 16:28:50--6:00 10.1.0.1 Action="drop" inzone="External" outzone="Local" service_id="Any_TCP" src="89.248.165.71" dst="1.2.3.4" proto="6" user="" src_user_name="" src_machine_name="" src_user_dn="" snid="" dst_user_name="" dst_machine_name="" dst_user_dn="" UP_match_table="TABLE_START" ROW_START="0" match_id="7" layer_uuid="9fced3b3-5da9-494d-b7f1-3242694d99f8" layer_name="internal" rule_uid="0000077F-0000-0000-0000-000000000000" rule_name="Incoming/Internal Default Policy" ROW_END="0" UP_match_table="TABLE_END" UP_action_table="TABLE_START" ROW_START="0" action="0" ROW_END="0" UP_action_table="TABLE_END" ProductName="VPN-1 & FireWall-1" svc="6477" sport_svc="42260" ProductFamily="" \n]"
}
This is the output from the filebeat module if that helps. How would I go about doing it the logstash way? Or should I try to fix the filebeat pipeline?
Badger
January 4, 2021, 10:45pm
I'm sorry, since you tagged the question with logstash I assumed you were using logstash, but that error message is from filebeat. You could update the question to make it clear you are doing the processing in filebeat and then move the question to the filebeat forum.
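If you did want to do the processing in Logstash instead, a minimal pipeline might look something like this. It is only a sketch: it assumes Filebeat ships the raw line in the message field over the beats input, and the elasticsearch hosts and index name are placeholders to adjust.
input {
    beats {
        # Assumed port; match whatever Filebeat's logstash output points at
        port => 5044
    }
}
filter {
    grok {
        break_on_match => false
        match => {
            "message" => [
                ' Action="%{WORD:action}"',
                ' inzone="%{WORD:inzone}"',
                ' outzone="%{WORD:outzone}"',
                ' src="%{IPV4:src}"',
                ' dst="%{IPV4:dst}"'
            ]
        }
    }
}
output {
    elasticsearch {
        # Placeholder host and index name
        hosts => ["http://localhost:9200"]
        index => "checkpoint-%{+YYYY.MM.dd}"
    }
}
You would also need to point Filebeat's output at Logstash instead of Elasticsearch for this to take effect.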
%{SYSLOG5424PRI}%{CISCOTIMESTAMP}-%{ISO8601_TIMEZONE} +(?:%{IPORHOST:syslog5424_host}|-)
I have it working up to there. Now I am working on the action filter.
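For what it is worth, one way the pattern might continue from there (just a sketch: it assumes svc is the destination port and sport_svc the source port, and dst_port / src_port are made-up field names):
%{SYSLOG5424PRI}%{CISCOTIMESTAMP}-%{ISO8601_TIMEZONE} +(?:%{IPORHOST:syslog5424_host}|-) Action="%{WORD:action}" inzone="%{WORD:inzone}" outzone="%{WORD:outzone}" service_id="%{DATA:service_id}" src="%{IPV4:src}" dst="%{IPV4:dst}"%{DATA} svc="%{NUMBER:dst_port}" sport_svc="%{NUMBER:src_port}"
The same grok expression should drop into either a Logstash grok filter or the module's ingest pipeline (with the double quotes JSON-escaped in the latter).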
system
February 2, 2021, 12:07am
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.