Can anyone please help me parse this firewall log?

date stamp timestamp (data) private IP [public IP] -> public IP (client side) (protocol)

I am not able to identify the type of log, and I am new to the field, so kindly help me parse data from this kind of log.

Can you please paste one line so we can fit the grok pattern to it?

In the meantime, try to use this.

Yes, you can use this as a sample log:
2000-06-30 01:44:00: {exwire-NAT-Rules} PSERVE_SESSION_OPEN: application:none, lsi.32 101.22.954.358.84355 [115.109.546.105.62589] -> 66.451.684.74:99 (TCP)

I would start with

grok { match => { "message" => "%{DATA:[@metadata][startOfLine]} %{IPV4:ip1}.%{INT:port1} \[%{IPV4:ip2}.%{INT:port2}\] -> %{IPV4:ip3}:%{INT:port3} \(%{WORD:protocol}\)$" } }
dissect { mapping => { "[@metadata][startOfLine]" => "%{[@metadata][ts]} %{+[@metadata][ts]}: %{data}" } }
date { match => [ "[@metadata][ts]", "YYYY-MM-dd HH:mm:ss" ] }
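
Here the grok captures everything up to the first IP address into [@metadata][startOfLine], the dissect then splits that into the two timestamp tokens and the remaining text, and the date filter parses the timestamp. For your sample line the resulting fields would look roughly like this (a sketch based on the sample above):

    [@metadata][ts] => "2000-06-30 01:44:00"
    data            => "{exwire-NAT-Rules} PSERVE_SESSION_OPEN: application:none, lsi.32"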

The filter you sent above isn't working correctly, and I am not able to find the problem. Can you please explain what the possible reason for the unexpected result could be?
The error that comes out is:
Provided Grok patterns do not match data in the input.

Have you tested it against an actual log message rather than the example you gave? The IP addresses in the example are not valid, so IPV4 does not match them.
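
As a general way to see which lines fail (just a sketch, reusing only the grok from above): grok tags events it cannot match with _grokparsefailure, so you can route those events to stdout and inspect them, for example

input { stdin {} }
filter {
    grok { match => { "message" => "%{DATA:[@metadata][startOfLine]} %{IPV4:ip1}.%{INT:port1} \[%{IPV4:ip2}.%{INT:port2}\] -> %{IPV4:ip3}:%{INT:port3} \(%{WORD:protocol}\)$" } }
}
output {
    # events that the pattern could not parse carry the _grokparsefailure tag
    if "_grokparsefailure" in [tags] {
        stdout { codec => rubydebug }
    }
}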

Yes, I have tried them on actual logs, but they don't work.

They work for me if I change your IP addresses and port numbers to be valid:

input { generator { count => 1 lines => [
'2000-06-30 01:44:00: {exwire-NAT-Rules} PSERVE_SESSION_OPEN: application:none, lsi.32 101.22.54.58.4355 [115.109.46.105.62589] -> 66.51.84.74:99 (TCP)',
'2000-06-30 01:44:00: {exwire-NAT-Rules} PSERVE_SESSION_OPEN: application:none, lsi.32 101.22.95.35.8435 [115.109.56.105.6589] -> 66.41.64.74:99 (TCP)' ] } }

filter {
    grok { match => { "message" => "%{DATA:[@metadata][startOfLine]} %{IPV4:ip1}.%{INT:port1} \[%{IPV4:ip2}.%{INT:port2}\] -> %{IPV4:ip3}:%{INT:port3} \(%{WORD:protocol}\)$" } }
    dissect { mapping => { "[@metadata][startOfLine]" => "%{[@metadata][ts]} %{+[@metadata][ts]}: %{data}" } }
    date { match => [ "[@metadata][ts]", "YYYY-MM-dd HH:mm:ss" ] }
}
output { stdout { codec => rubydebug { metadata => false } } }

gets me

{
    "ip1" => "101.22.95.35",
    "ip2" => "115.109.56.105",
    "port2" => "6589",
    "data" => "{exwire-NAT-Rules} PSERVE_SESSION_OPEN: application:none, lsi.32",
    "protocol" => "TCP",

etc.
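
If you want to reproduce this test locally, save the config to a file and run it directly (assuming it is saved as test.conf, a placeholder name, and you are in the Logstash install directory):

    bin/logstash -f test.conf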

Thank you, it is working now. But I actually want to use these logs in Kibana, and I am unable to connect to Elasticsearch.
Take a look at the configuration file I made for this (I just added some output settings to your file):

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        user => "5d5a"
        password => "33f7"
        index => "syslog-%{+YYYY.MM.dd}"
        document_type => "custom_logs"
    }
}

(And as I am doing this for a small log file, I am feeding the log file in manually rather than using Beats.)
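
For reference, if you want Logstash to read the log file itself, a minimal file input could look like this (the path is a placeholder; sincedb_path => "/dev/null" just makes Logstash re-read the whole file on every run, which is convenient for testing):

input {
    file {
        path => "/path/to/firewall.log"    # placeholder - point this at your actual log file
        start_position => "beginning"      # read existing lines, not only new ones
        sincedb_path => "/dev/null"        # do not remember the read position between runs (testing only)
    }
}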

"document_type" was deprecated from Elasticsearch 6.0+. Remove it and see If that works.
