Use of grok and regular expressions

Hello

I'm sending Asterisk logs from Filebeat to Logstash.

The format of the logs is as follows.

[May 19 14:57:19] NOTICE[8583] chan_sip.c: Registration from '<sip:34541@xx.xx.xx.xx>' failed for '91.214.44.144:2718' - Wrong password

I downloaded a filter from GitHub to be able to separate the Asterisk log fields.

Now my question: from the log_message field, can the IP that is failing the password be extracted into a new field in Elasticsearch?

The IP I wish to keep is 91.214.44.144; I do not know if there is a way to generate a field with this info.

I attach my Logstash filter:

filter {
  if [source] == "security" {
    if [message] =~ /^\[/ {
      grok {
        match => {
          "message" => "\[%{SYSLOGTIMESTAMP:log_timestamp}\] +(?<log_level>(?i)(?:debug|notice|warning|error|verbose|dtmf|fax|security)(?-i))\[%{INT:thread_id}\](?:\[%{DATA:call_thread_id}\])? %{DATA:module_name}:(?: +[=|-]{2})? %{GREEDYDATA:log_message}"
        }
        add_field => {
          "received_timestamp" => "%{@timestamp}"
          "process_name" => "asterisk"
        }
      }
      date { match => [ "log_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ] }
      if ![log_message] {
        mutate {
          add_field => { "log_message" => "" }
        }
      } # End default asterisk log fields
    } # end log lines that begin with '['
  } # end filter for type == asterisk_debug
} # end filter

I am a novice in the use of grok

You do not have to match the entire line

grok { match => { "message" => "%{IPV4:ip}" } }

will pull an IP address out of the message if one is in there.

You should read "Do you grok Grok?", and then anchor your pattern.
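For example, an anchored sketch for the sample line above (the field names src_ip and src_port are my own choice, and the pattern assumes the "failed for '...'" wording shown in your log):

grok {
  match => {
    "message" => "^\[%{SYSLOGTIMESTAMP:log_timestamp}\].* failed for '%{IPV4:src_ip}:%{INT:src_port}'"
  }
}

Anchoring with ^ lets grok reject non-matching lines quickly instead of scanning the whole message for a possible match.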

Good afternoon,

Thanks for your answer, although I have a question. Suppose this is my message:

Registration from '<sip:test%20hdjd@45.79.216.200;transport=UDP>' failed for '191.95.48.173:31031' - Wrong password

In this message there are two IPs. If I use %{IP:ip} it would match 45.79.216.200. How could I put the two IPs into different fields, based on the message described above?

You could use

grok { match => { "message" => "%{IPV4:ip-1}%{DATA}%{IPV4:ip-2}" } }
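Against the sample message above, that match would produce two fields (names as in the pattern; underscores such as ip_1 also work if hyphenated field names are awkward downstream):

ip-1 => 45.79.216.200
ip-2 => 191.95.48.173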

Thank you very much, it worked perfectly :+1::+1:

Good afternoon

Now I am trying to separate, by commas, a log like the following:

SecurityEvent="InvalidPassword",EventTV="2019-06-05T14:41:13.146-0500",Severity="Error",Service="SIP",EventVersion="2",AccountID="01148585359002",SessionID="0x7f4d04085258",LocalAddress="IPV4/UDP/45.79.216.200/5090",RemoteAddress="IPV4/UDP/102.165.37.226/58906",Challenge="45241c33",ReceivedChallenge="45241c33",ReceivedHash="25eb0d44b9170f358fdab302d6a48e6f"

and I use the following match:

%{GREEDYDATA:SecurityEvent},%{GREEDYDATA:EventTV},%{GREEDYDATA:Severity},%{GREEDYDATA:Service},%{GREEDYDATA:EventVersion},%{GREEDYDATA:AccountID},%{GREEDYDATA:SessionID},%{GREEDYDATA:dummy1},%{GREEDYDATA:dummy2},%{GREEDYDATA:Challenge},%{GREEDYDATA:ReceivedChallenge},%{GREEDYDATA:ReceivedHash}

When I send it to Elasticsearch, the problem is that each field looks like this:

SecurityEvent="InvalidPassword"

I do not know if it is possible to remove the SecurityEvent=" prefix and the closing quote, so that only InvalidPassword is left.

I have done several searches and have not found an expression that, for each separated field, keeps only the value inside the quotes.

I would use

kv { field_split => "," }
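In context, that could look like the following sketch (the kv filter reads the message field by default, splits pairs on the comma, splits key from value on =, and strips the wrapping double quotes from each value):

kv {
  source => "message"
  field_split => ","
}

This should produce fields such as SecurityEvent => InvalidPassword and Severity => Error directly, with no grok pattern needed for this key=value format.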

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.