I want to apply two grok patterns to a single log file.
This is my Logstash config file:
input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "syslog" {
    grok {
      match => { "message" => "(#!TOT:)?320:%{NUMBER:q}:%{NUMBER:q1}:%{NUMBER:q2}:%{NUMBER:q3}:%{NUMBER:q4}:%{NUMBER:q5}:%{NUMBER:st2} %{NUMBER:st3} %{NUMBER:st4} %{NUMBER:st5} %{NUMBER:st6} %{NUMBER:st7} %{NUMBER:st8} %{NUMBER:st9} %{NUMBER:st10} %{NUMBER:st11} %{NUMBER:st12} %{NUMBER:st13} %{NUMBER:st14} %{NUMBER:st15} %{NUMBER:st16} %{NUMBER:st17} %{NUMBER:st18} %{NUMBER:st19} %{NUMBER:st20} %{NUMBER:st21} %{NUMBER:st22} %{NUMBER:st23} %{NUMBER:st24} %{NUMBER:st25} %{NUMBER:st26} %{NUMBER:st27} %{NUMBER:st28} %{NUMBER:st29} :" }
    }
  }

  if "_grokparsefailure" in [tags] {
    drop {}
  }
  else {
    mutate {
      remove_field => ["[q]","[q1]","[q2]","[q3]","[q4]","[st1]","[st3]","[st4]","[st5]","[st6]","[st7]","[st8]","[st9]","[st10]","[st11]","[st12]","[st14]","[st15]","[st17]","[st18]","[st19]","[st20]","[st21]","[st22]","[st23]","[st24]","[st25]","[st26]","[st28]","[st29]"]
      convert => {
        "st2" => "integer"
        "st16" => "integer"
        "st13" => "integer"
        "st27" => "integer"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
These are the two types of lines present in my log files:
83:0:53:1:1:0:10:12 1532 129 784 744 118447 30 632134068 343041612 21074326 :
320:0:4:5:0:0:28:18 293 6 83 204 0 0 57 0 0 236 18 293 0 2 36 0 0 36 0 0 0 0 0 36 2 36 0 :
Currently I am only parsing lines starting with 320. What I want to do is apply another grok pattern that matches the lines starting with 83, and then mutate and convert specific fields from those lines accordingly. I already have my pattern ready for the lines starting with 83; I just don't know how to express this syntactically in the config.
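To show what I mean, this is roughly the structure I was imagining: a conditional on the line prefix that routes each event to its own grok and mutate. It is only an untested sketch; the 83 pattern and the field names x, rest320 and rest83 below are placeholders, not my real pattern:

filter {
  if [fields][log_type] == "syslog" {
    # route each line to the right grok based on its prefix
    if [message] =~ /^(#!TOT:)?320:/ {
      grok {
        # my existing 320 pattern would go here, unchanged (shortened stand-in below)
        match => { "message" => "(#!TOT:)?320:%{NUMBER:q}:%{GREEDYDATA:rest320}" }
      }
      mutate {
        # the 320-specific conversions
        convert => { "st2" => "integer" }
      }
    }
    else if [message] =~ /^83:/ {
      grok {
        # stand-in for my 83 pattern
        match => { "message" => "83:%{NUMBER:x}:%{GREEDYDATA:rest83}" }
      }
      mutate {
        # the 83-specific conversions
        convert => { "x" => "integer" }
      }
    }
    else {
      # neither prefix matched, so I would drop the line here
      drop {}
    }
  }

  # still drop lines where the matching grok pattern itself failed
  if "_grokparsefailure" in [tags] {
    drop {}
  }
}

I also read that a single grok can take an array of patterns in match and stops at the first one that matches (break_on_match is true by default), so maybe both patterns could live in one grok block instead; but then I am not sure how I would apply the different mutate/convert steps per line type.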
Any idea how I should do it? Please help!