_grokparsefailure


(Maher Glenza) #1

input {
  file {
    path => "C:\elk\router"
    type => "logs"
    start_position => "beginning"
    sincedb_path => "C:\elkstack\ELK\logstash-5.1.2\data\plugins\inputs\file.sincedb_5eed3ff4207ce42c69ff2b34b669aa79"
  }
}

filter {
  mutate {
    gsub => ["message", "|", " "]
  }

  grok {
    match => ['message', ' =%{DATESTAMP:time} : %{UUID:id} %{NUMBER:dateconsommation} %{NUMBER:datefintrait} %{NUMBER:delai} %{WORD:nomFlux} %{WORD:evt} %{GREEDYDATA:lst} %{NUMBER:reforigin} %{NOTSPACE:contractoidval} %{DATA:useroidval} %{NOTSPACE:servname} ']
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "router-%{+YYYY.MM.dd}"
    template => "C:\elkstack\ELK\elasticsearch-5.1.2\config\router_template.json"
    template_name => "router_template"
  }
}

Is there anything wrong with this configuration? Please help!


(Magnus Bäck) #2

How can we answer the question without knowing what input Logstash is getting?


(Maher Glenza) #3

Ha, sorry! Here is a line of the input:
2017-01-18 16:02:24,166 : 1e045e2f-a06b-40c9-954e-cc26b0ead93a|20170118160224|20170118160224|84|CACC|CONTRACT_CREATION|[912296384,aur][912296385,sui]|44|0x7125838BA5F500010001D64F||inpmsrtr1n


(Magnus Bäck) #4

There are multiple problems with this grok expression.

  • The very beginning doesn't match at all (there's no equal sign in the log).
  • DATESTAMP doesn't match yyyy-mm-dd dates like yours.
  • Your log contains multiple | characters, none of which are included in your expression.
  • It has a trailing space that you probably shouldn't assume is in the input data.

Start over with the simplest possible expression (e.g. ^%{TIMESTAMP_ISO8601:timestamp}). Make sure that works. Then add more and more to your expression, verifying each time that it continues to work.
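That incremental approach might look like this (a sketch; the field names are illustrative, and note that a literal | must be escaped as \| in a grok pattern because | is the regex alternation operator):

```
filter {
  grok {
    # Step 1: start minimal and verify it matches.
    match => ["message", "^%{TIMESTAMP_ISO8601:timestamp}"]
    # Step 2 (only after step 1 works): extend one piece at a time, e.g.
    # match => ["message", "^%{TIMESTAMP_ISO8601:timestamp} : %{UUID:id}\|%{NUMBER:dateconsommation}"]
  }
}
```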


(Maher Glenza) #5

Excuse me, but this:
mutate {
  gsub => ["message", "|", " "]
}

will replace the | characters with spaces.


(Magnus Bäck) #6

Oh, right. Not sure why you're doing that.


(Maher Glenza) #7

Because grok isn't able to handle a message containing |. I'm testing with http://grokdebug.herokuapp.com/.


(Maher Glenza) #8

I'd like to point out that I have now tested the Grok Debugger with this input:
2017-01-18 16:02:29,235 : f047e31d-0a4d-4839-9e7d-0c0989d94083 20170118160229 20170118160229 64 CUSE USER_CREATION [912304704,aur][912304705,clp][912304706,erb][912304707,me1] 44 0x7125838BA5F600010001D71F 0x7154C949C91484751577231E inpmsrtr1n

and this pattern:
%{TIMESTAMP_ISO8601} : %{UUID:id} %{NUMBER:dateconsommation} %{NUMBER:datefintrait} %{NUMBER:delai} %{WORD:nomFlux} %{WORD:evt} %{GREEDYDATA:lst} %{NUMBER:reforigin} %{NOTSPACE:contractoidval} %{DATA:useroidval} %{NOTSPACE:servname}

and this is the result:

{
  "TIMESTAMP_ISO8601": [["2017-01-18 16:02:29,235"]],
  "YEAR": [["2017"]],
  "MONTHNUM": [["01"]],
  "MONTHDAY": [["18"]],
  "HOUR": [["16", null]],
  "MINUTE": [["02", null]],
  "SECOND": [["29,235"]],
  "ISO8601_TIMEZONE": [[null]],
  "id": [["f047e31d-0a4d-4839-9e7d-0c0989d94083"]],
  "dateconsommation": [["20170118160229"]],
  "BASE10NUM": [["20170118160229", "20170118160229", "64", "44"]],
  "datefintrait": [["20170118160229"]],
  "delai": [["64"]],
  "nomFlux": [["CUSE"]],
  "evt": [["USER_CREATION"]],
  "lst": [["[912304704,aur][912304705,clp][912304706,erb][912304707,me1]"]],
  "reforigin": [["44"]],
  "contractoidval": [["0x7125838BA5F600010001D71F"]],
  "useroidval": [["0x7154C949C91484751577231E"]],
  "servname": [["inpmsrtr1n"]]
}

but it doesn't work in Logstash!


(Christian Dahlqvist) #9

As the majority of the message is a list separated by |, why don't you use grok to capture the timestamp and capture the rest of the message (excluding the initial colon) into a separate field which you can then process using the csv filter?
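A sketch of this approach, reusing the field names from the grok pattern earlier in the thread (adjust the columns to match your actual logs):

```
filter {
  grok {
    # Capture the timestamp, then stash the pipe-separated remainder in one field.
    match => ["message", "^%{TIMESTAMP_ISO8601:timestamp} : %{GREEDYDATA:rest}"]
  }
  csv {
    # Split the remainder on | into named columns.
    source    => "rest"
    separator => "|"
    columns   => ["id", "dateconsommation", "datefintrait", "delai",
                  "nomFlux", "evt", "lst", "reforigin",
                  "contractoidval", "useroidval", "servname"]
  }
}
```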


(Maher Glenza) #10

I don't want to use csv. I'm working on dividing my message the way it appears in the grok debugger, then sending it to Elasticsearch and visualizing every part of the message for statistics. I don't know if you understand me?


(Magnus Bäck) #11

We understand you perfectly fine. Christian's point is that using a csv filter is probably an easier way of reaching that goal.


(Maher Glenza) #12

Okay, I will see how to work with csv, then get back to you.
Thank you!


(Maher Glenza) #13

Can I have your email, Magnus, please?


(Magnus Bäck) #14

No, you may not have my email address, but there's a personal message feature on this site. Note that I only offer help for publicly posted questions.


(Maher Glenza) #15

Here is my question, then:
I'm trying to process three specific logs from an application (you saw one line of one of them), filter them with Logstash to keep only specific fields, save those in Elasticsearch, and then visualize them in real time with Kibana as statistics.
Is that possible?


(Magnus Bäck) #16

Yes. One way of doing it is listing multiple expressions in a grok filter. The filter will match the expressions one by one and break when one of them matches.
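A sketch of such a filter; the second and third patterns are placeholders, since only one log format has been shown in this thread:

```
filter {
  grok {
    # Expressions are tried in order; grok stops at the first one that matches,
    # so a single filter can handle all three log formats.
    match => {
      "message" => [
        "^%{TIMESTAMP_ISO8601:timestamp} : %{GREEDYDATA:rest}",
        "PATTERN_FOR_SECOND_LOG_HERE",
        "PATTERN_FOR_THIRD_LOG_HERE"
      ]
    }
  }
}
```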


(Maher Glenza) #17

My problem now is how to test and see the data received by Logstash in Elasticsearch (I'm working on Windows 7).


(Magnus Bäck) #18

I'm sure there are "getting started" tutorials for you to follow to get the stack up and running. Once you're there adding and modifying Logstash filters is easy and not conceptually different from what you've already done.


(Maher Glenza) #19

Okay, thank you very much.


(Maher Glenza) #20

Hi Magnus!
I can't find a way to combine data from three different logs and then save it in ES.
Is there a way?
Thank you.