_grokparsefailure

input {
  file {
    # the file input expects forward slashes in paths, even on Windows
    path => "C:/elk/router"
    type => "logs"
    start_position => "beginning"
    sincedb_path => "C:/elkstack/ELK/logstash-5.1.2/data/plugins/inputs/file.sincedb_5eed3ff4207ce42c69ff2b34b669aa79"
  }
}

filter {
  mutate {
    # gsub patterns are regular expressions, so a literal | must be escaped
    gsub => ["message", "\|", " "]
  }

  grok {
    match => ['message', ' =%{DATESTAMP:time} : %{UUID:id} %{NUMBER:dateconsommation} %{NUMBER:datefintrait} %{NUMBER:delai} %{WORD:nomFlux} %{WORD:evt} %{GREEDYDATA:lst} %{NUMBER:reforigin} %{NOTSPACE:contractoidval} %{DATA:useroidval} %{NOTSPACE:servname} ']
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "router-%{+YYYY.MM.dd}"
    template => "C:/elkstack/ELK/elasticsearch-5.1.2/config/router_template.json"
    template_name => "router_template"
  }
}

Is there anything wrong here? Please help!

How can we answer the question without knowing what input Logstash is getting?

Hahaha, sorry. This is a line of the input:
2017-01-18 16:02:24,166 : 1e045e2f-a06b-40c9-954e-cc26b0ead93a|20170118160224|20170118160224|84|CACC|CONTRACT_CREATION|[912296384,aur][912296385,sui]|44|0x7125838BA5F500010001D64F||inpmsrtr1n

There are multiple problems with this grok expression.

  • The very beginning doesn't match at all (there's no equal sign in the log).
  • DATESTAMP doesn't match yyyy-mm-dd dates like yours.
  • Your log contains multiple | characters, none of which are included in your expression.
  • It has a trailing space that you probably shouldn't assume is in the input data.

Start over with the simplest possible expression (e.g. ^%{TIMESTAMP_ISO8601:timestamp}). Make sure that works. Then add more and more to your expression, verifying each time that it continues to work.
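
A minimal sketch of that approach against your sample line (untested; the field names are illustrative, and the escaped \| matches a literal pipe, so you would not even need the gsub):

filter {
  grok {
    # step 1: match the timestamp and the first couple of fields;
    # once this works, append one field at a time
    match => ['message', '^%{TIMESTAMP_ISO8601:timestamp} : %{UUID:id}\|%{NUMBER:dateconsommation}']
  }
}

If that matches, extend the pattern with the next field, rerun, and repeat until the whole line is covered.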

Excuse me, but this:

mutate {
  gsub => ["message", "\|", " "]
}

will replace the | with a trailing space.

Oh, right. Not sure why you're doing that.

Because grok is not able to receive a message with |. I'm testing with http://grokdebug.herokuapp.com/.

And I want to point out that I have now tested grokdebug with this input:
2017-01-18 16:02:29,235 : f047e31d-0a4d-4839-9e7d-0c0989d94083 20170118160229 20170118160229 64 CUSE USER_CREATION [912304704,aur][912304705,clp][912304706,erb][912304707,me1] 44 0x7125838BA5F600010001D71F 0x7154C949C91484751577231E inpmsrtr1n

and this pattern:
%{TIMESTAMP_ISO8601} : %{UUID:id} %{NUMBER:dateconsommation} %{NUMBER:datefintrait} %{NUMBER:delai} %{WORD:nomFlux} %{WORD:evt} %{GREEDYDATA:lst} %{NUMBER:reforigin} %{NOTSPACE:contractoidval} %{DATA:useroidval} %{NOTSPACE:servname}

and this is the result:

{
  "TIMESTAMP_ISO8601": [["2017-01-18 16:02:29,235"]],
  "YEAR": [["2017"]],
  "MONTHNUM": [["01"]],
  "MONTHDAY": [["18"]],
  "HOUR": [["16", null]],
  "MINUTE": [["02", null]],
  "SECOND": [["29,235"]],
  "ISO8601_TIMEZONE": [[null]],
  "id": [["f047e31d-0a4d-4839-9e7d-0c0989d94083"]],
  "dateconsommation": [["20170118160229"]],
  "BASE10NUM": [["20170118160229", "20170118160229", "64", "44"]],
  "datefintrait": [["20170118160229"]],
  "delai": [["64"]],
  "nomFlux": [["CUSE"]],
  "evt": [["USER_CREATION"]],
  "lst": [["[912304704,aur][912304705,clp][912304706,erb][912304707,me1]"]],
  "reforigin": [["44"]],
  "contractoidval": [["0x7125838BA5F600010001D71F"]],
  "useroidval": [["0x7154C949C91484751577231E"]],
  "servname": [["inpmsrtr1n"]]
}

But it doesn't work in Logstash!

As the majority of the message is a list separated by |, why don't you use grok to capture the timestamp and capture the rest of the message (excluding the initial colon) into a separate field which you can then process using the csv filter?
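
Something along these lines (a sketch, not tested; the rest field name and the column list are assumptions based on your sample line):

filter {
  grok {
    # capture the timestamp, skip the " : " separator, keep the remainder
    match => ['message', '^%{TIMESTAMP_ISO8601:timestamp} : %{GREEDYDATA:rest}']
  }
  csv {
    source => "rest"
    separator => "|"
    columns => ["id", "dateconsommation", "datefintrait", "delai", "nomFlux", "evt", "lst", "reforigin", "contractoidval", "useroidval", "servname"]
  }
}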

I don't want to use csv. I'm working on dividing my message the way it is in the grok debugger, then sending it to Elasticsearch, and then visualizing every part of the message to do statistics. I don't know if you understand me?

We understand you perfectly fine. Christian's point is that using a csv filter probably is an easier way of reaching that goal.

Okay, I will see how to work with csv, then get back to you.
Thank you!

Can I have your email, Magnus, please?

No, you may not have my email address, but there's a personal message feature on this site. Note that I only offer help for publicly posted questions.

This is my question, please:
I'm trying to explore 3 specific logs from an application (you saw a line of one of them), then filter them with Logstash to keep only specific fields, save those in Elasticsearch, and then visualize them in real time with Kibana as statistics.
Is that possible?

Yes. One way of doing it is listing multiple expressions in a grok filter. The filter will match the expressions one by one and break when one of them matches.
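
In configuration terms, that could look something like this sketch (the three patterns are placeholders for your three log formats):

filter {
  grok {
    match => {
      "message" => [
        'pattern for log format 1',
        'pattern for log format 2',
        'pattern for log format 3'
      ]
    }
  }
}

By default grok stops at the first pattern that matches (break_on_match).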

My problem now is how to test and see the data received by Logstash in Elasticsearch (I'm working with Windows 7).

I'm sure there are "getting started" tutorials for you to follow to get the stack up and running. Once you're there adding and modifying Logstash filters is easy and not conceptually different from what you've already done.
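
One common trick while developing (a sketch; adjust to your own config) is to add a stdout output next to the elasticsearch one, so every event is printed to the console and you can see exactly what Logstash produces:

output {
  # print each processed event to the console for inspection
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "router-%{+YYYY.MM.dd}"
  }
}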

Okay, thank you very much!

Hi Magnus!
I can't find a way to combine data from 3 different logs and then save them in ES.
Is there a way?
Thank you.