Well, this could be, but I do not have a config file larger than 33 lines. Any hints as to where this line 80 could be?
Regards
Config file here means all the files in conf.d, if you are using the default structure; Logstash reads them sequentially, so you can just print them in order and count the lines. Or you can re-read the error and notice that it states exactly where the error is:
Expected one of #, else, if, ", ', } at line 80, column 4 (byte 1795) after output {\n if "citrix-netscaler-vor" in [tags]{\n stdout {\n codec => rubydebug\n }\n }"
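An error like that after the quoted fragment usually points to a missing or unbalanced closing brace near that spot. A minimal sketch of a well-formed output block along those lines (the tag name is taken from the error message, the rest is an assumption):

output {
  if "citrix-netscaler-vor" in [tags] {
    stdout {
      # rubydebug prints each event as a readable hash, handy while testing
      codec => rubydebug
    }
  }
}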
Dear Guys,
I have now reviewed everything I had.
_grokparsefailure is correct, because some source lines had different responses:
Normal one:
2018-11-23 00:00:00 192.168.116.163 - HTTP 192.168.10.78 80 GET /maintenance/pacfile.dat - 304 146 235 0 HTTP/1.1 - - -
Bad one
2018-11-23 00:00:00 192.168.116.163 - HTTP http://iambad.com 80 GET /maintenance/pacfile.dat - 304 146 235 0 HTTP/1.1 - - -
So my pattern did not work because I expected an IPv4 address there. These "_grokparsefailure" events are moved into an index with the current datestamp, so those messages were the first ones shown in Discover.
If you choose another time range, you will find all the proper messages parsed correctly.
OK! First issue solved.
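Just as a side note: if you would rather parse those lines than let them fail, you could loosen the pattern at that position. A minimal sketch, assuming the original pattern used a strict %{IP} for the s-ip column; %{NOTSPACE} accepts any non-whitespace token, so both 192.168.10.78 and http://iambad.com would match (field names are only illustrative, based on the #Fields header below):

filter {
  grok {
    # TIMESTAMP_ISO8601 covers "2018-11-23 00:00:00"; NOTSPACE at the s-ip
    # position accepts either an IPv4 address or a URL-style value.
    # Remaining columns are omitted here for brevity.
    match => {
      "message" => "%{TIMESTAMP_ISO8601:event_timestamp} %{IP:c_ip} %{NOTSPACE:cs_username} %{WORD:sc_servicename} %{NOTSPACE:s_ip} %{NUMBER:s_port} %{WORD:cs_method} %{NOTSPACE:cs_uri_stem}"
    }
  }
}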
The second issue, which is still open, is that every source log file starts with the following 4 lines:
#Version: 1.0
#Software: Netscaler Web Logging(NSWL)
#Date: 2018-11-23 00:00:00 --> for sure this field changes : )
#Fields: date time c-ip cs-username sc-servicename s-ip s-port cs-method cs-uri-stem cs-uri-query sc-status cs-bytes sc-bytes time-taken cs-version cs(User-Agent) cs(Cookie) cs(Referer)
and of course those also end up as entries with _grokparsefailure, so I had to get rid of them.
With four separate filters like:
filter {
  if "#Version: 1.0" in [message] {
    drop { }
  }
}
And after getting rid of all that junk I got the following result in Kibana Discover:
So I want to thank you for your help!! Really! Thanks a lot.
Glad you got it working, also thanks for the explanation, makes sense.
With four separate filters like:
filter { if "#Version: 1.0" in [message] { drop { } } }
Just a note, I would replace those four filters with one:
filter {
  if [message] =~ /^#.*/ {
    drop {}
  }
}
Also, now that you got the @timestamp to work properly, you could add the remove_field option to the date filter, which would remove the event_timestamp field if the date filter succeeds.
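For example, a minimal sketch of that idea (the format string is an assumption based on the log's date and time columns):

filter {
  date {
    # event_timestamp is assumed to hold the combined date and time columns
    match => [ "event_timestamp", "yyyy-MM-dd HH:mm:ss" ]
    # remove_field is only applied if the date filter parses the field successfully
    remove_field => [ "event_timestamp" ]
  }
}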
Adapted...
Thanks!
Dear Community,
I have another problem with matching timestamps. Maybe you have the right tip:
Source Logfile Timestamp:
2019-01-23T11:08:16+01:00
I tried the following:
match => [ "timestamp", "ISO8601" ]
or
match => [ "timestamp", "yyyy-MM-dd'T'HH:mm:ssZZ:ZZ" ]
or
match => [ "timestamp", "yyyy-MM-dd'T'HH:mm:ssZZZZ" ]
without success.
Maybe there is another problem, but please be so kind as to tell me which format should match.
Regards
Wilhelm
It is solved.
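For reference, the +01:00 offset corresponds to Joda's ZZ token written as a single token (not ZZ:ZZ or ZZZZ), and the ISO8601 keyword should also accept that exact string. A minimal sketch, assuming the field is really named timestamp and contains only that value:

filter {
  date {
    # Either of these should parse 2019-01-23T11:08:16+01:00;
    # if both fail, check what the timestamp field actually contains.
    match => [ "timestamp", "ISO8601", "yyyy-MM-dd'T'HH:mm:ssZZ" ]
  }
}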