One_Punch
(One Punch)
August 9, 2021, 3:56pm
1
Hello, I'm integrating Filebeat and Logstash. The sample grok pattern works in the Grok Debugger but throws _grokparsefailure in Logstash.
Message:
[2m2021-08-09 15:50:07.850 [0;39m [32m INFO [0;39m [35m1 [0;39m [2m--- [0;39m [2m[nio-8085-exec-1] [0;39m [36mo.s.web.servlet.DispatcherServlet [0;39m [2m: [0;39m Completed initialization in 2 ms
Pattern:
%{TIMESTAMP_ISO8601:timestamp} (?:[^:]+)m (?:[^:]+)m %{LOGLEVEL:logLevel} (?:[^:]+)m (?:[^:]+)m%{BASE10NUM:pid} (?:[^:]+)m (?:[^:]+)- (?:[^:]+)m (?:[^:]+)] (?:[^:]+)m (?:[^:]+)m%{DATA:class} (?:[^:]+)m (?:[^:]+): (?:[^:]+)m %{GREEDYDATA:message}
Config:
logstash.conf: |
  input {
    beats {
      port => 5044
      ssl => false
    }
    stdin {
      codec => plain { charset => "UTF-8" }
    }
  }
  filter {
    if [message] =~ "\tat" {
      grok {
        match => ["message", "^(\tat)"]
        add_tag => ["stacktrace"]
      }
    }
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} (?:[^:]+)m (?:[^:]+)m %{LOGLEVEL:logLevel} (?:[^:]+)m (?:[^:]+)m%{BASE10NUM:pid} (?:[^:]+)m (?:[^:]+)- (?:[^:]+)m (?:[^:]+)] (?:[^:]+)m (?:[^:]+)m%{DATA:class} (?:[^:]+)m (?:[^:]+): (?:[^:]+)m %{GREEDYDATA:message}" }
    }
    date {
      match => ["timestamp", "YYYY-MM-dd HH:mm:ss.SSS"]
    }
    mutate {
      gsub => ["message", "\u001b", " "]
    }
  }
  output {
    elasticsearch {
      hosts => "http://elasticsearch-master:9200"
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
  }
Badger
August 9, 2021, 5:09pm
2
When I run
input { generator { count => 1 lines => [ '[2m2021-08-09 15:50:07.850 [0;39m [32m INFO [0;39m [35m1 [0;39m [2m--- [0;39m [2m[nio-8085-exec-1] [0;39m [36mo.s.web.servlet.DispatcherServlet [0;39m [2m: [0;39m Completed initialization in 2 ms' ] } }
filter {
grok { match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} (?:[^:]+)m (?:[^:]+)m %{LOGLEVEL:logLevel} (?:[^:]+)m (?:[^:]+)m%{BASE10NUM:pid} (?:[^:]+)m (?:[^:]+)- (?:[^:]+)m (?:[^:]+)] (?:[^:]+)m (?:[^:]+)m%{DATA:class} (?:[^:]+)m (?:[^:]+): (?:[^:]+)m %{GREEDYDATA:message}" } }
}
I get
"message" => [
[0] "[2m2021-08-09 15:50:07.850 [0;39m [32m INFO [0;39m [35m1 [0;39m [2m--- [0;39m [2m[nio-8085-exec-1] [0;39m [36mo.s.web.servlet.DispatcherServlet [0;39m [2m: [0;39m Completed initialization in 2 ms",
[1] "Completed initialization in 2 ms"
],
"class" => "o.s.web.servlet.DispatcherServlet",
"timestamp" => "2021-08-09 15:50:07.850"
so I do not think you are doing what you think you are doing.
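(Editor's note: the doubled [message] above is the clue. Grok wrote %{GREEDYDATA:message} into a field that already existed, so Logstash appended the new capture and [message] became an array. Capturing the trailing text under a different name avoids this; the field name log_message below is hypothetical and the pattern is abbreviated:)

```
grok {
  # capture into a new field instead of appending to the existing "message"
  match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} ... %{GREEDYDATA:log_message}" }
}
```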
One_Punch
(One Punch)
August 9, 2021, 5:39pm
3
Hello @Badger
I'm also having trouble with the logs themselves. Kubernetes logs:
2021-08-09 16:29:15.525 INFO 1 --- [nio-8085-exec-1] o.s.web.servlet.DispatcherServlet : Completed initialization in 2 ms
Logstash logs:
[2m2021-08-09 15:50:07.850 [0;39m [32m INFO [0;39m [35m1 [0;39m [2m--- [0;39m [2m[nio-8085-exec-1] [0;39m [36mo.s.web.servlet.DispatcherServlet [0;39m [2m: [0;39m Completed initialization in 2 ms
Is it possible to remove the [2m [32m [0;39m characters?
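(Editor's note: those fragments are the visible remains of ANSI SGR color codes emitted by Spring Boot's colored console logging. In the raw stream each one is preceded by an ESC byte, \u001b, which paste/render tools often drop. A minimal Python sketch, on a hypothetical reconstruction of the line with the ESC bytes restored, shows that stripping the complete escape sequence removes them cleanly:)

```python
import re

# Hypothetical reconstruction of the raw container log line: each color
# fragment starts with an ESC byte (\x1b) that the forum paste dropped,
# leaving only the "[2m" / "[0;39m" remnants visible.
line = ("\x1b[2m2021-08-09 15:50:07.850\x1b[0;39m \x1b[32m INFO\x1b[0;39m "
        "\x1b[35m1\x1b[0;39m \x1b[2m---\x1b[0;39m "
        "\x1b[2m[nio-8085-exec-1]\x1b[0;39m "
        "\x1b[36mo.s.web.servlet.DispatcherServlet\x1b[0;39m "
        "\x1b[2m:\x1b[0;39m Completed initialization in 2 ms")

# Strip the complete escape sequence (ESC, '[', parameters, final 'm'),
# not just the visible remnant.
ANSI_SGR = re.compile(r'\x1b\[[0-9;]*m')
clean = ANSI_SGR.sub('', line)
print(clean)
```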
One_Punch
(One Punch)
August 9, 2021, 5:40pm
4
I want to extract
timestamp
loglevel
class
message
Badger
August 9, 2021, 5:49pm
5
You could try
mutate { gsub => [ "message", "\[[0-9;]+m\s?", "" ] }
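(Editor's note: for readers following along, the suggested gsub can be sketched in Python against the remnant form quoted earlier in the thread, i.e. assuming the ESC bytes are already gone. The sample string below is reconstructed from the message in post 1; how much whitespace survives in real data depends on where the stripped bytes sat, which is why the next post sees the fields run together:)

```python
import re

# Python equivalent of the Logstash gsub:
#   mutate { gsub => [ "message", "\[[0-9;]+m\s?", "" ] }
# It removes each "[...m" color-code remnant plus one trailing space.
remnant = re.compile(r'\[[0-9;]+m\s?')

msg = ("[2m2021-08-09 15:50:07.850 [0;39m [32m INFO [0;39m [35m1 [0;39m "
       "[2m--- [0;39m [2m[nio-8085-exec-1] [0;39m "
       "[36mo.s.web.servlet.DispatcherServlet [0;39m [2m: [0;39m "
       "Completed initialization in 2 ms")

cleaned = remnant.sub('', msg)
print(cleaned)
```

Note that "[nio-8085-exec-1]" is untouched, because "nio…" does not match the digits-then-m remnant shape.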
One_Punch
(One Punch)
August 9, 2021, 6:00pm
6
This is the sample message after I added that and removed my filter:
2021-08-09 17:57:13.237INFO1---[nio-8085-exec-1]o.s.web.servlet.DispatcherServlet :Completed initialization in 2 ms
One_Punch
(One Punch)
August 9, 2021, 6:10pm
7
This is the latest message
2021-08-09 18:02:05.950 INFO 1 --- [nio-8085-exec-1] o.s.web.servlet.DispatcherServlet : Completed initialization in 2 ms
One_Punch
(One Punch)
August 9, 2021, 6:35pm
8
"tags" => [
[0] "beats_input_codec_plain_applied",
[1] "_grokparsefailure"
],
"message" => " 2021-08-09 18:29:58.900 INFO 1 --- [nio-8085-exec-1] o.s.web.servlet.DispatcherServlet : Initializing Servlet 'dispatcherServlet'",
"host" => {
"name" => "filebeat-filebeat-8jbts"
}
But in the Grok Debugger it's successful.
Badger
August 9, 2021, 6:41pm
9
I would use dissect instead of grok
mutate { gsub => [ "message", "\s+", " " ] }
dissect { mapping => { "message" => " %{timestamp} %{+timestamp} %{loglevel} %{pid} --- [%{}] %{class} : %{[@metadata][message]}" } }
mutate { replace { "message" => "%{[@metadata][message]}" } }
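(Editor's note: dissect splits on the literal delimiters between tokens rather than on regular expressions; %{+timestamp} appends the second token to timestamp with a space, and the empty %{} discards the thread name. A rough, hypothetical Python re-creation of the mapping, run on the cleaned-up line from post 7, illustrates which fields come out:)

```python
import re

# Rough stand-in for the dissect mapping
#   " %{timestamp} %{+timestamp} %{loglevel} %{pid} --- [%{}] %{class} : %{[@metadata][message]}"
# A regex is used here purely for illustration; real dissect is delimiter-based.
line = (" 2021-08-09 18:02:05.950 INFO 1 --- [nio-8085-exec-1] "
        "o.s.web.servlet.DispatcherServlet : Completed initialization in 2 ms")

pattern = re.compile(
    r'^ (?P<date>\S+) (?P<time>\S+) (?P<loglevel>\S+) (?P<pid>\S+) --- '
    r'\[[^\]]*\] (?P<cls>\S+) : (?P<message>.*)$'
)
m = pattern.match(line)

fields = {
    "timestamp": f"{m.group('date')} {m.group('time')}",  # %{timestamp} %{+timestamp}
    "loglevel": m.group('loglevel'),
    "pid": m.group('pid'),
    "class": m.group('cls'),
    "message": m.group('message'),  # routed via [@metadata][message] in the reply
}
print(fields)
```

Note the leading space in the mapping: if the incoming message does not start with a space, dissect tags the event _dissectfailure, which matters later in the thread.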
One_Punch
(One Punch)
August 9, 2021, 6:54pm
11
[2021-08-09T18:54:22,477][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "=>" at line 15, column 14 (byte 303) after filter { \n dissect { \n mapping => { "message" => " %{timestamp} %{+timestamp} %{loglevel} %{pid} --- [%{}] %{class} : %{[@metadata][message]}" } \n }\n mutate { \n replace ", :backtrace=>[
"/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'",
"org/logstash/execution/AbstractPipelineExt.java:184:in `initialize'",
"org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:365:in `block in converge_state'"]}
input {
  beats {
    port => 5044
    ssl => false
  }
  stdin {
    codec => plain { charset => "UTF-8" }
  }
}
filter {
  dissect {
    mapping => { "message" => " %{timestamp} %{+timestamp} %{loglevel} %{pid} --- [%{}] %{class} : %{[@metadata][message]}" }
  }
  mutate {
    replace { "message" => "%{[@metadata][message]}" }
  }
  mutate {
    gsub => ["message", "\u001b", " ", "message", "[[0-9;]+m\s?", "", "message", "\s+", " "]
  }
}
output {
  elasticsearch {
    hosts => "http://elasticsearch-master:9200"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
One_Punch
(One Punch)
August 9, 2021, 7:00pm
12
I've added => in the replace.
"tags" => [
[0] "beats_input_codec_plain_applied",
[1] "_dissectfailure"
],
"kubernetes" => {
Badger
August 9, 2021, 7:07pm
13
Change that to
if "_dissectfailure" not in [tags] { mutate { replace => { "message" => "%{[@metadata][message]}" } } }
then show us the [message] field that got the failure. Copy it from the JSON tab of an expanded event in the Kibana Discover pane.
One_Punch
(One Punch)
August 9, 2021, 7:47pm
14
It's already working now, but may I know how I can handle this as well?
system
(system)
Closed
September 6, 2021, 7:50pm
16
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.