Getting an error in Logstash grok filter for RDS slow query

Hi team, below is the error that I am getting:

Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x622f012e @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="950af9e45367a48d0d80d2016dabb2c00b6443c6da22ed5b72a3a2d31dcef7a6", @klass=LogStash::Filters::Grok, @metric_events=#LogStash::Instrument::NamespacedMetric:0x240caa68, @filter=<LogStash::Filters::Grok match=>{"message"=>"^# User@Host: %{USER:mysql_user}(?:\\[[^\\]]+\\])?\\s+@\\s+%{HOST:mysql_client_host}?\\s+\\[%{IP:mysql_client_ip}?\\]"}, id=>"950af9e45367a48d0d80d2016dabb2c00b6443c6da22ed5b72a3a2d31dcef7a6", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{HOST:mysql_client_host} not defined", :thread=>"#<Thread:0x3f9ab6d@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:156 run>"}

[2019-04-15T08:49:06,459][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{HOST:mysql_client_host} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:241:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:594:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:262:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:199:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:159:in `block in start'"], :thread=>"#<Thread:0x3f9ab6d@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:156 run>"}
[2019-04-15T08:49:06,468][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"

And this is my conf file:

filter {
  if [type] == "rds" {
    grok {
      match => [ "message", "^# User@Host: %{USER:mysql_user}(?:\[[^\]]+\])?\s+@\s+%{HOST:mysql_client_host}?\s+\[%{IP:mysql_client_ip}?\]" ]
    }
    grok {
      match => [ "message", "^# Query_time: %{NUMBER:query_duration_s:float}\s+Lock_time: %{NUMBER:lock_wait_s:float} Rows_sent: %{NUMBER:query_sent_rows:int} \sRows_examined: %{NUMBER:query_examined_rows:int}" ]
    }
    grok {
      match => [ "message", "^SET timestamp=%{NUMBER:timestamp};" ]
    }
    grok {
      match => {
        "message" => [
          "^FROM %{NOTSPACE:query_table}",
          "^UPDATE %{NOTSPACE:query_table}.*",
          "^INSERT INTO %{NOTSPACE:query_table}.*",
          "^DELETE FROM %{NOTSPACE:query_table}.*"
        ]
      }
    }
    mutate {
      add_field => {
        "[@metadata][indexname]" => "rds-logs"
      }
    }
  }
}

Do you mean %{HOSTNAME:mysql_client_host} ?

Yes, it is the same.

No, it is not the same. HOSTNAME exists in the default set of pattern definitions, HOST does not.
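For reference, the first grok filter with HOSTNAME substituted would look like this (the rest of the pattern is unchanged from your conf):

```conf
grok {
  # HOSTNAME is in Logstash's default grok pattern set; HOST is not
  match => [ "message", "^# User@Host: %{USER:mysql_user}(?:\[[^\]]+\])?\s+@\s+%{HOSTNAME:mysql_client_host}?\s+\[%{IP:mysql_client_ip}?\]" ]
}
```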

Let me check.
Thanks

Thanks, it's working fine.
I need one more piece of help: I am getting an index field TYPE as a string, but I need it as a number for some purpose. Can you help me?

If you have a field on an event that is a string and you want it to be a number, you can use the mutate filter's convert option.
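A minimal sketch, assuming the field is named "type" (substitute your actual field name):

```conf
filter {
  mutate {
    # convert the string value to an integer; use "float" for decimal values
    convert => { "type" => "integer" }
  }
}
```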

Using the convert option, can I do the same in an existing index, or do I need to create a new index?

In logstash you can make the change on-the-fly, but in elasticsearch you will need a new index to change something from string to number or vice versa.
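A sketch of the elasticsearch side, runnable in the Kibana Dev Tools console. The index names (rds-logs, rds-logs-v2) and field name (type) are examples; the mapping type name "doc" assumes a 6.x cluster:

```
# 1. Create a new index whose mapping declares the field as a number
PUT rds-logs-v2
{
  "mappings": {
    "doc": {
      "properties": {
        "type": { "type": "integer" }
      }
    }
  }
}

# 2. Copy the documents from the old index into the new one
POST _reindex
{
  "source": { "index": "rds-logs" },
  "dest":   { "index": "rds-logs-v2" }
}
```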

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.