Logstash configuration file error filtering

Hello, I'm starting out with ELK and I'm trying to start Logstash with a custom logstash.conf file, but I'm getting an error:

 Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, \", ', -, [, { at line 10, column 37 (byte 149) after filter {\n    if \"HTTP\" in [message] {\n      grok {\n          mapping => { \"message\" => ", :backtrace=>["/opt/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:42:in `block in execute'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:92:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in `synchronize'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:92:in `exclusive'", "/opt/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:317:in `block in converge_state'"

Here is my configuration file:

input {
  beats {
    port => 5044
    codec => "json"
  }
}

filter {
  if [message] =~ "HTTP" {
    grok {
      mapping => { "message" => %{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} "%{WORD:method} %{URIPATHPARAM:url}" %{INT:code} %{INT:bytes} - %{GREEDYDATA:response_time} }
    }
  }
  else if [message] =~ "APP" {
    grok {
      mapping => { "message" => %{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} %{GREEDYDATA:jsonstring} }
    }
    json {
      source => "jsonstring"
      target => "doc"
    }
    mutate {
      add_field => {
        "code" => "%{[doc][code]}"
        "message" => "%{[doc][message]}"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
  }
}

Sample of the logs I want to ingest (Docker stdout logs):

  1. HTTP request logs
    2019-01-29T18:35:15.423Z HTTP INFO "POST /myroute/?param1=test" 201 41 - 44.014 ms

  2. APP logs
    2019-01-29T18:48:19.657Z APP ERROR : {"code":201,"message":"ok"}

Discriminator: APP or HTTP

What is wrong with this config? I've tried changing a lot of things, especially around line 37, but I don't understand why it doesn't work.

Thank you very much for your help :)

@althair34, you need to enclose your grok patterns in quotes. Note also that the grok filter's option is called match, not mapping. E.g.:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} %{GREEDYDATA:jsonstring}" }
}

Thank you for your help. I just have one question:

How can I escape double quotes inside the pattern? Should I use single quotes to enclose my grok pattern?

Thank you again

You can use either single quotes around your grok pattern or a backslash to escape special characters within your grok pattern.
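For example, both of these should work against your HTTP sample line above (a minimal sketch; the pattern itself is the one from your config):

# Option 1: single quotes around the pattern, so the inner double quotes are literal
grok {
  match => { "message" => '%{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} "%{WORD:method} %{URIPATHPARAM:url}" %{INT:code} %{INT:bytes} - %{GREEDYDATA:response_time}' }
}

# Option 2: double quotes around the pattern, inner double quotes escaped with backslashes
grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} \"%{WORD:method} %{URIPATHPARAM:url}\" %{INT:code} %{INT:bytes} - %{GREEDYDATA:response_time}" }
}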

I've almost fixed my configuration file. I'm now getting a deprecation warning on the output:

You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"%{[@metadata][beat]}-%{+YYYY.MM.dd}", manage_template=>false, id=>"6a469e616cda88c3ab1205d3def17747cd86ab278aa8b4bcabba84d12b7accf2", hosts=>[//localhost], document_type=>"%{[@metadata][type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_33572ebe-e2b0-4f11-97b6-e0bca5239b31", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-01-30T12:04:01,099][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50

Here is my updated configuration file:

input {
  beats {
    port => 5044
    codec => "json"
  }
}

filter {
  if [message] =~ "HTTP" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} \"%{WORD:method} %{URIPATHPARAM:url}\" %{INT:code} %{INT:bytes} - %{GREEDYDATA:response_time}" }
    }
  }
  else if [message] =~ "APP" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:type} %{LOGLEVEL:level} %{GREEDYDATA:jsonstring}" }
    }
    json {
      source => "jsonstring"
      target => "doc"
    }
    mutate {
      add_field => {
        "code" => "%{[doc][code]}"
        "message" => "%{[doc][message]}"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
  }
}

That's just a warning: document types were deprecated in Elasticsearch 6.0 and are removed entirely in 7.0, so the document_type option in the elasticsearch output is scheduled for removal from Logstash as well. You can read more about that here: https://www.elastic.co/guide/en/elasticsearch/reference/6.0/removal-of-types.html
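To make the warning go away, just remove document_type from your elasticsearch output and let documents use the default type. Roughly like this (a sketch; the index pattern is the one shown in your warning, keep whatever you actually use):

output {
  elasticsearch {
    hosts => ["localhost"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    # no document_type => ...; Elasticsearch 7.x supports only a single type per index
  }
}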
