Logstash cannot write data into elasticsearch after activating elastic security

Hey guys,

I have a problem with Logstash.
After activating stack security, Logstash cannot write into Elasticsearch.
I've followed the instructions in the install guide and created a logstash_internal user that has the rights to write data.

Since I activated this feature, I get the following errors in my journal:

Jun 12 00:02:35 logstash-serv logstash[4904]: [2019-06-12T00:02:35,482][DEBUG] [logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/etc/logstash/pipelines.yml"}
Jun 12 00:02:35 logstash-serv logstash[4904]: [2019-06-12T00:02:35,495][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/etc/logstash/conf.d/performance_pipeline.conf", "/etc/logstash/conf.d/pum_pipeline.conf"]}
Jun 12 00:02:35 logstash-serv logstash[4904]: [2019-06-12T00:02:35,495][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/logging_pipeline.conf"}
Jun 12 00:02:35 logstash-serv logstash[4904]: [2019-06-12T00:02:35,496][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
Jun 12 00:02:35 logstash-serv logstash[4904]: [2019-06-12T00:02:35,511][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:logging_pipeline}
Jun 12 00:02:35 logstash-serv logstash[4904]: [2019-06-12T00:02:35,690][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:logging_pipeline, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 41, column 4 (byte 988) after output { \n\tif [type] == \"log\"{\n\t\tif ![MachineName]{\n\t\t\telasticsearch {\n\t\t\t\tindex => \"log-%{+YYYY.MM.dd}\"\n\t\t\t\tid => \"elasticsearch_logging_receiver\"\n\t\t\t\tuser => logstash_internal\n\t\t\t\tpassword => ********\n\t\t\t", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:23:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}

What can I do to fix this issue?

Thanks to everyone who can help!
Best regards,
PolterFox

Can you share your logstash configuration?

Which configuration do you want?

  • Pipeline Configuration
  • Pipelines Configuration /etc/logstash/pipelines.yml
  • Logstash Configuration /etc/logstash/logstash.yml

Let's start by what you changed after you enabled security for the stack, as the issue will probably be there.

I just added the user and password settings to the output block:

elasticsearch {
    index => "log-%{+YYYY.MM.dd}"
    id => "elasticsearch_logging_receiver"
    user => logstash_internal
    password => *******
}

Potentially a syntax issue, missing a curly bracket somewhere?

It surprises me that it expects one of # { when it looks like it should be expecting }. Can you post the entire output section from logging_pipeline.conf?

You need to string-quote those values:

elasticsearch {
    index => "log-%{+YYYY.MM.dd}"
    id => "elasticsearch_logging_receiver"
    user => "logstash_internal"
    password => "*******"
}
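For reference, here is what the full output section from the error message would look like with those values quoted (the surrounding conditionals are taken from the parse error above; the closing braces are assumed, since the error output truncates there):

output {
    if [type] == "log" {
        if ![MachineName] {
            elasticsearch {
                index => "log-%{+YYYY.MM.dd}"
                id => "elasticsearch_logging_receiver"
                user => "logstash_internal"
                password => "*******"
            }
        }
    }
}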

Thanks, that works. There are no errors in the logs now, but I don't see any data coming into Elasticsearch, so there must be another problem.

Is it possible that Logstash holds on to the data it could not send to Elasticsearch and tries to work through it later? (I have persistent pipelines.)
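For reference, persistent queuing is enabled in /etc/logstash/logstash.yml with settings along these lines (the max_bytes value here is just an example — it controls how much unshipped data is buffered on disk):

queue.type: persisted
queue.max_bytes: 1gb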

Try going to Management → Security → Roles → logstash_writer, add log-* to the indices section, and click on Update role. See if that does it. If log-* doesn't work, also try adding logstash-*.
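The same change can also be made through the role API from Kibana Dev Tools — a sketch only; the exact cluster and index privileges your pipeline needs depend on your setup, and the ones below are just a commonly used set for a writer role:

PUT /_security/role/logstash_writer
{
  "cluster": ["monitor"],
  "indices": [
    {
      "names": ["log-*", "logstash-*"],
      "privileges": ["write", "create", "create_index"]
    }
  ]
}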


Thanks, that helps.
I had added * and thought that it would grant logstash_writer access to all my indices, but that was a misunderstanding on my part.


It does. Changing the index pattern from * to log-* would reduce the role's (user's) access, not grant it any additional indices, so I think something else happened here.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.