Unexpected problem when running Logstash with the -f flag

I am trying to implement a new filter in Logstash. To do so, I run the production Logstash binary with a local configuration. I created all the necessary files in the folder I run from. When I specify the -f flag to make sure the correct file is picked up, Logstash exits without giving any reason.

$ /usr/share/logstash/bin/logstash --path.settings . --path.data data -f ./pipelines.yml -l . --log.level debug
Sending Logstash's logs to . which is now configured via log4j2.properties
$

The directory structure is as follows:

[~/logstash/logstash_conf]$ ls
jvm.options      logstash.yml          startup.options
data             log4j2.properties     debug_file
pipelines.yml    logstash-plain.log

I have spent a whole day trying to debug my pipeline and find the problem. I came down to the most basic pipeline:

# MY COMMENT
- pipeline.id: amq
  config.string: "input { stdin { } } output {  stdout { codec => rubydebug } }"

I can confirm from the log that the correct pipeline file is being picked up:

[2019-01-25T14:20:44,627][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 2, column 1 (byte 14) after # MY COMMENT\n", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}

BUT when I run Logstash without the -f flag, the behaviour is as expected.

$ /usr/share/logstash/bin/logstash --path.settings . --path.data data -l . --log.level debug
Sending Logstash's logs to . which is now configured via log4j2.properties
The stdin plugin is now waiting for input:
hello World
{
       "message" => "hello World",
      "@version" => "1",
          "host" => "###",
    "@timestamp" => 2019-01-25T14:04:46.183Z
}

I have used -f successfully on another machine and did not experience such problems.
Am I missing something, or is it simply a bug?

$ /usr/share/logstash/bin/logstash --version
logstash 6.3.1

That's not going to work.

  1. You can give logstash a config on the command line using -e
  2. You can tell logstash to read a configuration from a file using -f
  3. If you do not supply either -e or -f then logstash will use pipelines.yml

You are trying to mix 2 and 3.
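
For example (a minimal sketch; the amq.conf file name is just a placeholder), option 2 means putting the pipeline definition in a plain Logstash config file rather than in pipelines.yml:

# amq.conf - a Logstash pipeline config: input/filter/output blocks only
input { stdin { } }
output { stdout { codec => rubydebug } }

and pointing -f at that file:

$ /usr/share/logstash/bin/logstash --path.settings . --path.data data -f ./amq.conf

For option 3 you drop -f entirely, as in your second command, and let pipelines.yml (found via --path.settings) declare the pipelines, either with config.string as you already do or with path.config pointing at a .conf file.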
