Systemctl logstash.service pipeline problem

Ok, so I've been dealing with this issue for a couple of days and it's becoming really frustrating.

I want to start Logstash with systemctl start logstash.service so that it runs in the background and sends some metric logs to Elasticsearch. But when the Logstash service starts, nothing is sent to Elasticsearch. With systemctl status logstash.service -l I get this:

Nov 07 07:01:39 elk.nipne.ro logstash[13797]: [2019-11-07T07:01:39,644][ERROR][logstash.agent] Failed to execute action
  {:action=>LogStash::PipelineAction::Create/pipeline_id:metricbeat-droplet,
   :exception=>"LogStash::ConfigurationError",
   :message=>"Expected one of #, input, filter, output at line 1, column 2 (byte 2) after  ",
   :backtrace=>[
     "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'",
     "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'",
     "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'",
     "org/jruby/RubyArray.java:2584:in `map'",
     "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'",
     "org/logstash/execution/AbstractPipelineExt.java:153:in `initialize'",
     "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'",
     "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:26:in `initialize'",
     "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'",
     "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}
Nov 07 07:01:42 elk.nipne.ro logstash[13797]: [2019-11-07T07:01:42,642][ERROR][logstash.agent] Failed to execute action
  (same ConfigurationError and backtrace as above)

My pipeline is very basic because it only has one configuration file (honestly I don't even see the point of having a pipelines file in the first place), but even if it's empty, I still get a similar error.
Here you can see:
Pipeline (pipelines.yml):

- pipeline.id: metricbeat-droplet
  path.config: "/etc/logstash/conf.d/digitalIngest.conf"

And the conf file (digitalIngest.conf):

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/letsencrypt/live/elk.nipne.ro/fullchain.pem"
    ssl_key => "/etc/letsencrypt/live/elk.nipne.ro/privkey.pem"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "droplet-metrics-1"
  }
}

What causes the pipeline to fail?
I'm on CentOS 7, by the way. If I go to /usr/share/logstash and start Logstash from the binaries with the configuration-file flag, it WORKS (but that's only because when you pass that flag, pipelines.yml gets ignored).
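For reference, this is roughly the command I run when it works (paths assumed from the default RPM layout; the second form just validates the config and exits instead of running it):

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/digitalIngest.conf
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/digitalIngest.conf --config.test_and_exit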

Dump your config file using "od -ha" and verify that there are not any non-printing characters at the start of it (a BOM, for example).
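For example, run it against the file that the pipeline actually loads (path taken from your pipelines.yml; a UTF-8 BOM is the byte sequence EF BB BF at the very start of the file):

od -ha /etc/logstash/conf.d/digitalIngest.conf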

@Badger thank you for the reply. I dumped the file and this is the result; as you can see, everything seems to be fine. However, Logstash still can't read it properly.

[root@elk logstash]# od -ha pipelines.yml
0000000    202d    6970    6570    696c    656e    692e    3a64    6d20
          -  sp   p   i   p   e   l   i   n   e   .   i   d   :  sp   m
0000020    7465    6972    6263    6165    2d74    7264    706f    656c
          e   t   r   i   c   b   e   a   t   -   d   r   o   p   l   e
0000040    0a74    2020    6170    6874    632e    6e6f    6966    3a67
          t  nl  sp  sp   p   a   t   h   .   c   o   n   f   i   g   :
0000060    2220    652f    6374    6c2f    676f    7473    7361    2f68
         sp   "   /   e   t   c   /   l   o   g   s   t   a   s   h   /
0000100    6f63    666e    642e    642f    6769    7469    6c61    6e49
          c   o   n   f   .   d   /   d   i   g   i   t   a   l   I   n
0000120    6567    7473    632e    6e6f    2266    000a
          g   e   s   t   .   c   o   n   f   "  nl
0000133
[root@elk logstash]#

Do you by any chance have a basic config file, so I can just copy-paste and replace the path? :smiley:

The problem is not in pipelines.yml; it is in digitalIngest.conf.
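Another quick way to spot stray bytes in that file (GNU cat; non-printing characters show up in ^/M- notation and each line end is marked with a $):

cat -A /etc/logstash/conf.d/digitalIngest.conf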

Just a helpful tip:
The error says it expected one of #, input, filter, or output. So can you make sure that the conf file you pasted is actually the file named in the pipeline config?
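A quick way to check both sides (paths taken from your pipelines.yml; adjust if yours differ):

ls -l /etc/logstash/conf.d/
grep path.config /etc/logstash/pipelines.yml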

Since you are not using Logstash to do any filtering, would it not be easier to send right into Elasticsearch from Metricbeat? (I am assuming you are using Metricbeat.)
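If you go that route, the relevant part of metricbeat.yml is roughly this minimal sketch (the host and port are placeholders for wherever your Elasticsearch is reachable, and the output.logstash section would have to be disabled, since Beats only allow one output at a time):

# metricbeat.yml (sketch; host/port are assumptions)
output.elasticsearch:
  hosts: ["elk.nipne.ro:9200"]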

But as I said, the configuration file works just fine if I run it from the Logstash binary in /usr/share/logstash/bin.
So I would assume it is written OK.

I'm trying to get a basic Logstash file working first; then I will start to use a filter, once I can make it run as a daemon. Since I can't even make it run without filters, I saw no reason to put filters in as of right now.
