How to run a configuration file along with Logstash in RHEL?

Hi ELK team,
I have successfully installed Logstash in RHEL. The issue is that we want to run the config file placed under the config.d folder when we run "sudo initctl start logstash", but it is not reading the config files placed under that folder. Or, how do we need to run the "logstash -f simple.conf" command in RHEL so that Logstash starts and runs the config file at the same time?

Thanks in advance. Any reference on this would be a great help!

What config.d directory are you talking about, /etc/logstash/conf.d? With what arguments is Logstash started (check with ps aux | grep logstash)? What's in logstash.yml?

Hi Magnusbaeck,

Yes, I'm talking about the /etc/logstash/conf.d directory. My logstash.yml looks like this:

Only the following lines are active in the logstash.yml file:

path.data: /var/lib/logstash

path.config: /etc/logstash/conf.d/*.conf

path.logs: /var/log/logstash


In RHEL I'm starting Logstash using the command "sudo initctl start logstash".

This command should also execute the config files placed in the /etc/logstash/conf.d directory, right?

Thanks,
Naveena K N

This command should also execute the config files placed in the /etc/logstash/conf.d directory, right?

Yes. How do you know the files aren't read? If you bump up the log level, Logstash will log information about all configuration that's loaded.
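For example, a minimal sketch of that change, assuming the default /etc/logstash/logstash.yml is being edited:

log.level: debug

Then restart the Logstash service so the setting takes effect.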

I'm placing a simple.conf file inside /etc/logstash/conf.d, as follows:
input { stdin { } }
output {
stdout { codec => rubydebug } }

So after executing sudo initctl start logstash, it should ask for input from the shell, right? It is not taking any input!

Thanks,
Naveena K N

No, it won't ask for input from the shell. When you start Logstash with initctl it'll run in the background and the stdin input won't be usable.
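If you want the background service to pick up data on its own, an input that doesn't need a terminal is easier to work with. A minimal sketch using the file input (the path is only an example):

input {
  file {
    path => "/var/log/messages"       # must be an absolute path readable by the logstash user
    start_position => "beginning"     # read the file from the start on first run
  }
}
output {
  stdout { codec => rubydebug }       # when run as a service, stdout goes wherever the init system captures process output
}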

OK, then I will try feeding data to ELK.
Thanks

This is the error I'm getting in the /var/log/logstash/logstash-plain.log file:

[2018-02-26T12:43:06,967][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-02-26T12:43:06,990][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-02-26T12:43:07,937][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-02-26T12:43:08,394][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.2"}
[2018-02-26T12:43:08,622][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 3, column 5 (byte 14) after input {\n\nssl ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2018-02-26T12:43:08,619][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Why am I getting this error? Is any configuration missing?

Thanks in advance

There's a syntax problem in one of the configuration files. The error message indicates approximately where.
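Judging from the "after input {\n\nssl" part of the message, it looks like an option such as ssl is sitting directly under input {} where a plugin block is expected. A rough sketch of the difference (the beats plugin and its option values are just examples, not necessarily what your file should contain):

# Broken: a bare option directly under input is not valid
input {
  ssl => true
}

# Valid: options live inside a plugin block
input {
  beats {
    port => 5044
    ssl  => false    # enabling ssl here would also require ssl_certificate and ssl_key
  }
}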

Hi @magnusbaeck
I tried placing a config file under the /etc/logstash/conf.d directory and changed the log mode to debug.
Logstash is working fine, but it is not able to create an index from the server log it takes as input.

The following are the relevant lines from the Logstash log file:
Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3c05e288@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}

_globbed_files: /home/nakn/logfile/server.log: glob is: []

Pipelines running {:count=>1, :pipelines=>["main"]}

collector name {:name=>"ParNew"}
collector name {:name=>"ConcurrentMarkSweep"}
Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x3c05e288@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}

collector name {:name=>"ParNew"}
collector name {:name=>"ConcurrentMarkSweep"}

....continues
Please help me resolve this issue of pushing data to ELK!

Thanks
Naveena K N

_globbed_files: /home/nakn/logfile/server.log: glob is:

This indicates a typo in the path or that Logstash doesn't have permissions to read the file or any of the directories leading up to it.

Thanks @magnusbaeck, I'm able to resolve the issue now. The index is getting created now.

Naveena K N

Hi Elastic experts, hi @magnusbaeck,

We are trying to send server log data from the file system and error log data from a database, so we placed config files in /etc/logstash/conf.d/. For the file system we are using the file input plugin and for the database we are using the jdbc input plugin. When we start Logstash using the command "sudo /usr/bin/systemctl start logstash" it works, but the data in the indexes that get created is not correct: the config file that needs to read from the database is also reading from the file system path specified in the other config files.

Why is it behaving like that? Do we need to make any changes in logstash.yml or pipelines.yml for our scenario to work properly?

All configuration files in /etc/logstash/conf.d will be concatenated. If you want to have isolated event streams you need to use conditionals or switch to using multiple pipelines.

This is an extremely frequently asked question so please excuse my brevity.
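As an illustration of the conditional approach, a minimal sketch where each input tags its events with a type and the outputs route on it (all paths, connection details and index names are placeholders):

input {
  file {
    path => "/path/to/server.log"
    type => "serverlog"
  }
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"   # placeholder connection details
    jdbc_user => "logstash"
    jdbc_driver_library => "/path/to/driver.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM error_log"
    type => "errorlog"
  }
}
output {
  if [type] == "serverlog" {
    elasticsearch { index => "serverlog-%{+YYYY.MM.dd}" }
  } else if [type] == "errorlog" {
    elasticsearch { index => "errorlog-%{+YYYY.MM.dd}" }
  }
}

With conditionals like these the two event streams stay separate even though all files in conf.d are concatenated into one pipeline.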

Thanks @magnusbaeck for the suggestion of using multiple pipelines, but how do we configure that?

Thanks,
Naveena K N

Have you read the documentation about that feature?

https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

So, according to that documentation, we need to create different pipelines with unique ids, and in the path we need to specify which config file to select, right?

Thanks,
Naveena K N

For example:

- pipeline.id: my-pipeline_1
  path.config: "/etc/path/to/mon.config"
  pipeline.workers: 2
- pipeline.id: my-pipeline_2
  path.config: "/etc/different/path/sl1a.cfg"
- pipeline.id: my-pipeline_3
  path.config: "/etc/different/path/sl2a.cfg"

Is this what you are suggesting?

What does "workers" mean?

So, according to that documentation, we need to create different pipelines with unique ids, and in the path we need to specify which config file to select, right?

Yes.
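Applied to the scenario above, a sketch of what /etc/logstash/pipelines.yml could look like (the pipeline ids and config file names are hypothetical):

- pipeline.id: serverlog
  path.config: "/etc/logstash/conf.d/serverlog.conf"
- pipeline.id: errorlog-db
  path.config: "/etc/logstash/conf.d/database.conf"

Each pipeline then only loads its own config file, so the jdbc events no longer mix with the file input events. Depending on your setup you may also want to remove the path.config line from logstash.yml so it is clear the pipeline definitions come from pipelines.yml.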

What does "workers" mean?

Have you looked in the documentation?

Thanks @magnusbaeck, I hope the problem will be resolved now.

According to the documentation, I think workers come into the picture when we consider CPU utilization, right?