So I have to achieve a Logstash setup where I have two (or maybe even more) log files, each with its own input and filters (output will always be sent to Elastic and then visualized in Kibana).
I saw this is very doable with Logstash's multiple-pipelines feature, configured in pipelines.yml.
I've done this for two logs (two .conf files in the default config folder), and the pipeline setup looks like this:
```yaml
- pipeline.id: filter1
  path.config: "/etc/logstash/conf.d/log_conf.conf"
  pipeline.workers: 3
- pipeline.id: filter2
  path.config: "/etc/logstash/conf.d/log_conf2.conf"
  queue.type: persisted
```
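For context, each of the two config files follows the usual Logstash input/filter/output structure. A minimal sketch of what such a file looks like (the log path, hosts, and index name below are placeholders, not my actual values):

```conf
# Sketch of a pipeline config such as log_conf.conf.
# Path, hosts, and index name are placeholders.
input {
  file {
    path => "/path/to/some.log"      # placeholder
    start_position => "beginning"
    mode => "tail"
  }
}

filter {
  # per-log parsing goes here (e.g. grok/mutate)
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # placeholder
    index => "example-index"            # placeholder
  }
}
```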
I'm running my development project on a CentOS 7 machine at work. I tested this pipeline setup by going to
/usr/share/logstash/ and running
./bin/logstash to see if it works. The
pipelines.yml file is in the correct folder, and so are the config files. Everything works fine: the data is sent from Logstash to Elasticsearch, and I can see it in Kibana.
But here comes the problem: I want my ELK setup running all the time, so I want to run
systemctl start logstash.service and let the Logstash pipeline work in the background, without having to start it from the binaries. However, once the service starts, no more data reaches Kibana; it's as if the logs are not being sent to Elasticsearch at all.
I checked the service logs with
systemctl status logstash.service, and here is what I got:
```
Oct 23 18:14:52 elk.nipne.ro logstash: Pipeline_id:filter2
Oct 23 18:14:52 elk.nipne.ro logstash: Plugin: <LogStash::Inputs::File start_position=>"beginning", path=>["/home/robert.poenaru/elk/arc_slurm_jobs.txt"], id=>"0c16b9ff1fe1aaca45b6072f213460ef2e62993c22d521cccd05ebe4e4d66e1b", sincedb_path=>"NULL", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_7cca3f5f-6719-4396-b596-bf12ac253b57", enable_metric=>true, charset=>"UTF-8">, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, delimiter=>"\n", close_older=>3600.0, mode=>"tail", file_completed_action=>"delete", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_by=>"last_modified", file_sort_direction=>"asc">
Oct 23 18:14:52 elk.nipne.ro logstash: Error: Permission denied - NULL
Oct 23 18:14:52 elk.nipne.ro logstash: Exception: Errno::EACCES
Oct 23 18:14:52 elk.nipne.ro logstash: Stack: org/jruby/RubyIO.java:1237:in `sysopen'
Oct 23 18:14:52 elk.nipne.ro logstash: org/jruby/RubyFile.java:367:in `initialize'
Oct 23 18:14:52 elk.nipne.ro logstash: org/jruby/RubyIO.java:1156:in `open'
Oct 23 18:14:52 elk.nipne.ro logstash: uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:1136:in `block in touch'
Oct 23 18:14:52 elk.nipne.ro logstash: org/jruby/RubyArray.java:1800:in `each'
Oct 23 18:14:52 elk.nipne.ro logstash: uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:1130:in `touch'
```
Why is there a permission problem? I run the ELK stack with sudo rights. Also, the structure of the Logstash directory under /etc/ is:
```
[root@elk logstash]# pwd
/etc/logstash
[root@elk logstash]# tree -h
.
├── [  49]  conf.d   (two simple configs for reading logs from two files)
│   ├── [ 503]  log_conf2.conf
│   └── [ 502]  log_conf.conf
├── [2.0K]  jvm.options
├── [4.9K]  log4j2.properties
├── [ 342]  logstash-sample.conf
├── [8.1K]  logstash.yml
├── [ 578]  pipelines.yml   (contents given above)
└── [1.7K]  startup.options
```
Any ideas what the issue is? How can I make the pipeline work via the logstash.service process instead of running it from the binaries?
Thank you in advance.