How to send an application's (systemd service) logs to Logstash using Journalbeat?

My application/service is named 'appdb' and runs as a systemd service. I can view its logs with the shell command 'journalctl -u appdb.service'. I want to send these logs to Logstash. I am using the following software versions:
Elasticsearch 6.3.0 (IP: 10.111.151.149)
journalbeat-7.5.0-1.x86_64 to send logs from the application host (IP: 10.111.151.118) to Logstash-6.3.0 (IP: 10.111.151.66), which has logstash-input-journald-2.0.2 installed.
Logstash-6.3.0 is able to reach Elasticsearch-6.3.0 (10.111.151.149:9200)
journalbeat-7.5.0-1.x86_64 (on 10.111.151.118) is NOT able to reach Logstash-6.3.0 (10.111.151.66:5044) with the following configuration. Please help.

$ cat /etc/journalbeat/journalbeat.yml
journalbeat.inputs:
- paths: []
  include_matches:
    - "systemd.unit=appdb"

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

fields:
  env: staging

output.logstash:
  hosts: ["10.111.151.66:5044"]

logging.level: debug
logging.selectors: ["*"]
$ cat logstash.conf
input {
  journald {
    lowercase => true
    seekto => "head"
    thisboot => true
    type => "systemd"
    tags => ["influxdb"]
  }
}

output {
  elasticsearch {
    hosts => ["http://10.111.151.149:9200"]
    manage_template => false
  }
}
$ systemctl status logstash630
● logstash630.service - Logstash v6.3.0 Service
Loaded: loaded (/usr/lib/systemd/system/logstash630.service; enabled; vendor preset: disabled)
Active: active (running) since Tue 2019-12-10 16:56:20 UTC; 53s ago
Main PID: 3070 (java)
Tasks: 21
Memory: 651.8M
CGroup: /system.slice/logstash630.service
└─3070 /bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOc...

Dec 10 16:56:20 PreProd_Logstash systemd[1]: Started (PreProd) Logstash v6.3.0 Service.
$ tail -f logstash-plain.log
[2019-12-10T16:57:19,448][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Systemd::JournalError: No such file or directory>, :backtrace=>["/home/centos/logstash-6.3.0/vendor/bundle/jruby/2.3.0/gems/systemd-journal-1.2.3/lib/systemd/journal.rb:52:in initialize'", "/home/centos/logstash-6.3.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-journald-2.0.2/lib/logstash/inputs/journald.rb:67:in register'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:340:in register_plugin'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:351:in block in register_plugins'", "org/jruby/RubyArray.java:1734:in each'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:351:in register_plugins'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:498:in start_inputs'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:392:in start_workers'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:288:in run'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:248:in block in start'"], :thread=>"#<Thread:0x6fa01436 run>"}
[2019-12-10T16:57:19,528][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2019-12-10T16:57:20,239][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-12-10T16:58:13,943][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-12-10T16:58:16,310][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.0"}
[2019-12-10T16:58:25,224][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-12-10T16:58:25,617][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::Journald lowercase=>true, seekto=>"head", thisboot=>true, type=>"systemd", tags=>["influxdb"], id=>"b3fff89f140a734538e00984c983e881203be97ee59faa0013b15bb25530c48c", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_00a87d0d-2a19-4571-a19b-cf87cd3d4ac0", enable_metric=>true, charset=>"UTF-8">, threads=>1, flags=>0, path=>"/var/log/journal", sincedb_write_interval=>15, wait_timeout=>3000000>", :error=>"No such file or directory", :thread=>"#<Thread:0x63a50cd3 run>"}
[2019-12-10T16:58:25,818][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Systemd::JournalError: No such file or directory>, :backtrace=>["/home/centos/logstash-6.3.0/vendor/bundle/jruby/2.3.0/gems/systemd-journal-1.2.3/lib/systemd/journal.rb:52:in initialize'", "/home/centos/logstash-6.3.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-journald-2.0.2/lib/logstash/inputs/journald.rb:67:in register'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:340:in register_plugin'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:351:in block in register_plugins'", "org/jruby/RubyArray.java:1734:in each'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:351:in register_plugins'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:498:in start_inputs'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:392:in start_workers'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:288:in run'", "/home/centos/logstash-6.3.0/logstash-core/lib/logstash/pipeline.rb:248:in block in start'"], :thread=>"#<Thread:0x63a50cd3 run>"}
[2019-12-10T16:58:25,905][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

^C
$

Hi @arblr,

I can see two possible problems so far.

First, the versions. You mention that you are using Journalbeat 7.5.0 with Logstash and Elasticsearch 6.3.0. This might work, but it is not supported; you can find the compatibility matrix here: https://www.elastic.co/support/matrix#matrix_compatibility
You would need to use Journalbeat 6.x, or upgrade your Stack.

Second, it seems that you are using both Journalbeat and a journald input in Logstash. Why are you doing that? In principle you only need to collect the journald logs once, either with Journalbeat or with the Logstash input, but not with both. The trace seems to indicate that the journald plugin for Logstash is not able to open the journal file.
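As a sketch, if you keep Journalbeat, an input scoped to your unit could look like the following. Note that the full unit name appdb.service is an assumption on my side (it matches your journalctl -u appdb.service command); whether include_matches needs the .service suffix is worth verifying against the Journalbeat documentation for your version.

journalbeat.inputs:
- paths: []
  # Only collect entries from the appdb systemd unit
  include_matches:
    - "systemd.unit=appdb.service"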
In the Logstash configuration I cannot see any beats input configured. If you want Journalbeat to send events through Logstash, a beats input should be configured there.
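A minimal sketch of such a pipeline, keeping your existing Elasticsearch output, could be:

input {
  beats {
    # Listen for Beats (Journalbeat) connections on the port configured in journalbeat.yml
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://10.111.151.149:9200"]
    manage_template => false
  }
}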

Also, take into account that if you are not doing any additional processing in Logstash, this piece is not needed: you can configure Elasticsearch directly as an output in Journalbeat.
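For example, replacing the output.logstash section in journalbeat.yml with something like:

output.elasticsearch:
  # Ship events straight to Elasticsearch, bypassing Logstash
  hosts: ["10.111.151.149:9200"]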

Hello jsoriano,
Thanks a lot for your reply and for your suggestions to fix the problems. Let me update the Logstash configuration file and test again.

Regards,
Ravi