Hi Team,
I am trying to set up the Logstash keystore but am having a hard time. I have referred to the link below, but it is still not working (the keystore password is not being created as described there).
I am creating the keystore as follows:
echo y | /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create
/bin/cat /root/es_password | /usr/share/logstash/bin/logstash-keystore add --path.settings /etc/logstash es_pwd --stdin
chown logstash:root /etc/logstash/logstash.keystore
chmod 0600 /etc/logstash/logstash.keystore
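For the keystore password part, my understanding (from the Logstash docs; please correct me if I am wrong) is that a password is only applied when LOGSTASH_KEYSTORE_PASS is exported before the keystore is created, and the same variable must also be visible to the Logstash service afterwards. A sketch of what I mean (the password value here is a placeholder):

```shell
# Assumption: LOGSTASH_KEYSTORE_PASS is how the keystore gets its password;
# it must also be present in the service environment at runtime.
set +o history                              # keep the password out of shell history
export LOGSTASH_KEYSTORE_PASS=mypassword    # placeholder password
set -o history
/usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create
```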
The keystore file:
[root@ip-10-10-10-242 ~]# ls -l /etc/logstash/logstash.keystore
-rw-r--r--. 1 logstash logstash 716 Sep 19 20:50 /etc/logstash/logstash.keystore
[root@ip-10-10-10-242 ~]#
The key is listed:
[root@ip-10-10-10-242 ~]# /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash/ list
Using JAVA_HOME defined java: /opt/jre1.8.0_221
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-1.17.3/lib/bundler/rubygems_integration.rb:200: warning: constant Gem::ConfigMap is deprecated
es_pwd
[root@ip-10-10-10-242 ~]#
I have the following /etc/logstash/conf.d/logstash.conf file:
i) -rw-r--r--. 1 logstash logstash 8316 Sep 19 20:11 /etc/logstash/conf.d/logstash.conf
input {
  beats {
    port => 5044
  }
}
filter {
  if [log_type] == "access_server" and [app_id] == "pa" {
    grok { match => { "message" => "%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:%{MINUTE}(?::?%{SECOND})\| %{USERNAME:exchangeId}\| %{DATA:trackingId}\| %{NUMBER:RoundTrip:int}%{SPACE}ms\| %{NUMBER:ProxyRoundTrip:int}%{SPACE}ms\| %{NUMBER:UserInfoRoundTrip:int}%{SPACE}ms\| %{DATA:Resource}\| %{DATA:subject}\| %{DATA:authmech}\| %{DATA:scopes}\| %{IPV4:Client}\| %{WORD:method}\| %{DATA:Request_URI}\| %{INT:response_code}\| %{DATA:failedRuleType}\| %{DATA:failedRuleName}\| %{DATA:APP_Name}\| %{DATA:Resource_Name}\| %{DATA:Path_Prefix}" } }
    mutate {
      replace => {
        "[type]" => "access_server"
      }
    }
  }
}
output {
  if [log_type] == "access_server" {
    elasticsearch {
      hosts => ['http://10.10.10.242:9200']
      user => elastic
      password => "${es_pwd}"
      index => "access"
      template => "/root/access_template.json"
      template_name => "access"
      template_overwrite => "false"
    }
  }
  elasticsearch {
    hosts => ['http://10.10.10.242:9200']
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM}"
    user => elastic
    password => "${es_pwd}"
  }
}
The index template file used in the setting above has been copied to /root with logstash ownership:
[root@ip-10-10-10-242 ~]# ls -l /root/access_template.json
-rw-r--r--. 1 logstash logstash 840 Sep 19 15:42 /root/access_template.json
[root@ip-10-10-10-242 ~]#
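One thing I am not sure about: even though the file itself is owned by logstash, /root is normally mode 0700, so the logstash service user may not be able to traverse into the directory at all. A quick check of that assumption:

```shell
# Check whether the logstash service user can actually open the template.
# /root is typically 0700, so file ownership alone is not enough - the
# directory itself must be traversable by the logstash user.
sudo -u logstash head -c 1 /root/access_template.json >/dev/null \
  && echo "template is readable by logstash" \
  || echo "template is NOT readable (directory permissions likely block it)"
```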
ii) The /etc/logstash/logstash.yml file:
-rw-r--r--. 1 logstash logstash 8220 Sep 19 20:11 /etc/logstash/logstash.yml
node.name: logstash_1
path.data: /var/lib/logstash
path.logs: /var/log/logstash
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: ${es_pwd}
xpack.monitoring.elasticsearch.hosts: ['http://10.10.10.242:9200']
After the above configuration, the Logstash service keeps restarting after some time, and hence no data has been indexed yet. I am getting the error below in /var/log/logstash/logstash-plain.log:
[2021-09-19T21:30:29,805][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-09-19T21:30:31,740][INFO ][org.reflections.Reflections] Reflections took 232 ms to scan 1 urls, producing 120 keys and 417 values
[2021-09-19T21:30:33,752][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://10.10.10.242:9200"]}
[2021-09-19T21:30:33,858][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@10.10.10.242:9200/]}}
[2021-09-19T21:30:33,910][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@10.10.10.242:9200/"}
[2021-09-19T21:30:34,033][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)
[2021-09-19T21:30:34,036][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)
[2021-09-19T21:30:34,107][WARN ][logstash.javapipeline ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2021-09-19T21:30:34,299][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0xd5430b4 run>"}
[2021-09-19T21:30:36,829][ERROR][logstash.outputs.elasticsearch] Invalid setting for elasticsearch output plugin:
output {
elasticsearch {
# This setting must be a path
# File does not exist or cannot be opened /root/access_template.json
template => "/root/access_template.json"
...
}
}
[2021-09-19T21:30:36,849][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:119)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:86)", "org.jruby.runtime.Block.call(Block.java:139)", "org.jruby.RubyProc.call(RubyProc.java:318)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.base/java.lang.Thread.run(Thread.java:829)"]}
[2021-09-19T21:30:36,990][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::IllegalStateException` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:135:in `create'", "org/logstash/execution/ConvergeResultExt.java:60:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:404:in `block in converge_state'"]}
[2021-09-19T21:30:36,992][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>2.68}
[2021-09-19T21:30:37,016][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.19.0.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.19.0.jar:?]
at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]
Testing the Logstash pipeline config gives:
[FATAL] 2021-09-19 21:39:55.737 [LogStash::Runner] runner - The given configuration is invalid. Reason: Unable to configure plugins: Cannot evaluate `${es_pwd}`. Replacement variable `es_pwd` is not defined in a Logstash secret store or an environment entry and there is no default value given.
[FATAL] 2021-09-19 21:39:55.758 [LogStash::Runner] Logstash - Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.19.0.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.19.0.jar:?]
at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]
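Could the test failure simply be that the test run does not see the keystore? My assumption is that Logstash looks for logstash.keystore under whatever --path.settings points at (default /usr/share/logstash/config), so a config test may need the same flag that was used when creating the keystore:

```shell
# Assumption: the keystore is resolved relative to --path.settings, so a
# config test without that flag would look in /usr/share/logstash/config
# and never find /etc/logstash/logstash.keystore.
/usr/share/logstash/bin/logstash --path.settings /etc/logstash \
  --config.test_and_exit \
  -f /etc/logstash/conf.d/logstash.conf
```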
Even after creating the keystore and adding the key to it, es_pwd is not recognised. With this setup the Logstash service keeps restarting and the index access-000001 is not created.
Thanks,