Logstash ConfigurationError

Hello,

When I run Logstash with the configuration file below, I get the errors that follow it.
Does anyone have a solution?

Best regards,

/etc/logstash/conf.d/logstash-syslog.conf

input {
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://192.168.1.30:9200", "https://192.168.1.40:9200", "https://192.168.1.50:9200"]
    ssl => true
    ssl_certificate_verification => true
    keystore => /etc/logstash/certs/logstash1.p12
    keystore_password => "${KEY_PWD}"
    truststore => /etc/logstash/certs/logstash1.p12
    truststore_password => "${TRUST_PWD}"
    api_key => "66GKX9GYT36ziqNjXv3vvw"
  }
}

Logstash log output:

[2022-11-17T16:47:01,992][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-11-17T16:47:09,871][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-11-17T16:47:12,701][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", [A-Za-z0-9_-], '\"', \"'\", [A-Za-z_], \"-\", [0-9], \"[\", \"{\" at line 30, column 17 (byte 747) after output {\n  elasticsearch {\n    hosts => [\"https://192.168.1.30:9200\", \"https://192.168.1.40:9200\", \"https://192.168.1.50:9200\"]\n    ssl => true\n    ssl_certificate_verification => true\n    keystore => ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:210:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:48:in `initialize'", "org/jruby/RubyClass.java:911:in `new'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:50:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:381:in `block in converge_state'"]}
[2022-11-17T16:47:13,058][INFO ][logstash.runner          ] Logstash shut down.
[2022-11-17T16:47:13,079][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:790) ~[jruby.jar:?]
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:753) ~[jruby.jar:?]
        at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:91) ~[?:?]

Try putting the keystore and truststore paths in double quotes, then run a config test.
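For reference, string values such as file paths must be quoted in the Logstash config language; the unquoted `/` in the path is what trips the parser here. The two lines would become:

```
    keystore => "/etc/logstash/certs/logstash1.p12"
    truststore => "/etc/logstash/certs/logstash1.p12"
```

The config can then be checked without starting the pipeline: `/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash-syslog.conf --config.test_and_exit`.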

Thank you @sai_kiran1, that fixed the parsing error.

I've got this error now:

[2022-11-18T09:39:12,383][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://192.168.1.30:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :message=>"Got response code '401' contacting Elasticsearch at URL 'https://192.168.1.30:9200/'"}
[2022-11-18T09:39:12,398][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://192.168.1.40:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :message=>"Got response code '401' contacting Elasticsearch at URL 'https://192.168.1.40:9200/'"}
[2022-11-18T09:39:12,414][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://192.168.1.50:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :message=>"Got response code '401' contacting Elasticsearch at URL 'https://192.168.1.50:9200/'"}

An authentication problem?
I created the API key with this command from a master node:

curl -X POST --cacert "./elastic-stack-ca.crt" --cert "./node-1.crt" --key "./node-1.key" "https://192.168.1.10:9200/_security/api_key?pretty" -u elastic -H 'Content-Type: application/json' -d'
{
  "name": "logstash_host001", 
  "role_descriptors": {
    "logstash_writer": { 
      "cluster": ["monitor", "manage_ilm", "read_ilm"],
      "index": [
        {
          "names": ["logstash-*"],
          "privileges": ["view_index_metadata", "create_doc"]
        }
      ]
    }
  }
}
'

My mistake: I had put only the id of the API key in the api_key setting, instead of the id:api_key pair.
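For reference, the `api_key` option of the elasticsearch output expects the key's id and secret joined by a colon; on the wire, Elasticsearch receives this pair base64-encoded in an `Authorization: ApiKey ...` header. A sketch (the id below is the one from the log above, the secret is a placeholder, not a real key):

```python
import base64

# Hypothetical values for illustration. Use the "id" and "api_key"
# fields returned by the POST _security/api_key call.
key_id = "Db8XioQBml7AGtpq8v1N"  # id (this one appears in the 403 log)
key_secret = "example-secret"    # placeholder, NOT a real key

# The Logstash elasticsearch output expects the joined pair:
logstash_api_key = f"{key_id}:{key_secret}"

# On the wire, Elasticsearch receives the pair base64-encoded:
auth_header = "ApiKey " + base64.b64encode(logstash_api_key.encode("utf-8")).decode("ascii")

print(logstash_api_key)  # value to put in api_key => "..."
print(auth_header)
```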

Now it seems that the API key does not allow me to create an index:
action [indices:admin/auto_create] is unauthorized for API key id [Db8XioQBml7AGtpq8v1N] of user [elastic], this action is granted by the index privileges [auto_configure,create_index,manage,all]"}}

[2022-11-18T14:25:13,992][INFO ][logstash.outputs.elasticsearch][main][e8b005d4a7ee170090677c839c32fa98a5385a0ae7de03cb7f8cb5e018f712c6] Retrying failed action {:status=>403, :action=>["create", {:_id=>nil, :_index=>"logs-generic-default", :routing=>nil}, {"type"=>"syslog", "syslog_pid"=>"774", "syslog_timestamp"=>"Nov 18 14:24:41", "syslog_hostname"=>"syslog-server", "@version"=>"1", "received_from"=>"{\"ip\":\"192.168.1.110\"}", "syslog_message"=>"Unregistered Authentication Agent for unix-process:1713:1793129 (system bus name :1.37, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale fr_FR.UTF-8) (disconnected from bus)", "@timestamp"=>2022-11-18T13:24:41.000Z, "syslog_program"=>"polkitd", "message"=>"<85>Nov 18 14:24:41 syslog-server polkitd[774]: Unregistered Authentication Agent for unix-process:1713:1793129 (system bus name :1.37, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale fr_FR.UTF-8) (disconnected from bus)", "received_at"=>"2022-11-18T13:24:42.054523607Z", "host"=>{"ip"=>"192.168.1.110"}, "event"=>{"original"=>"<85>Nov 18 14:24:41 syslog-server polkitd[774]: Unregistered Authentication Agent for unix-process:1713:1793129 (system bus name :1.37, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale fr_FR.UTF-8) (disconnected from bus)"}, "data_stream"=>{"type"=>"logs", "dataset"=>"generic", "namespace"=>"default"}}], :error=>{"type"=>"security_exception", "reason"=>"action [indices:admin/auto_create] is unauthorized for API key id [D7_lioQBml7AGtpqPP0i] of user [elastic] on indices [logs-generic-default], this action is granted by the index privileges [auto_configure,create_index,manage,all]"}}

Logstash 8 writes to the logs-generic-default data stream by default, while my key only covered logstash-*, so I created a new API key with privileges on logs-*:

curl -X POST --cacert "./elastic-stack-ca.crt" --cert "./node-1.crt" --key "./node-1.key" "https://192.168.1.10:9200/_security/api_key?pretty" -u elastic -H 'Content-Type: application/json' -d'
{
  "name": "logstash1", 
  "role_descriptors": {
    "logstash_writer": { 
      "cluster": ["all"],
      "index": [
        {
          "names": ["logs-*"],
          "privileges": ["auto_configure", "create_index", "manage", "all"]
        }
      ]
    }
  }
}
'
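As a side note (an assumption about what the pipeline needs, not something tested in this thread): `all` already includes the other index privileges, and `cluster: ["all"]` is far broader than a writer needs. A narrower role descriptor that should still let Logstash auto-create and write to the logs-* data streams would look roughly like:

```json
{
  "name": "logstash1",
  "role_descriptors": {
    "logstash_writer": {
      "cluster": ["monitor"],
      "index": [
        {
          "names": ["logs-*"],
          "privileges": ["auto_configure", "create_doc", "view_index_metadata"]
        }
      ]
    }
  }
}
```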