Http_poller

Hi there. I'm having trouble with the http_poller input plugin,

found here: Http_poller input plugin | Logstash Reference [8.13] | Elastic

I have tried using a keystore and truststore, converting the .pem files to .p12 and importing them.

Reference: IBM Documentation

and here: ssl - How do I import a PKCS12 certificate into a java keystore? - Stack Overflow

I have made a random-user generator API; the nginx server in front of it loads signed certs from my certificate authority, and it works in Firefox.

Now I am stuck writing the http_poller config. I thought it was a .jks thing, though I understand this plugin can take PEM certs directly via ssl_certificate. Can someone help please? Here is my pipeline.

input {
  http_poller {
    urls => {
      my_url => {
        method => get
        url => "https://www.webserver.co.uk/random-person"
        headers => {
          "Content-Type" => "application/json"
        }
        ssl_supported_protocols = "TLSv1.3"
        ssl_cipher_suites => "TLS_AES_256_GCM_SHA384"
        ssl_certificate_authorities => "/home/administrator/Documents/ca3/ca.cert.pem"
        ssl_certificate => "/home/administrator/Documents/ca3/server.cert.pem"
        ssl_key => "/home/administrator/Documents/ca3/server.unenc.key.pem"
        #keystore => "/home/administrator/elk/logstash-8.7.1/config/keystore.jks"
        #truststore_password => "password"
        #keystore_password => "password"
        request_timeout => 60
        schedule => { every => "10s" }
      }
    }
    request_timeout => 60
    # Other input configurations...
  }
}

filter {
  # Add your filters here if needed
}

output {
  stdout {
    codec => rubydebug
  }
}
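
(A note on the config as posted: ssl_supported_protocols = "TLSv1.3" uses = where the Logstash config syntax needs =>, and per the 8.13 docs the ssl_* options, request_timeout and schedule sit at the plugin level rather than inside the per-URL hash. A cleaned-up skeleton, untested and with the same paths assumed, would presumably look closer to this:

input {
  http_poller {
    urls => {
      my_url => {
        method => get
        url => "https://www.webserver.co.uk/random-person"
        headers => { "Content-Type" => "application/json" }
      }
    }
    # plugin-level TLS settings (list-type options written as arrays)
    ssl_supported_protocols => ["TLSv1.3"]
    ssl_cipher_suites => ["TLS_AES_256_GCM_SHA384"]
    ssl_certificate_authorities => ["/home/administrator/Documents/ca3/ca.cert.pem"]
    # client cert + key, only needed if the server demands mutual TLS
    ssl_certificate => "/home/administrator/Documents/ca3/server.cert.pem"
    ssl_key => "/home/administrator/Documents/ca3/server.unenc.key.pem"
    request_timeout => 60
    schedule => { every => "10s" }
  }
}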

And what happens when you try to run the pipeline? Does Logstash run? Does the pipeline start? Do you get errors? What do you want help fixing?

Hi, Badger. Really appreciate your interest.

When the pipeline runs, not a lot happens.

Ignore any domain names containing "splunk"; that is just old config left over from cert-generation scripts at previous jobs.

I am trying to poll a JSON API whose response looks like this:

{ "age": 31, "name": "Emma" }
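
For anyone following along, the endpoint and CA chain can be sanity-checked outside Logstash with a plain curl call against the same URL (same CA PEM as in the pipeline; adjust the host if yours differs):

curl --cacert /home/administrator/Documents/ca3/ca.cert.pem -H "Accept: application/json" https://www.webserver.co.uk/random-person

If that prints the small JSON body above, the certificate chain itself should be fine and the problem is likely on the Logstash side.
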
(base) administrator@ubuntu:~/elk/logstash-8.7.1/config$ sudo ../bin/logstash -f /home/administrator/elk/logstash-8.7.1/config/test.yml --log.level debug
Using bundled JDK: /home/administrator/elk/logstash-8.7.1/jdk
Sending Logstash logs to /home/administrator/elk/logstash-8.7.1/logs which is now configured via log4j2.properties
[2024-05-06T05:55:43,175][INFO ][logstash.runner          ] Log4j configuration path used is: /home/administrator/elk/logstash-8.7.1/config/log4j2.properties
[2024-05-06T05:55:43,179][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.7.1", "jruby.version"=>"jruby 9.3.10.0 (2.6.8) 2023-02-01 107b2e6697 OpenJDK 64-Bit Server VM 17.0.7+7 on 17.0.7+7 +indy +jit [x86_64-linux]"}
[2024-05-06T05:55:43,181][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2024-05-06T05:55:43,182][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"/home/administrator/elk/logstash-8.7.1/modules/fb_apache/configuration"}
[2024-05-06T05:55:43,182][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x8dd572e @directory="/home/administrator/elk/logstash-8.7.1/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2024-05-06T05:55:43,183][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"/home/administrator/elk/logstash-8.7.1/modules/netflow/configuration"}
[2024-05-06T05:55:43,183][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x3e908a0b @directory="/home/administrator/elk/logstash-8.7.1/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2024-05-06T05:55:43,192][DEBUG][logstash.runner          ] Setting global FieldReference escape style: none
[2024-05-06T05:55:43,362][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2024-05-06T05:55:43,362][DEBUG][logstash.runner          ] allow_superuser: true
[2024-05-06T05:55:43,362][DEBUG][logstash.runner          ] node.name: "ubuntu"
[2024-05-06T05:55:43,362][DEBUG][logstash.runner          ] *path.config: "/home/administrator/elk/logstash-8.7.1/config/test.yml"
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] path.data: "/home/administrator/elk/logstash-8.7.1/data"
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] modules.cli: #<Java::OrgLogstashUtil::ModulesSettingArray: []>
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] modules: []
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] modules_list: []
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] modules_variable_list: []
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] modules_setup: false
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] config.test_and_exit: false
[2024-05-06T05:55:43,363][DEBUG][logstash.runner          ] config.reload.automatic: false
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] config.reload.interval: #<Java::OrgLogstashUtil::TimeValue:0x299d1bd6>
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] config.support_escapes: false
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] config.field_reference.escape_style: "none"
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] event_api.tags.illegal: "rename"
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] metric.collect: true
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] pipeline.id: "main"
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] pipeline.system: false
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] pipeline.workers: 4
[2024-05-06T05:55:43,364][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] pipeline.ordered: "auto"
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] pipeline.ecs_compatibility: "v8"
[2024-05-06T05:55:43,365][DEBUG][logstash.runner          ] path.plugins: []
[2024-05-06T05:55:43,366][DEBUG][logstash.runner          ] *config.debug: true (default: false)
[2024-05-06T05:55:43,366][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2024-05-06T05:55:43,367][DEBUG][logstash.runner          ] version: false
[2024-05-06T05:55:43,367][DEBUG][logstash.runner          ] help: false
[2024-05-06T05:55:43,367][DEBUG][logstash.runner          ] enable-local-plugin-development: false
[2024-05-06T05:55:43,367][DEBUG][logstash.runner          ] log.format: "plain"
[2024-05-06T05:55:43,367][DEBUG][logstash.runner          ] api.enabled: true
[2024-05-06T05:55:43,368][DEBUG][logstash.runner          ] api.http.host: "127.0.0.1"
[2024-05-06T05:55:43,368][DEBUG][logstash.runner          ] api.http.port: 9600..9700
[2024-05-06T05:55:43,368][DEBUG][logstash.runner          ] api.environment: "production"
[2024-05-06T05:55:43,368][DEBUG][logstash.runner          ] api.auth.type: "none"
[2024-05-06T05:55:43,368][DEBUG][logstash.runner          ] api.auth.basic.password_policy.mode: "WARN"
[2024-05-06T05:55:43,368][DEBUG][logstash.runner          ] api.auth.basic.password_policy.length.minimum: 8
[2024-05-06T05:55:43,369][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.upper: "REQUIRED"
[2024-05-06T05:55:43,369][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.lower: "REQUIRED"
[2024-05-06T05:55:43,369][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.digit: "REQUIRED"
[2024-05-06T05:55:43,369][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.symbol: "OPTIONAL"
[2024-05-06T05:55:43,370][DEBUG][logstash.runner          ] api.ssl.enabled: false
[2024-05-06T05:55:43,370][DEBUG][logstash.runner          ] queue.type: "memory"
[2024-05-06T05:55:43,370][DEBUG][logstash.runner          ] queue.drain: false
[2024-05-06T05:55:43,370][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2024-05-06T05:55:43,370][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2024-05-06T05:55:43,380][DEBUG][logstash.runner          ] queue.max_events: 0
[2024-05-06T05:55:43,380][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2024-05-06T05:55:43,380][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2024-05-06T05:55:43,380][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2024-05-06T05:55:43,380][DEBUG][logstash.runner          ] queue.checkpoint.retry: true
[2024-05-06T05:55:43,380][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2024-05-06T05:55:43,380][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] dead_letter_queue.flush_interval: 5000
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] dead_letter_queue.storage_policy: "drop_newer"
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] slowlog.threshold.warn: #<Java::OrgLogstashUtil::TimeValue:0x3727021e>
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] slowlog.threshold.info: #<Java::OrgLogstashUtil::TimeValue:0x568c4909>
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] slowlog.threshold.debug: #<Java::OrgLogstashUtil::TimeValue:0x6ddba5>
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] slowlog.threshold.trace: #<Java::OrgLogstashUtil::TimeValue:0x70366fb5>
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] keystore.file: "/home/administrator/elk/logstash-8.7.1/config/logstash.keystore"
[2024-05-06T05:55:43,381][DEBUG][logstash.runner          ] path.queue: "/home/administrator/elk/logstash-8.7.1/data/queue"
[2024-05-06T05:55:43,382][DEBUG][logstash.runner          ] path.dead_letter_queue: "/home/administrator/elk/logstash-8.7.1/data/dead_letter_queue"
[2024-05-06T05:55:43,382][DEBUG][logstash.runner          ] path.settings: "/home/administrator/elk/logstash-8.7.1/config"
[2024-05-06T05:55:43,382][DEBUG][logstash.runner          ] path.logs: "/home/administrator/elk/logstash-8.7.1/logs"
[2024-05-06T05:55:43,382][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2024-05-06T05:55:43,382][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2024-05-06T05:55:43,382][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: #<Java::OrgLogstashUtil::TimeValue:0x48c15d35>
[2024-05-06T05:55:43,382][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: #<Java::OrgLogstashUtil::TimeValue:0x129df247>
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] monitoring.enabled: false
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] monitoring.collection.interval: #<Java::OrgLogstashUtil::TimeValue:0x1062f767>
[2024-05-06T05:55:43,383][DEBUG][logstash.runner          ] monitoring.collection.timeout_interval: #<Java::OrgLogstashUtil::TimeValue:0x4a3d0660>
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] monitoring.elasticsearch.username: "logstash_system"
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] monitoring.elasticsearch.sniffing: false
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] monitoring.collection.pipeline.details.enabled: true
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] monitoring.collection.config.enabled: true
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] node.uuid: ""
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2024-05-06T05:55:43,384][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: #<Java::OrgLogstashUtil::TimeValue:0x7e4921be>
[2024-05-06T05:55:43,385][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2024-05-06T05:55:43,385][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2024-05-06T05:55:43,385][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2024-05-06T05:55:43,385][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2024-05-06T05:55:43,385][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2024-05-06T05:55:43,385][DEBUG][logstash.runner          ] xpack.geoip.downloader.enabled: true
[2024-05-06T05:55:43,385][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2024-05-06T05:55:43,387][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-05-06T05:55:43,402][DEBUG][logstash.agent           ] Initializing API WebServer {"api.http.host"=>"127.0.0.1", "api.http.port"=>9600..9700, "api.ssl.enabled"=>false, "api.auth.type"=>"none", "api.environment"=>"production"}
[2024-05-06T05:55:43,407][DEBUG][logstash.api.service     ] [api-service] start
[2024-05-06T05:55:43,434][DEBUG][logstash.agent           ] Setting up metric collection
[2024-05-06T05:55:43,438][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2024-05-06T05:55:43,439][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2024-05-06T05:55:43,456][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2024-05-06T05:55:43,523][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2024-05-06T05:55:43,525][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2024-05-06T05:55:43,529][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2024-05-06T05:55:43,531][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2024-05-06T05:55:43,532][DEBUG][logstash.instrument.periodicpoller.flowrate] Starting {:polling_interval=>5, :polling_timeout=>120}
[2024-05-06T05:55:43,857][DEBUG][logstash.agent           ] Starting agent
[2024-05-06T05:55:43,859][DEBUG][logstash.agent           ] Starting API WebServer (puma)
[2024-05-06T05:55:43,862][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/home/administrator/elk/logstash-8.7.1/config/${sys:ls.logs}", "/home/administrator/elk/logstash-8.7.1/config/ca.p12", "/home/administrator/elk/logstash-8.7.1/config/http_poller.yml", "/home/administrator/elk/logstash-8.7.1/config/jvm.options", "/home/administrator/elk/logstash-8.7.1/config/keystore.jks", "/home/administrator/elk/logstash-8.7.1/config/keystore.p12", "/home/administrator/elk/logstash-8.7.1/config/log4j2.properties", "/home/administrator/elk/logstash-8.7.1/config/logstash-sample.conf", "/home/administrator/elk/logstash-8.7.1/config/logstash.keystore", "/home/administrator/elk/logstash-8.7.1/config/logstash.yml", "/home/administrator/elk/logstash-8.7.1/config/pipelines.yml", "/home/administrator/elk/logstash-8.7.1/config/server.p12", "/home/administrator/elk/logstash-8.7.1/config/startup.options", "/home/administrator/elk/logstash-8.7.1/config/truststore.jks"]}
[2024-05-06T05:55:43,863][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/home/administrator/elk/logstash-8.7.1/config/test.yml"}
[2024-05-06T05:55:43,867][DEBUG][logstash.agent           ] Trying to start API WebServer {:port=>9600, :ssl_enabled=>false}
[2024-05-06T05:55:43,868][DEBUG][org.logstash.config.ir.PipelineConfig] -------- Logstash Config ---------
[2024-05-06T05:55:43,869][DEBUG][org.logstash.config.ir.PipelineConfig] Config from source, source: LogStash::Config::Source::Local, pipeline_id:: main
[2024-05-06T05:55:43,869][DEBUG][org.logstash.config.ir.PipelineConfig] Config string, protocol: file, id: /home/administrator/elk/logstash-8.7.1/config/test.yml
[2024-05-06T05:55:43,870][DEBUG][org.logstash.config.ir.PipelineConfig] 

input {
  http_poller {
    urls => {
      my_url => {
        method => get
        url => "https://www.splunkweb.co.uk/random-person"
        headers => {
          "Content-Type" => "application/json"
        }
        ssl_supported_protocols = "TLSv1.3"
        ssl_cipher_suites => "TLS_AES_256_GCM_SHA384"
        ssl_certificate_authorities => "/home/administrator/Documents/ca3/ca.cert.pem"
        ssl_certificate => "/home/administrator/Documents/ca3/server.cert.pem"
        ssl_key => "/home/administrator/Documents/ca3/server.unenc.key.pem"
        #keystore => "/home/administrator/elk/logstash-8.7.1/config/keystore.jks"
        truststore_password => "password"
        keystore_password => "password"
        request_timeout => 60
        schedule => { every => "10s" }
      }
    }
    request_timeout => 60
    # Other input configurations...
  }
}

filter {
  # Add your filters here if needed
}

output {
  stdout {
    codec => rubydebug
  }
}


[2024-05-06T05:55:43,871][DEBUG][org.logstash.config.ir.PipelineConfig] Merged config
[2024-05-06T05:55:43,872][DEBUG][org.logstash.config.ir.PipelineConfig] 

input {
  http_poller {
    urls => {
      my_url => {
        method => get
        url => "https://www.splunkweb.co.uk/random-person"
        headers => {
          "Content-Type" => "application/json"
        }
        ssl_supported_protocols = "TLSv1.3"
        ssl_cipher_suites => "TLS_AES_256_GCM_SHA384"
        ssl_certificate_authorities => "/home/administrator/Documents/ca3/ca.cert.pem"
        ssl_certificate => "/home/administrator/Documents/ca3/server.cert.pem"
        ssl_key => "/home/administrator/Documents/ca3/server.unenc.key.pem"
        #keystore => "/home/administrator/elk/logstash-8.7.1/config/keystore.jks"
        truststore_password => "password"
        keystore_password => "password"
        request_timeout => 60
        schedule => { every => "10s" }
      }
    }
    request_timeout => 60
    # Other input configurations...
  }
}

filter {
  # Add your filters here if needed
}

output {
  stdout {
    codec => rubydebug
  }
}


[2024-05-06T05:55:43,875][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2024-05-06T05:55:43,898][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2024-05-06T05:55:43,905][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2024-05-06T05:55:43,909][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to load or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2024-05-06T05:55:43,933][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashSecretStore::SecretStoreException::LoadException", :message=>"Found a file at /home/administrator/elk/logstash-8.7.1/config/logstash.keystore, but it is not a valid Logstash keystore.", :backtrace=>["org.logstash.secret.store.backend.JavaKeyStore.load(JavaKeyStore.java:294)", "org.logstash.secret.store.backend.JavaKeyStore.load(JavaKeyStore.java:77)", "org.logstash.secret.store.SecretStoreFactory.doIt(SecretStoreFactory.java:129)", "org.logstash.secret.store.SecretStoreFactory.load(SecretStoreFactory.java:115)", "org.logstash.secret.store.SecretStoreExt.getIfExists(SecretStoreExt.java:60)", "org.logstash.execution.AbstractPipelineExt.getSecretStore(AbstractPipelineExt.java:790)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:238)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:173)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:846)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1229)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:131)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.RubyClass.newInstance(RubyClass.java:911)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:549)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:92)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:226)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:393)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:206)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:325)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", 
"org.jruby.RubyProc.call(RubyProc.java:309)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:107)", "java.base/java.lang.Thread.run(Thread.java:833)"]}
[2024-05-06T05:55:43,954][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-05-06T05:55:43,961][DEBUG][logstash.agent           ] Shutting down all pipelines {:pipelines_count=>0}
[2024-05-06T05:55:43,962][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
[2024-05-06T05:55:43,963][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2024-05-06T05:55:43,964][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2024-05-06T05:55:43,964][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2024-05-06T05:55:43,964][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2024-05-06T05:55:43,964][DEBUG][logstash.instrument.periodicpoller.flowrate] Stopping
[2024-05-06T05:55:43,971][DEBUG][logstash.agent           ] API WebServer has stopped running
[2024-05-06T05:55:43,971][INFO ][logstash.runner          ] Logstash shut down.
[2024-05-06T05:55:43,977][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:790) ~[jruby.jar:?]
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:753) ~[jruby.jar:?]
	at home.administrator.elk.logstash_minus_8_dot_7_dot_1.lib.bootstrap.environment.<main>(/home/administrator/elk/logstash-8.7.1/lib/bootstrap/environment.rb:91) ~[?:?]
(base) administrator@ubuntu:~/elk/logstash-8.7.1/config$ 


I saw this. The rubber-ducky effect: this keystore is testing me. Any tips on the keystore? AI and Google aren't helpful.

[2024-05-06T05:55:43,933][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashSecretStore::SecretStoreException::LoadException", :message=>"Found a file at /home/administrator/elk/logstash-8.7.1/config/logstash.keystore, but it is not a valid Logstash keystore.
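
As far as I can tell, the file Logstash expects at config/logstash.keystore is its own secret-store format (created by bin/logstash-keystore), not a Java keystore, so if a .jks or .p12 ended up at that path it would explain this error. Moving the file aside and recreating it should get back to a clean state, roughly:

mv /home/administrator/elk/logstash-8.7.1/config/logstash.keystore /home/administrator/elk/logstash-8.7.1/config/logstash.keystore.bak
../bin/logstash-keystore create
# only needed if you actually want to store secrets, referenced in the pipeline as ${TRUSTSTORE_PW}
../bin/logstash-keystore add TRUSTSTORE_PW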

This didn't work:

openssl pkcs12 -export -out keystore.p12 -inkey /home/administrator/Documents/ca3/server.key.pem -in /home/administrator/Documents/ca3/server.cert.pem
keytool -importkeystore -srckeystore keystore.p12 -srcstoretype PKCS12 -destkeystore keystore.jks -deststoretype JKS
../jdk/bin/keytool -importkeystore -srckeystore keystore.p12 -srcstoretype PKCS12 -destkeystore keystore.jks -deststoretype JKS
../jdk/bin/keytool -importcert -file /home/administrator/Documents/ca3/ca.cert.pem -alias ca_cert -keystore truststore.jks -storepass password
nano test.yml
sudo ../bin/logstash -f /home/administrator/elk/logstash-8.7.1/config/test.yml --log.level debug
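
For anyone retracing this, the contents of the two stores can at least be inspected with keytool (assuming "password" was used as the store password when the .jks was created):

../jdk/bin/keytool -list -v -keystore keystore.jks -storepass password
../jdk/bin/keytool -list -v -keystore truststore.jks -storepass password

The keystore.jks built this way holds the server key pair, while the truststore.jks only holds the CA cert as a trusted entry, which is closer to what the poller ended up needing.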

So, in short, I got it to work. I was using a server and client extension of my own design; in the end I just reused the apiwebserver.cert.pem, importing it into the keystore.

The working http_poller pipeline was this.

input {
  http_poller {
    urls => {
      myurl => "https://www.webserver.co.uk:8000/random_person"
    }
    truststore => "/home/administrator/elk/logstash-8.7.1/config/test.jks"
    truststore_password => "password"
    schedule => { every => "10s"}
  }
}

output {
  stdout {
    codec => rubydebug
  }
}
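
If I'm reading the 8.13 plugin docs right, truststore / truststore_password are the older, deprecated spellings and the same pipeline can be written with the standardized ssl_* options; an equivalent (untested) sketch:

input {
  http_poller {
    urls => {
      myurl => "https://www.webserver.co.uk:8000/random_person"
    }
    ssl_truststore_path => "/home/administrator/elk/logstash-8.7.1/config/test.jks"
    ssl_truststore_password => "password"
    schedule => { every => "10s" }
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

The plugin also accepts ssl_certificate_authorities pointing straight at a PEM CA file, which should avoid the JKS conversion entirely.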

and I imported the server cert with this:

keytool -import -alias test -file server_cert.pem -keystore test.jks -storetype jks
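
keytool -import prompts for a store password (creating test.jks if it doesn't exist yet) and asks you to confirm trusting the certificate; a quick way to double-check what landed in the store before pointing Logstash at it:

keytool -list -v -keystore test.jks -storetype jks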

Stay tuned for a better explanation.

This helped me.