I think I am still getting a communication error:
Sending Logstash logs to /home/trex/Downloads/logstash-8.17.1/logs which is now configured via log4j2.properties
[2025-02-10T22:36:26,081][INFO ][logstash.runner ] Log4j configuration path used is: /home/trex/Downloads/logstash-8.17.1/config/log4j2.properties
[2025-02-10T22:36:26,084][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.17.1", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.5+11-LTS on 21.0.5+11-LTS +indy +jit [x86_64-linux]"}
[2025-02-10T22:36:26,085][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2025-02-10T22:36:26,101][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2025-02-10T22:36:26,101][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2025-02-10T22:36:26,171][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2025-02-10T22:36:26,396][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-02-10T22:36:26,642][INFO ][org.reflections.Reflections] Reflections took 44 ms to scan 1 urls, producing 151 keys and 528 values
[2025-02-10T22:36:26,890][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-02-10T22:36:26,897][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://localhost:9200"]}
[2025-02-10T22:36:26,967][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://localhost:9200/]}}
[2025-02-10T22:36:27,070][WARN ][logstash.outputs.elasticsearch][main] Health check failed {:code=>401, :url=>https://localhost:9200/, :message=>"Got response code '401' contacting Elasticsearch at URL 'https://localhost:9200/'"}
[2025-02-10T22:36:27,074][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch main endpoint returns 401 {:message=>"Got response code '401' contacting Elasticsearch at URL 'https://localhost:9200/'", :body=>"{\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"Bearer realm=\\\"security\\\"\",\"ApiKey\"]}}],\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"Bearer realm=\\\"security\\\"\",\"ApiKey\"]}},\"status\":401}"}
[2025-02-10T22:36:27,075][ERROR][logstash.javapipeline ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Could not read Elasticsearch. Please check the credentials>, :backtrace=>["/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:275:in `block in healthcheck!'", "org/jruby/RubyHash.java:1615:in `each'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:267:in `healthcheck!'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:401:in `update_urls'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:109:in `update_initial_urls'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:103:in `start'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client.rb:373:in `build_pool'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client.rb:64:in `initialize'", "org/jruby/RubyClass.java:922:in `new'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:106:in `create_http_client'", 
"/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:102:in `build'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:42:in `build_client'", "/home/trex/Downloads/logstash-8.17.1/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.10-java/lib/logstash/outputs/elasticsearch.rb:301:in `register'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:69:in `register'", "/home/trex/Downloads/logstash-8.17.1/logstash-core/lib/logstash/java_pipeline.rb:245:in `block in register_plugins'", "org/jruby/RubyArray.java:1981:in `each'", "/home/trex/Downloads/logstash-8.17.1/logstash-core/lib/logstash/java_pipeline.rb:244:in `register_plugins'", "/home/trex/Downloads/logstash-8.17.1/logstash-core/lib/logstash/java_pipeline.rb:622:in `maybe_setup_out_plugins'", "/home/trex/Downloads/logstash-8.17.1/logstash-core/lib/logstash/java_pipeline.rb:257:in `start_workers'", "/home/trex/Downloads/logstash-8.17.1/logstash-core/lib/logstash/java_pipeline.rb:198:in `run'", "/home/trex/Downloads/logstash-8.17.1/logstash-core/lib/logstash/java_pipeline.rb:150:in `block in start'"], "pipeline.sources"=>["/home/trex/Downloads/logstash-8.17.1/config/testPipe.conf"], :thread=>"#<Thread:0x782acb2c /home/trex/Downloads/logstash-8.17.1/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-02-10T22:36:27,076][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2025-02-10T22:36:27,080][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
I copied the Elasticsearch certificate to a location accessible to Logstash and referenced that location in the config:
input {
  file {
    path => "/home/trex/Downloads/Power-12.2.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Day of Date Eastern", "Device Id", "Location", "Time Eastern", "Date Eastern", "Hour Eastern", "count", "Address", "Geohash", "Ip Address", "Timestamp UTC", "count_locations", "count_signals", "Horizontal Accuracy", "Latitude", "Longitude", "Timestamp"]
  }

  date {
    match => ["Timestamp UTC", "MM/dd/yyyy hh:mm:ss a"]
    target => "@timestamp"
    timezone => "UTC"
  }

  mutate {
    convert => {
      "Hour Eastern" => "integer"
      "count_locations" => "integer"
      "count_signals" => "integer"
      "Horizontal Accuracy" => "float"
      "Latitude" => "float"
      "Longitude" => "float"
      "Timestamp" => "float"
    }
  }

  geoip {
    source => "Ip Address"
    target => "geoip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "substation_sabotage_%{+YYYY.MM.dd}"
    ssl_certificate_authorities => ["/home/trex/Downloads/logstash-8.17.1/config/certs/http_ca.crt"]
  }
  stdout { codec => rubydebug }
}
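For what it's worth, the 401 in the log ("missing authentication credentials for REST request") suggests the TLS trust is working and the request is reaching Elasticsearch, but the elasticsearch output is sending no credentials. A minimal sketch of the output block with basic auth added, using placeholder values for `user` and `password` (substitute a real user, e.g. the `elastic` superuser or a dedicated writer role):

```
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "substation_sabotage_%{+YYYY.MM.dd}"
    ssl_certificate_authorities => ["/home/trex/Downloads/logstash-8.17.1/config/certs/http_ca.crt"]
    # Placeholder credentials -- replace with a real user/password,
    # or use the api_key option instead of user/password.
    user => "elastic"
    password => "changeme"
  }
}
```

The plugin also accepts `api_key => "id:api_key"` as an alternative to basic auth if you prefer not to put a password in the config file.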