Hello All,
I am trying to deploy the ELK stack in my lab. I was able to get Elasticsearch and Kibana up and running with X-Pack security enabled, but I am now struggling to get logs into Elasticsearch from Logstash. My configs are below.
elasticsearch.yml
# By default Elasticsearch is only accessible on localhost. Set a different
# address here to expose this node on the network:
#
#network.host: 192.168.0.1
#
# By default Elasticsearch listens for HTTP traffic on the first free port it
# finds starting at 9200. Set a specific HTTP port here:
#
#http.port: 9200
#
# For more information, consult the network module documentation.
#
# --------------------------------- Discovery ----------------------------------
#
# Pass an initial list of hosts to perform discovery when this node is started:
# The default list of hosts is ["127.0.0.1", "[::1]"]
#
#discovery.seed_hosts: ["host1", "host2"]
#
# Bootstrap the cluster using an initial set of master-eligible nodes:
#
#cluster.initial_master_nodes: ["node-1", "node-2"]
#
# For more information, consult the discovery and cluster formation module documentation.
#
# ---------------------------------- Various -----------------------------------
#
# Allow wildcard deletion of indices:
#
#action.destructive_requires_name: false
#----------------------- BEGIN SECURITY AUTO CONFIGURATION -----------------------
#
# The following settings, TLS certificates, and keys have been automatically
# generated to configure Elasticsearch security features on 02-10-2023 19:13:18
#
# --------------------------------------------------------------------------------
# Enable security features
xpack.security.enabled: true
xpack.security.enrollment.enabled: true
# Enable encryption for HTTP API client connections, such as Kibana, Logstash, and Agents
xpack.security.http.ssl:
  enabled: true
  keystore.path: certs/http.p12
# Enable encryption and mutual authentication between cluster nodes
xpack.security.transport.ssl:
  enabled: true
  verification_mode: certificate
  keystore.path: certs/transport.p12
  truststore.path: certs/transport.p12
# Create a new cluster with the current node only
# Additional nodes can still join the cluster later
cluster.initial_master_nodes: ["sandbox"]
# Allow HTTP API connections from anywhere
# Connections are encrypted and require user authentication
http.host: 0.0.0.0
# Allow other nodes to join the cluster from anywhere
# Connections are encrypted and mutually authenticated
#transport.host: 0.0.0.0
#----------------------- END SECURITY AUTO CONFIGURATION -------------------------
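Since the HTTP layer uses the auto-generated certificate, an alternative to disabling certificate verification in Logstash would be to let it trust Elasticsearch's own CA. A sketch, assuming the default package-install paths (adjust if your install differs):

```shell
# Assumed default location of the auto-generated HTTP CA on deb/rpm installs.
sudo cp /etc/elasticsearch/certs/http_ca.crt /etc/logstash/
sudo chown logstash:logstash /etc/logstash/http_ca.crt
```

The copied CA could then be referenced from the elasticsearch output's `cacert` option instead of turning verification off.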
sudo tail -f /var/log/logstash/logstash-plain.log
[2023-10-03T12:31:37,808][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@localhost:9200/]}}
[2023-10-03T12:31:37,923][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@localhost:9200/"}
[2023-10-03T12:31:37,935][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.10.2) {:es_version=>8}
[2023-10-03T12:31:37,936][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2023-10-03T12:31:37,960][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `true`
[2023-10-03T12:31:37,976][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/squid.conf"], :thread=>"#<Thread:0x36eca527 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-10-03T12:31:38,011][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.03}
[2023-10-03T12:31:38,034][INFO ][filewatch.observingtail ][main][035bd55e10fb8050b9e1352969dd97a58cd6fa3cef5c7dec3b3bc29c8e0f94a3] START, creating Discoverer, Watch with file and sincedb collections
[2023-10-03T12:31:38,039][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-10-03T12:31:38,054][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
/etc/logstash/conf.d/squid.conf
input {
  file {
    path => "/home/ameer/Desktop/access.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    # index => "malware"
    # data_stream => true
    ssl_certificate_verification => false
    user => "elastic"
    password => "<password>"
  }
}
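To check whether events are actually being read from the file and emitted by the pipeline, a temporary stdout output could be added alongside the elasticsearch output — a minimal sketch:

```
output {
  # Temporary debug sink: prints each event in full to the Logstash
  # console/log, so you can see if events are flowing at all.
  stdout { codec => rubydebug }
}
```

If events show up on stdout but not in Elasticsearch, the problem is on the output/index side rather than the file input.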
Everything is installed on a single host.
Any help is appreciated.