Unable to see logs in Elasticsearch from logstash

Hello All,

I am trying to deploy the ELK stack in my lab. I was able to get Elasticsearch and Kibana up and running with X-Pack security enabled, but I am now struggling to get logs into Elasticsearch from Logstash. Please refer to my config below.

elasticsearch.yml

# By default Elasticsearch is only accessible on localhost. Set a different
# address here to expose this node on the network:
#
#network.host: 192.168.0.1
#
# By default Elasticsearch listens for HTTP traffic on the first free port it
# finds starting at 9200. Set a specific HTTP port here:
#
#http.port: 9200
#
# For more information, consult the network module documentation.
#
# --------------------------------- Discovery ----------------------------------
#
# Pass an initial list of hosts to perform discovery when this node is started:
# The default list of hosts is ["127.0.0.1", "[::1]"]
#
#discovery.seed_hosts: ["host1", "host2"]
#
# Bootstrap the cluster using an initial set of master-eligible nodes:
#
#cluster.initial_master_nodes: ["node-1", "node-2"]
#
# For more information, consult the discovery and cluster formation module documentation.
#
# ---------------------------------- Various -----------------------------------
#
# Allow wildcard deletion of indices:
#
#action.destructive_requires_name: false

#----------------------- BEGIN SECURITY AUTO CONFIGURATION -----------------------
#
# The following settings, TLS certificates, and keys have been automatically      
# generated to configure Elasticsearch security features on 02-10-2023 19:13:18
#
# --------------------------------------------------------------------------------
# Enable security features
xpack.security.enabled: true

xpack.security.enrollment.enabled: true

# Enable encryption for HTTP API client connections, such as Kibana, Logstash, and Agents
xpack.security.http.ssl:
  enabled: true
  keystore.path: certs/http.p12

# Enable encryption and mutual authentication between cluster nodes
xpack.security.transport.ssl:
  enabled: true
  verification_mode: certificate
  keystore.path: certs/transport.p12
  truststore.path: certs/transport.p12
# Create a new cluster with the current node only
# Additional nodes can still join the cluster later
cluster.initial_master_nodes: ["sandbox"]

# Allow HTTP API connections from anywhere
# Connections are encrypted and require user authentication
http.host: 0.0.0.0

# Allow other nodes to join the cluster from anywhere
# Connections are encrypted and mutually authenticated
#transport.host: 0.0.0.0

#----------------------- END SECURITY AUTO CONFIGURATION -------------------------
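
As a quick check that the secured cluster is reachable over HTTPS, curl can be pointed at the auto-generated CA; a sketch assuming the default package-install certificate path (/etc/elasticsearch/certs/http_ca.crt):

# Verify the HTTPS endpoint with the auto-configured CA; prompts for the elastic password
curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic https://localhost:9200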

sudo tail -f /var/log/logstash/logstash-plain.log

[2023-10-03T12:31:37,808][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@localhost:9200/]}}
[2023-10-03T12:31:37,923][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@localhost:9200/"}
[2023-10-03T12:31:37,935][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.10.2) {:es_version=>8}
[2023-10-03T12:31:37,936][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2023-10-03T12:31:37,960][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `true`
[2023-10-03T12:31:37,976][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/squid.conf"], :thread=>"#<Thread:0x36eca527 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-10-03T12:31:38,011][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.03}
[2023-10-03T12:31:38,034][INFO ][filewatch.observingtail  ][main][035bd55e10fb8050b9e1352969dd97a58cd6fa3cef5c7dec3b3bc29c8e0f94a3] START, creating Discoverer, Watch with file and sincedb collections
[2023-10-03T12:31:38,039][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-10-03T12:31:38,054][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

/etc/logstash/conf.d/squid.conf

input{
        file{
                path => "/home/ameer/Desktop/access.log"
                start_position => "beginning"
                sincedb_path => "/dev/null"
        }
}

filter{


}

output{
        elasticsearch{
                        hosts => ["https://localhost:9200"]
#                       index => "malware"
#                       data_stream => true
                        ssl_certificate_verification => false
                        user => "elastic"
                        password => "<password>"
        }
}
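
As a sanity check, the pipeline definition can be validated and then run as the service user instead of root, which surfaces permission problems early; a sketch using the paths from above:

# Validate the pipeline config without starting it
sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/squid.conf

# Run the same pipeline as the logstash service user to reproduce any permission issues
sudo -u logstash /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/squid.conf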

Everything is installed on a single host.

Help is appreciated.

I can see the log events in Logstash when using the stdout{} output, but I cannot understand what is preventing the index from being created and the logs from being populated in Elasticsearch (one way to check directly is shown after the event dump below):

{
         "event" => {
        "original" => "192.168.100.191 - - [02/Oct/2023:18:33:57 +0530] CONNECT \"www.facebook.com:443\" \"HTTP/1.1\" 200 0 \"-\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\" NONE_NONE:HIER_NONE"
    },
       "message" => "192.168.100.191 - - [02/Oct/2023:18:33:57 +0530] CONNECT \"www.facebook.com:443\" \"HTTP/1.1\" 200 0 \"-\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.0.0 Safari/537.36\" NONE_NONE:HIER_NONE",
          "host" => {
        "name" => "sandbox"
    },
      "@version" => "1",
    "@timestamp" => 2023-10-03T09:25:23.187138827Z,
           "log" => {
        "file" => {
            "path" => "/home/ameer/Desktop/access.log"
        }
    }
}
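
One way to check directly whether anything reached Elasticsearch is to list indices and data streams; a sketch, assuming the auto-configured CA path from above (with data_stream => auto resolving to true, events go to the logs-generic-default data stream unless configured otherwise):

# List all indices (the ?v flag adds column headers)
curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic "https://localhost:9200/_cat/indices?v"

# List any logs-* data streams created by Logstash
curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic "https://localhost:9200/_data_stream/logs-*"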

I can see the logs getting pushed when I run "sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/squid.conf" manually,

but not when Logstash runs as a service.

Hello @AnyThink_A,

Welcome to the community!

Do you mean that the logs are not being ingested into Elasticsearch, or that you cannot visualize them in Kibana?

@Priscilla_Parodi It seems the logs are not being sent to Elasticsearch: when using the sudo command I can see data in the console, but not when running as the "logstash" user. The directory /usr/share/logstash/ is owned by "logstash".

"access.log" is also owned by "logstash"

When running the service as root it works as expected, so it is definitely a permission issue for the "logstash" user, but I can't figure out at which location.

All of the below paths are owned by logstash:

/var/lib/logstash
/usr/share/logstash/
/usr/Default/logstash/

Any leads will be appreciated.

The user that Logstash runs as should have read/write access to /var/lib/logstash and /var/log/logstash.

But if you've configured it to read additional log files, the same user needs read access to those too.

Note: This is the default directory structure that is created when you unpack the Logstash installation packages.
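
If permissions are suspected, one way to narrow it down is to test access as the service user directly; a sketch using the OP's log path (namei is part of util-linux), keeping in mind that the logstash user also needs execute (traverse) permission on every parent directory, which home directories such as /home/ameer often deny:

# Try to read the file as the logstash user (no login shell required)
sudo -u logstash head -n 1 /home/ameer/Desktop/access.log

# List the permissions of every path component; each directory needs
# execute (x) permission for the logstash user to be traversable
namei -l /home/ameer/Desktop/access.log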

I re-installed everything and gave permissions only on /usr/share/logstash/data.

It turned out that Logstash was not picking up the file because it was not being updated in real time (without start_position => "beginning", the file input only tails new lines, so a file that never changes produces no events). I installed Squid, which keeps appending to its access log, and the logs started to show up in Kibana.

input{
        file{
                path => "/var/log/squid/access.log"
        }
}

filter {
        grok {
               match => { "message" => "%{IP:client_ip} %{DATA:user_info} \[%{HTTPDATE:timestamp}\] %{WORD:http_method} \"%{DATA:url}\" \"%{DATA:http_version}\" %{NUMBER:http_status:int} %{NUMBER:response_size:int} \"%{DATA:referrer}\" \"%{DATA:user_agent}\" %{GREEDYDATA:additional_info}"}
        }
}

output{
        elasticsearch{
                hosts => ["https://localhost:9200"]
                # index => "malware"
                user => "elastic"
                password => "<Password>"
                ssl_enabled => true
                # copied http_ca.crt from the Elasticsearch cert folder
                ssl_certificate_authorities => ["/etc/logstash/certs/http_ca.crt"]
        }
        stdout { codec => rubydebug } # remove this in the actual config
}
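
To confirm end-to-end ingestion, the document count can be polled while Squid is generating traffic; a sketch assuming the default logs-generic-default data stream (the name differs if the index or data_stream options are set) and the CA path used in the output above:

# The count should grow as new Squid access log lines are ingested
curl --cacert /etc/logstash/certs/http_ca.crt -u elastic "https://localhost:9200/logs-generic-default/_count"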
