Couldn't find any Elasticsearch data in Kibana


(Kotresh Matada) #1

Hi,
I am trying to implement Elasticsearch for my Spring Boot application.

But somehow in Kibana, when I try to configure an index pattern, I am not able to find any Elasticsearch data.

I made sure that my Elasticsearch, Kibana and Logstash are up and running.

When I tried to list the indices with
http://localhost:9200/_cat/indices?v

below is the result. Somehow the Logstash index is missing.

Logs in Logstash are as shown in the screenshot below.

My logstash.conf file contents are as follows:

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  file {
    type => "java"
    path => "C:\Kotresh\work\Q-MATE\elastic-demo-logs\qmate-elastic-tool.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  # If log line contains tab character followed by 'at' then we will tag that entry as stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }

  # Grokking Spring Boot's default log format
  grok {
    match => [ "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
             ]
  }

  # Parsing out timestamps which are in timestamp field thanks to previous grok section
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

output {

  stdout {
    codec => rubydebug
  }

  # Sending properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }
}

I have placed the above logstash.conf file under the Logstash installation bin directory (logstash-6.4.1\bin).

Please suggest where I am making a mistake.

If I push test data using the Dev Tools option, then I am able to find that particular index.


(Jake Landis) #2

Your description states Beats -> Logstash -> Elasticsearch ... however, your configuration is only Logstash -> Elasticsearch.

Either way will work, but Beats -> Logstash -> Elasticsearch is likely preferable so you don't need to install Logstash on every application node (that's what Beats is designed for).

Beats -> Logstash -> Elasticsearch (preferred)

To get started : https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-getting-started.html and you will want to point Filebeat to Logstash: https://www.elastic.co/guide/en/beats/filebeat/current/config-filebeat-logstash.html
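A minimal filebeat.yml for that setup might look like the sketch below; the log path and the Logstash host/port are assumptions for your environment, and the multiline settings mirror the timestamp-based pattern from your Logstash config:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - C:\Kotresh\work\Q-MATE\elastic-demo-logs\*.log
    # Join lines that do not start with a timestamp onto the previous event
    multiline.pattern: '^\d{4}-\d{2}-\d{2} '
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["localhost:5044"]
```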

Then from Logstash, you will want to use the Beats input: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-beats.html (not the file input).
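With Filebeat shipping the files, the input section of your Logstash config would shrink to something like this (5044 is just the conventional Beats port, not mandated):

```
input {
  beats {
    port => 5044
  }
}
```

The multiline handling would then move into Filebeat, so the multiline codec can be dropped from Logstash.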

Logstash -> Elasticsearch (as you have it)

I would guess that you are not seeing the data in Elasticsearch because Logstash's file input is stateful and will not re-read the same file by default. I am guessing that during some testing the file was already read, so when you try to send to Elasticsearch the file input won't re-read it. On *nix/Mac you can set sincedb_path => "/dev/null" while testing to always start fresh. Not sure of the equivalent on Windows. You may need to find the .sincedb file (as seen in the Logstash startup) and manually delete it so that the file input treats the file as new.
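For what it's worth, the Windows counterpart of /dev/null is the NUL device, so the following file input sketch may achieve the same "always start fresh" behaviour (untested; the start_position setting is an extra assumption that you want the file read from the top rather than tailed):

```
input {
  file {
    type => "java"
    path => "C:/Kotresh/work/Q-MATE/elastic-demo-logs/qmate-elastic-tool.log"
    # Read from the top of the file instead of tailing (the default is "end")
    start_position => "beginning"
    # "NUL" is the Windows equivalent of /dev/null, so no sincedb state is kept
    sincedb_path => "NUL"
  }
}
```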


(Kotresh Matada) #3

Hi Jakelandis,

Thank you for the reply.

I tried setting the sincedb path as follows on my Windows machine:

sincedb_path => "C:\logstash-6.4.1\data\plugins\inputs\file\null"

Even this has not helped me. It created a new null file when I start Logstash using logstash -f logstash.conf.

Is there anything else I am missing here?


(Jake Landis) #4

I am not sure how to emulate /dev/null on Windows (however, deleting the .sincedb file(s) before each Logstash start should have the same effect). Please note /dev/null is just a hacky trick to make the file input always replay the file when Logstash is restarted.

Assuming you delete the .sincedb file(s) before starting Logstash:

When you use the stdout output, do you see any output?

When you use the elasticsearch output, do you see errors?
Is there now an index created?

If you continue with Logstash -> Elasticsearch (no Beats), and deleting the .sincedb file doesn't help with your testing, it may be better to ask this question in https://discuss.elastic.co/c/logstash.


(Kotresh Matada) #5

Hi Jakelandis,

Thank you for the reply again.

There was an issue in my logstash.conf file; hence I was not able to find any Elasticsearch data in Kibana. Now it is working. My only concern is:

In the input section of the logstash.conf file

input {
  file {
    path => "Monitoringlogs/*.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
    type => "java"
  }
}

are pattern and type mandatory attributes? Should I use any standard pattern for the Logback configuration?

Also, now I am able to view the log information generated by the application from the file. But is there any way we can view information about deployment failures using Kibana?
Can I see the deployment status of applications in production environments?

We do not have access to the production environment; I wanted to understand all the possibilities I would get using Elasticsearch.


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.