How to get an index in Kibana

Hi, below is my logstash.conf file:

input {
  file {
    add_field => [ "host", "my-dev-host" ]
    path => "D:\JHipster_Demo\logFile.2017-08-04.log"
    codec => "plain"
  }
}

filter {
  grok {
    # named capture: extracts the file name from "path" into a "filename" field
    match => [ "path", "(?<filename>logFile.2017-08-04.log$)" ]
  }
}

output {
  elasticsearch {
    # sprintf reference to the "filename" field extracted by grok above
    index => "%{filename}"
    hosts => [ "localhost:9200" ]
  }

  stdout { codec => rubydebug }
}

I am getting data flowing through, with this output:
"path" => "D:\JHipster_Demo\logFile.2017-08-04.log",
"@timestamp" => 2017-08-04T07:00:22.730Z,
"filename" => "logFile.2017-08-04.log",
"@version" => "1",
"host" => [
[0] "FS-WS195-D45",
[1] "my-dev-host"
],
"message" => " 2017-08-04 12:30:22,153 INFO [metrics-logger-reporter-1-t
hread-1] metrics: type=GAUGE, name=jvm.memory.pools.Tenured-Gen.committed, value
=178978816\r"
}

Now, when I try to create the index pattern in Kibana, I am not able to create it; Kibana shows that the pattern does not match any indices.

I am trying to use the file name as the index name:
"filename" => "logFile.2017-08-04.log"

Can you do a _cat/indices request to your Elasticsearch instance, just so you can see what the indices got named?
After you get those, you can use a wildcard for the index pattern if you want to match all of them regardless of the date.
https://www.elastic.co/guide/en/elasticsearch/reference/current/cat-indices.html
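Assuming Elasticsearch is listening on localhost:9200 as in the config above, the _cat/indices request can be made with curl like this (a sketch; adjust the host and port for your setup):

```shell
# List every index with column headers (?v), including health,
# index name, document count, and store size.
curl -s 'http://localhost:9200/_cat/indices?v'
```

Once you see the actual index names in that listing, you can enter a wildcard such as `logFile*` (matching whatever prefix the names share) as the index pattern in Kibana.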
