No existing kibana index found

I am getting "No existing kibana index found" when I run Kibana.

Previously I was able to get the index and visualize the data.

Here is my Logstash conf file:

input {
  file {
    path => "D:/LOGS/log1.txt"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns   => ["TYPE","TIMESTAMP","APPNAME"]
    separator => ","
  }
}

output {
  elasticsearch {
    action  => "index"
    host    => "localhost"
    index   => "logstash-%{+YYYY.MM.dd}"
    workers => 1
    cluster => "myelastic"
  }
  stdout {
    codec => rubydebug
  }
}
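As a sanity check on the csv filter above: splitting a sample line on the separator should yield the named columns. A minimal shell sketch of the same split (the sample line is invented for illustration, not taken from the actual log file):

```shell
# Hypothetical log line matching columns => ["TYPE","TIMESTAMP","APPNAME"]
line='ERROR,2015-07-23 10:15:00,MyApp'

# Split on the "," separator into the named columns, as the csv filter would
IFS=',' read -r TYPE TIMESTAMP APPNAME <<EOF
$line
EOF
echo "TYPE=$TYPE TIMESTAMP=$TIMESTAMP APPNAME=$APPNAME"
```

If a real log line produces empty or shifted fields here, the csv filter will mis-parse it the same way.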


Kibana should create this index automatically if it doesn't exist.

Does curl HOST:9200/_cat/indices show a .kibana index?

Here is my result for http://localhost:9200/_cat/indices?v
health status index               pri rep docs.count docs.deleted store.size pri.store.size
yellow open   custom                3   2          0            0       432b           432b
yellow open   logstash-2013.11.06   1   1          2            0      3.1kb          3.1kb
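Note that no .kibana row appears in that listing, which matches the "No existing kibana index found" message. As an illustration (not something the thread itself ran), the check can be scripted against the _cat/indices text, where the index name is the third column:

```shell
# _cat/indices output captured above (index name is column 3)
indices='yellow open custom 3 2 0 0 432b 432b
yellow open logstash-2013.11.06 1 1 2 0 3.1kb 3.1kb'

# Report whether a .kibana index is present in that listing
kibana_state=$(printf '%s\n' "$indices" \
  | awk '$3 == ".kibana" {found=1} END {print (found ? "present" : "missing")}')
echo ".kibana index is $kibana_state"
```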

I am not able to index the JSON data into Kibana.
In the output above, custom holds the JSON file, but it is showing docs.count as 0.

What modifications are needed in the elasticsearch.yml and kibana.yml files to index the data?

Check your kibana.yml file and make sure the correct configuration is present:

#The Elasticsearch instance to use for all your queries.
elasticsearch_url: "http://localhost:9200"

Here is my kibana.yml file

# Kibana is served by a back end server. This controls which port to use.
port: 5600

# The host to bind the server to.
host: "0.0.0.0"

# The Elasticsearch instance to use for all your queries.
elasticsearch_url: "http://localhost:9200"

# preserve_elasticsearch_host true will send the hostname specified in elasticsearch.
# If you set it to false, then the host you use to connect to this Kibana instance
# will be sent.
elasticsearch_preserve_host: true

# Kibana uses an index in Elasticsearch to store saved searches, visualizations
# and dashboards. It will create a new index if it doesn't already exist.
kibana_index: ".kibana"
#kibana_index: "logstash-2015.07.23"

# If your Elasticsearch is protected with basic auth, these are the user credentials
# used by the Kibana server to perform maintenance on the kibana_index at startup.
# Your Kibana users will still need to authenticate with Elasticsearch (which is
# proxied through the Kibana server)
kibana_elasticsearch_username: user
kibana_elasticsearch_password: pass

#index.mapper.dynamic: true

# If your Elasticsearch requires client certificate and key
kibana_elasticsearch_client_crt: /path/to/your/client.crt
kibana_elasticsearch_client_key: /path/to/your/client.key

# If you need to provide a CA certificate for your Elasticsearch instance, put
# the path of the pem file here.
ca: /path/to/your/CA.pem

# The default application to load.
default_app_id: "discover"

# Time in milliseconds to wait for elasticsearch to respond to pings, defaults to
# request_timeout setting
ping_timeout: 1500

# Time in milliseconds to wait for responses from the back end or elasticsearch.
# This must be > 0
request_timeout: 300000

# Time in milliseconds for Elasticsearch to wait for responses from shards.
# Set to 0 to disable.
shard_timeout: 0

# Time in milliseconds to wait for Elasticsearch at Kibana startup before retrying
startup_timeout: 5000

# Set to false to have a complete disregard for the validity of the SSL
# certificate.
verify_ssl: true

# SSL for outgoing requests from the Kibana Server (PEM formatted)
ssl_key_file: /path/to/your/server.key
ssl_cert_file: /path/to/your/server.crt

# Set the path to where you would like the process id file to be created.
pid_file: /var/run/kibana.pid

# If you would like to send the log output to a file you can set the path below.
# This will also turn off the STDOUT log output.
log_file: ./kibana.log

# Plugins that are included in the build, and no longer found in the plugins/ folder
bundled_plugin_ids:
 - plugins/dashboard/index
 - plugins/discover/index
 - plugins/doc/index
 - plugins/kibana/index
 - plugins/markdown_vis/index
 - plugins/metric_vis/index
 - plugins/settings/index
 - plugins/table_vis/index
 - plugins/vis_types/index
 - plugins/visualize/index

elasticsearch_url: "http://localhost:9200" is present in my kibana.yml.

Change the Kibana port number from 5600 to 5601.

Yes, I am actually using port 5609, and elasticsearch_url: "http://localhost:9202".

Then use the respective port number, i.e. 5609.

I already have it like that.

OK. If you open the Kibana UI in the browser like this, what happens? Does the Kibana UI load?

localhost:5609

It's loading, but it is not able to find the index.

Are you using Elasticsearch with the mobz/elasticsearch-head plugin? If you are, you can see the Kibana index on the Elasticsearch-head UI.

The yellow-marked index (.kibana) is the Kibana index.

I am not using any plugin. To import data I am using Logstash.
Previously I was able to visualize the data, but after some days it stopped working, even though I didn't modify anything.

You can debug the logstash.conf file using a command like:

logstash --debug logstash.conf

If you hit any problems, first you need to check whether the logs are loaded into the Elasticsearch server or not (i.e. the output of the Logstash rubydebug console).

Getting the following error:
Clamp::UsageError: Unrecognised option '-d'
signal_usage_error at D:/ELK/logstash-1.5.2/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:103
find_option at D:/ELK/logstash-1.5.2/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/option/parsing.rb:62
parse_options at D:/ELK/logstash-1.5.2/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/option/parsing.rb:18
parse at D:/ELK/logstash-1.5.2/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:52
run at D:/ELK/logstash-1.5.2/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.2.2-java/lib/logstash/runner.rb:80
call at org/jruby/RubyProc.java:271
run at D:/ELK/logstash-1.5.2/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.2.2-java/lib/logstash/runner.rb:96
call at org/jruby/RubyProc.java:271
initialize at D:/ELK/logstash-1.5.2/vendor/bundle/jruby/1.9/gems/stud-0.0.20/lib/stud/task.rb:12

Use a double dash: --debug, not -debug (Logstash parses the single-dash form as the unrecognised option '-d').

Now I am getting "No configuration file was specified. Perhaps you forgot to provide the '-f yourlogstash.conf' flag?"
But it runs correctly when I run logstash -f logstash.conf.
Here is my logstash.conf file:
input {
  file {
    path => "D:/Log/log1.txt"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns   => ["TYPE","TIMESTAMP","MESSAGE","APPNAME"]
    separator => ","
  }
}

output {
  elasticsearch {
    action  => "index"
    host    => "localhost"
    index   => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
  stdout {
    codec => rubydebug
  }
}
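One thing worth checking with this config: the index => "logstash-%{+YYYY.MM.dd}" pattern resolves from each event's @timestamp, which is why the listing earlier shows a date-stamped index like logstash-2013.11.06. A quick sketch of what the index name looks like for an event timestamped now (assuming UTC, Logstash's default timestamp zone):

```shell
# logstash-%{+YYYY.MM.dd} resolves to the event date; for an event
# timestamped right now (UTC) the target index would be:
idx="logstash-$(date -u +%Y.%m.%d)"
echo "$idx"
```

So if Kibana's index pattern or time window doesn't cover the date the events carry, the documents exist but won't show up.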

Please follow the correct installation steps in this link:

ELK Stack Installation