New Kibana Installation Lacking 'default index pattern'

We have just finished installing Kibana using the instructions provided by Wazuh HIDS. Kibana is showing an error that "No default index pattern" exists. I have searched the documentation and found numerous scripts, but no instructions on where to apply them or the location of a config file for configuring an index pattern. Any advice would be greatly appreciated.


Marc - this was helpful for me:



Thank you. The problem is that on my dashboard there is not an "Add New" option or drop down menu:

"The Logstash data set does contain time-series data, so after clicking Add New to define the index for this data set, make sure the Index contains time-based events box is checked and select the @timestamp field from the Time-field name drop-down."

The instructions indicate that the indices chosen must match those in Elasticsearch. The installation instructions did not include configuring indices, so am I correct to assume these were created automatically?

Hmm, I'm not sure if your setup is different from a standard new setup, but if Kibana is installed locally and you go to http://localhost:5601/ and then to Management -> Index Patterns, you should see an 'Add New' button that allows you to add an index pattern. This screen also displays any pre-existing index patterns.


FWIW, I'm having the same problem. After following the "getting started" video, I installed Elasticsearch and Kibana (version 5.2.2), but I don't see an "Add/Create" button, or anything similar, under "Management -> Index Patterns". I saw one post where someone said the button appeared after they typed in their index name and then clicked somewhere else on the page, but no joy. I installed again on Windows 10 with the same result (the original install was on Ubuntu). I don't see any errors in the Elasticsearch or Kibana (node) logs, but if I turn on Firebug, I see a failed (404) GET to Elasticsearch and a message in the console that says "index pattern set to null". It's pretty hard to debug with minified JavaScript, though. Interestingly, the "Dev Tools" page seems to work, and I'm able to PUT documents into my index and receive "created/updated" responses. Any ideas?


I'm not familiar with the Wazuh HIDS documentation, but no index will be created in ES until you load data from a source (like Logstash or Beats) or create one yourself via the API.
You can check which indices exist in your ES by running "GET _cat/indices" against localhost:9200 (or your ES host and port).
If you've run Logstash and it has created its index, you should see an index starting with "logstash-" in the result above.
If it is there, you can create your index pattern in Kibana by going to "Management" -> "Index patterns"; the Index pattern field will have the default value "logstash-*", which should work if the logstash index exists in ES.

If you still have problems with this, could you provide the result of the "GET _cat/indices" on ES and a screenshot of the Index patterns page in Kibana?
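For reference, the checks above can be run from a shell; this is a minimal sketch that assumes a default local install (ES on localhost:9200):

```shell
# List all indices in Elasticsearch; ?v adds a header row.
# (Assumes ES is reachable on localhost:9200 -- adjust host/port to your setup.)
curl -XGET 'localhost:9200/_cat/indices?v'

# If Logstash has shipped data, the output should include a line whose
# index name starts with "logstash-", which the default "logstash-*"
# index pattern in Kibana will then match.
```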


Ah, thanks, the GET to "http://localhost:9200/_cat/indices" helped me figure out my issue. I was doing PUTs in Console to "/index/myindex/1" and trying to set "myindex" as the default index pattern, while the path structure is actually "/indexname/type/id", so I should have been PUTting to "/myindex/mytype/1". Once I sent a PUT with the correct URL using Console, and typed "myindex" into the "Create an Index Pattern" form (under Management), the Create button appeared. It was greyed out until I unchecked "Index contains time-based events", and then it was clickable. Hopefully Marc's issue is something similar.
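To illustrate the path mix-up above with curl (the index, type, and document names here are hypothetical placeholders):

```shell
# Wrong: the first path segment IS the index name, so this creates an
# index literally called "index", not "myindex":
#   curl -XPUT 'localhost:9200/index/myindex/1' -d '{"message": "hello"}'

# Correct: the path is /indexname/type/id, so to index a document
# into "myindex" use:
curl -XPUT 'localhost:9200/myindex/mytype/1' \
  -H 'Content-Type: application/json' \
  -d '{"message": "hello"}'

# Verify that "myindex" now appears in the index list:
curl -XGET 'localhost:9200/_cat/indices'
```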


I have finally gotten the indices added but am not seeing data in Kibana. Our goal is to have the system reading OSSEC alerts in the alerts.json file. We have been working with Wazuh HIDS, who say that this is happening because we have not configured Docker properly, but it seems there should be a configuration in Elasticsearch or Logstash to read the alerts. Any advice would be greatly appreciated.

I'm not familiar with Wazuh HIDS and have only just looked through their documentation right now, but other than pointing Logstash at the file to be ingested (in your case the alerts.json file), there shouldn't be any specific configuration needed in Logstash or Elasticsearch.
You can check if any index has arrived in Elasticsearch quickly by using:
curl -XGET localhost:9200/_cat/indices
which will get you the list of all indices in ES.

You have some examples here on how to use the file input in Logstash.
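As a rough sketch of what pointing Logstash at the alerts file could look like (the path and index name below are assumptions, not from the Wazuh docs -- adjust them to your setup):

```
# Minimal Logstash pipeline sketch: tail alerts.json and ship to ES.
input {
  file {
    path => "/var/ossec/logs/alerts/alerts.json"  # assumed OSSEC alerts path
    codec => "json"                               # each line is one JSON alert
    start_position => "beginning"                 # also read pre-existing lines
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]                   # assumed local ES
    index => "wazuh-alerts-%{+YYYY.MM.dd}"        # assumed index naming scheme
  }
}
```

Once Logstash runs with a config along these lines, the index it creates should show up in "GET _cat/indices" and can then be used as a Kibana index pattern.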

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.