Not able to create index pattern in 6.2.2

Hello Kibana team,

Even after following the index creation documentation (the link provided in the Kibana interface and the one on the web), I'm not able to access the ES data in Kibana. I keep getting the warning "No default index pattern. You must select or create one to continue.".
FYI, I didn't have that much trouble getting indexing and visualizing working in 6.0, if that helps.

GET /_cat/indices?v

health status index                       uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   logstash-jobs               nHtULW65QWistGDCmqTUOA   5   1       2033            0      2.9mb          2.9mb
yellow open   metricbeat-6.2.2-2018.02.25 asLt2TcaR4e09Sg14G_Lrg   1   1      79231            0     13.3mb         13.3mb
yellow open   bank                        95IoLJkIQyG8ulWsPGardQ   5   1       1000            0    475.2kb        475.2kb
green  open   .kibana                     _lCtMjcKTa-kq9UDhfAJpw   1   0          1            0      3.7kb          3.7kb
yellow open   logstash-2013.12.11         5pd7IExpSmyNpUXKWGmiUQ   5   1          1            0     12.5kb         12.5kb
yellow open   logstash-2015.05.18         x3S3XIDsTYeMjzT3eVHy6w   5   1       4631            0       21mb           21mb
yellow open   metricbeat-6.2.2-2018.02.23 HYNnU_WLQzSBG71PehYLZw   1   1      33768            0      5.9mb          5.9mb
yellow open   logstash-2015.05.20         4hLt-qsgQP2rvHOGY5CMqw   5   1       4750            0       21mb           21mb
yellow open   logstash-2011.05.18         c3C9mCqSTIClbi957jP1Kg   5   1          2            0     26.4kb         26.4kb
yellow open   metricbeat-6.2.2-2018.02.24 uw4NdzCTRFOxhsXROeZvrA   1   1     137924            0     22.9mb         22.9mb
yellow open   logstash-2011.05.19         cX3Co3AgQa6jiuJTU-CyTg   5   1          1            0     13.8kb         13.8kb
yellow open   shakespeare                 WV0rqPgmTM6f8G9E1lTl5g   5   1     111396            0     21.3mb         21.3mb
yellow open   logstash-2015.05.19         98m0utEBTFWWvuLY3IadPg   5   1       4624            0     21.1mb         21.1mb
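
For reference, the query below shows which index-pattern saved objects exist in .kibana (same document structure as the workaround further down); I can share its output if useful:

GET /.kibana/_search
{
  "query": { "term": { "type": "index-pattern" } }
}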

I created the logstash-jobs index as follows:

PUT /logstash-jobs
{
  "mappings": {
    "doc": {
      "properties": {
        "@id": { "type": "integer" },
        "@title": { "type": "text" },
        "@description": { "type": "text" },
        "@company": { "type": "text" },
        "@city": { "type": "text" },
        "@state": { "type": "text" },
        "@salary": { "type": "text" },
        "@duration": { "type": "text" },
        "@link": { "type": "text" },
        "@provider": { "type": "text" },
        "@update": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" }
      }
    }
  }
}
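
For completeness, the mapping as Elasticsearch actually reports it can be checked with the call below (nothing assumed beyond the index name above); I can paste its output if useful:

GET /logstash-jobs/_mapping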

Some configuration parameters:

indexPattern:placeholder = logstash-*
timelion:es.timefield = @timestamp
timelion:es.default_index = _all

I'm not sure how to investigate this further.
Feel free to ask me for logs, actions, results, etc.

Thank you very much for your help
Best regards
Arnaud

Kibana version: 6.2.2
Elasticsearch version: 6.2.2
Server OS version: Centos7
Linux Elastic.newfrontierdata.com 3.10.0-693.11.6.el7.x86_64 #1 SMP Thu Jan 4 01:06:37 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
Browser version: Firefox 58.0.2 (64-bit)
Browser OS version: Windows 10
Original install method (e.g. download page, yum, from source, etc.):
Extracted .tar.gz for ELK 6.2.2
Description of the problem including expected versus actual behavior:

Steps to reproduce:

  • Connect to Kibana
  • Run GET /_cat/indices?v in the console: success
  • Navigate to Management/Index patterns: see "Couldn't find any Elasticsearch data. You'll need to index some data into Elasticsearch before you can create an index pattern. Learn how."

Errors in browser console (if relevant):
Couldn't find any Elasticsearch data
You'll need to index some data into Elasticsearch before you can create an index pattern. Learn how.

Provide logs and/or server output (if relevant):
No errors/warnings
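
If the connection settings matter, here are the relevant kibana.yml keys with their 6.2 defaults (a sketch, not a copy of my actual file; I can paste my real values on request):

# kibana.yml (6.2 defaults shown, not my actual values)
server.port: 5601
server.host: "localhost"
elasticsearch.url: "http://localhost:9200"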

In fact, I found a workaround for the index pattern issue, but I still haven't reached a stable situation.
Running:

POST /.kibana/doc/index-pattern:logstash-jobs
{
  "type": "index-pattern",
  "index-pattern": {
    "title": "logstash-jobs",
    "timeFieldName": "update"
  }
}

displayed the index in the index pattern list. I still don't see the others; can you tell me why? Do I have to add all the index patterns manually?
Another issue is that when I try to display any chart on this dataset, the dataset "seems" empty.
For example, I get a blank map when I try to count by state, or, even more telling, a metric with a global count that shows "0 count".
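
A plain count against the index, outside Kibana, is the obvious first check here (nothing assumed beyond the index name from the _cat output above):

GET /logstash-jobs/_count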

Something seems wrong with the mapping. Can you point me in the right direction?

Thank you very much
Best regards
Arnaud

More info:
I ran a search: GET /logstash-jobs3/_search. The data is not being displayed in the charts; could it be because the documents' fields are all inside the "_source": {} object?

{
  "took": 1,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 2033,
    "max_score": 1,
    "hits": [
      {
        "_index": "logstash-jobs3",
        "_type": "doc",
        "_id": "BRvOzmEB4Eh7pShpbcAB",
        "_score": 1,
        "_source": {
          "@timestamp": "2018-02-25T21:11:29.415Z",
          "id": "7",
          "@version": "1",
          "city": "Fayetteville",
          "path": "/home/elastic/logstash-6.2.2/config/file.csv",
          "title": "Title of the post",
          "host": "Elastic.newfrontierdata.com",
          "state": "AR",
          "duration": "22 days ago",
          "update": "2017-12-03 07:00:02",
          "description": "Here is some description of the post... [...]",

Thank you

Ok,

I worked around the "no data" in visualizations (0 counts) by transforming the data into JSON using this script:

import csv
import json

# Convert the CSV export into newline-delimited JSON (one document per line)
with open('jobs.csv', 'r') as csvfile, open('jobs.json', 'w') as jsonfile:
    fieldnames = ("id", "title", "description", "company", "city", "state",
                  "salary", "duration", "link", "provider", "update")
    reader = csv.DictReader(csvfile, fieldnames)
    for row in reader:
        json.dump(row, jsonfile)
        jsonfile.write('\n')

and then prepending a bulk action line to each document by running: sed -e 's/^/{ "index" : {} }\n/' -i jobs.json
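
After that step, each document line in jobs.json is preceded by its bulk action line, so the file looks roughly like this (field values taken from the sample document shown earlier, other fields omitted):

{ "index" : {} }
{"id": "7", "title": "Title of the post", "city": "Fayetteville", "state": "AR", "duration": "22 days ago", "update": "2017-12-03 07:00:02"}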

Then I changed the types of the fields I wanted to aggregate on to keyword (another issue I had, which was easy to fix).
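
For reference, the quickest check I know that a field really aggregates is a plain terms aggregation (sketch against the aja-jobs index I load below; output not shown here):

GET /aja-jobs/_search
{
  "size": 0,
  "aggs": {
    "by_state": { "terms": { "field": "state" } }
  }
}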

I finally ran the following command:
curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/aja-jobs/doc/_bulk?pretty' --data-binary @jobs.json

and that's it; it's working now.

I don't want to do that for future datasets. Can you tell me what's wrong with my Logstash configuration?
jobs.yml

cat jobs.yml
input {
        file {
                path => "/home/elastic/logstash-6.2.2/config/jobs.csv"
                start_position => "beginning"
                sincedb_path => "/dev/null"
        }
}
filter {
        csv {
                separator => ","
                columns => ["id", "title", "description", "company", "city", "state", "salary", "duration", "link", "provider", "update"]
        }
}
output {
        elasticsearch {
                hosts => "http://localhost:9200"
                index => "nfd-jobs"
                template => "/home/elastic/logstash-6.2.2/config/jobs.template.json"
        }
        stdout {}
}

jobs.template.json

{
        "template": "jobs",
        "settings" : {
                "number_of_shards" : 1,
                "number_of_replicas" : 1,
                "index" : {
                        "query" : { "default_field" : "id" }
                }
        },
        "mappings": {
                "_default_": {
                        "_all": { "enabled": false },
                        "_source": { "compress": true },
                        "dynamic_templates": [
                                {
                                        "text_template" : {
                                                "match" : "*",
                                                "mapping": { "type": "text", "index": "not_analyzed" },
                                                "match_mapping_type" : "text"
                                         }
                                }
                        ],
                        "properties" : {
                                "id" : { "type" : "integer"},
                                "title" : { "type" : "keyword"},
                                "description" : { "type" : "text"},
                                "company" : { "type" : "keyword"},
                                "city" : { "type" : "keyword"},
                                "state" : { "type" : "keyword"},
                                "salary" : { "type" : "text"},
                                "duration" : { "type" : "text"},
                                "link" : { "type" : "text"},
                                "provider" : { "type" : "keyword"},
                                "@update" : { "type" : "date", "format": "yyyy-MM-dd HH:mm:ss"}
                        }
                }
        }
}
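
In case it helps, the templates Elasticsearch actually has installed can be listed with the call below, which should show whether this template is ever applied to nfd-jobs (plain API call, nothing assumed):

GET /_template

My best guess so far is that the update column never reaches Elasticsearch as a real date. If that's the problem, I suppose a Logstash date filter would be needed, but I'm not sure it's the right fix, so treat this as a sketch only:

# Sketch only (untested guess): would go inside the filter {} block of jobs.yml,
# parsing the CSV "update" column as a date before indexing
date {
        match  => ["update", "yyyy-MM-dd HH:mm:ss"]
        target => "update"
}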

You shouldn't need to manually post to the Kibana index; in fact, I would highly advise against it. On the index pattern creation page, have you tried adding an index pattern? If so, what pattern are you attempting to add?
