The indices which match this index pattern don't contain any time fields

Hello,

I need assistance in creating an index pattern so I can search logs in Kibana. I have the following Logstash configuration:

input {
  beats {
    port => 5044
    host => "0.0.0.0"
  }
}

filter {

  json {
    source => "message"
  }

  mutate {
    convert => {
      "startTime" => string
    }
  }
 
 date {
    match => [ "startTime" , "yyyy-MM-dd'T'HH:mm:ss'.'SSS'Z'" ]
    timezone => "UTC"
    target => "@timestamp"
 }

 mutate {
    remove_field => [ "startTime", "@version", "tags", "message", "ecs", "agent", "input", "host" ]
  }
}

output {
  elasticsearch {
    hosts => "${es_host}"
    user => "${es_user}"
    password => "${es_pwd}"
    index => "xxx-development-%{+YYYY.MM.dd}"
    ilm_enabled => true
    ilm_rollover_alias => "xxx-development"
    ilm_policy => "xxx-development"
  }
}

I have the following date field defined in the index template:

"properties": {
        "@timestamp": {
          "type": "date"
        },
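
For context, the template is one I manage myself; trimmed down, it is roughly of this shape (a sketch from memory, assuming the legacy _template API and showing only the parts relevant here):

PUT _template/xxx-development
{
  "index_patterns": ["xxx-development-*"],
  "settings": {
    "index.lifecycle.name": "xxx-development",
    "index.lifecycle.rollover_alias": "xxx-development"
  },
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date"
      }
    }
  }
}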

An example log message read from Filebeat before being sent to Logstash looks like this:

{"startTime":"2021-12-02T05:56:04.696Z","level":"FATAL","serviceName":"ABC","pid":3674,"logId":"App Unhandled Rejection","data":"blah" ,"ServicePid":3674}}}

Essentially, the startTime field is being sent as @timestamp as per the above Logstash config. I have previously created index patterns for other templates using the same method, but this one does not work.
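
For reference, my understanding is that after the filters above, the document reaching Elasticsearch should look roughly like this (Filebeat metadata such as log.file.path omitted):

{
  "@timestamp": "2021-12-02T05:56:04.696Z",
  "level": "FATAL",
  "serviceName": "ABC",
  "pid": 3674,
  "logId": "App Unhandled Rejection",
  "data": "blah",
  "ServicePid": 3674
}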

When I go to Kibana -> Index Patterns -> Create index pattern, I can see matching sources for xxx-development-*; however, on step 2 it picks up no time fields.

What can I do?

Hi @damienhaynes,

I'd start by checking the index mappings in Elasticsearch; you can do this from Dev Tools in Kibana:

GET /xxx-development-*/_mapping
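
If the full mapping output is too large to read through, it may be easier to check just the timestamp field (assuming it is named @timestamp):

GET /xxx-development-*/_mapping/field/@timestamp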

Hi @dosant,

thanks for the reply. Here is an example of what gets returned for one of the indices:

The bulk of the fields are underneath data, but it is too big to post here.

Is there something specific I should look for? It lists the @timestamp field as type 'date', so that part looks good.

Cheers,
Damien

Hello there,

Can you see data being indexed into the @timestamp field?

Hi @can.ozdemir, how can I do this? Is there a specific GET request I can make on an index?

I'm not sure how to query what is in a log without Kibana.

You can do:

GET /xxx-development-*/_search

and check whether the @timestamp field is populated.
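
For example, something along these lines should show whether recent documents actually have a value in @timestamp (just a sketch; adjust the size as you like):

GET /xxx-development-*/_search
{
  "size": 5,
  "_source": ["@timestamp", "logId"],
  "sort": [ { "@timestamp": "desc" } ]
}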

Here is what _mapping returns for an empty index:

{
  "XXX-development-2022.01.05-000004" : {
    "mappings" : {
      "dynamic" : "true",
      "_source" : {
        "includes" : [ ],
        "excludes" : [ ]
      },
      "dynamic_date_formats" : [
        "strict_date_optional_time",
        "yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z"
      ],
      "dynamic_templates" : [
        {
          "message_field" : {
            "path_match" : "message",
            "match_mapping_type" : "string",
            "mapping" : {
              "norms" : false,
              "type" : "text"
            }
          }
        },
        {
          "string_fields" : {
            "match" : "*",
            "match_mapping_type" : "string",
            "mapping" : {
              "fields" : {
                "keyword" : {
                  "ignore_above" : 256,
                  "type" : "keyword"
                }
              },
              "norms" : false,
              "type" : "text"
            }
          }
        }
      ],
      "date_detection" : true,
      "numeric_detection" : false,
      "properties" : {
        "@timestamp" : {
          "type" : "date"
        },
        "@version" : {
          "type" : "keyword"
        },
        "geoip" : {
          "dynamic" : "true",
          "properties" : {
            "ip" : {
              "type" : "ip"
            },
            "latitude" : {
              "type" : "half_float"
            },
            "location" : {
              "type" : "geo_point"
            },
            "longitude" : {
              "type" : "half_float"
            }
          }
        }
      }
    }
  },

This is quite a bit different from my working staging/production one (possibly because I have made some manual edits to the index template in an attempt to get it to work), e.g.:

"XXX-staging-2022.01.01-000337" : {
    "mappings" : {
      "dynamic_templates" : [
        {
          "message_field" : {
            "path_match" : "message",
            "match_mapping_type" : "string",
            "mapping" : {
              "norms" : false,
              "type" : "text"
            }
          }
        },
        {
          "string_fields" : {
            "match" : "*",
            "match_mapping_type" : "string",
            "mapping" : {
              "fields" : {
                "keyword" : {
                  "ignore_above" : 256,
                  "type" : "keyword"
                }
              },
              "norms" : false,
              "type" : "text"
            }
          }
        }
      ],
      "properties" : {
        "@timestamp" : {
          "type" : "date"
        },
        "@version" : {
          "type" : "keyword"
        },
        "geoip" : {
          "dynamic" : "true",
          "properties" : {
            "ip" : {
              "type" : "ip"
            },
            "latitude" : {
              "type" : "half_float"
            },
            "location" : {
              "type" : "geo_point"
            },
            "longitude" : {
              "type" : "half_float"
            }
          }
        }
      }
    }
  },

Here is an example of an entry:

    {
        "_index" : "xxx-development-2022.01.04-000001",
        "_type" : "_doc",
        "_id" : "-yqOIn4B1SKzgg2f1LA3",
        "_score" : 1.0,
        "_source" : {
          "data" : {
            "CurrentTS" : 1641257288608,
            "LagMS" : 490,
            "StartTS" : 1641257288118,
            "PreviousTS" : 0,
            "Counter" : 0
          },
          "serviceType" : "VENUE",
          "logId" : "[LAG] Exceeds lag time",
          "pid" : 1375,
          "serviceName" : "blah",
          "log" : {
            "file" : {
              "path" : "/home/ubuntu/xxx/Logs/VENUE_blah.trace.2022-01-04-00.log"
            }
          },
          "@timestamp" : "2022-01-04T00:48:08.609Z",
          "level" : "DEBUG"
        }
      },

Is that a timestamp format it can recognise?

Yes, that's a valid format. I tried to reproduce what you are experiencing on a local cluster; I used a file input instead, with the same mappings and everything, and I can confirm Kibana lets me choose @timestamp for my index pattern. I cannot figure out what's wrong here :confused:

Maybe your index template overrides the index mappings. Can you make sure you can run a range query (or aggregation) against the @timestamp field?

I can try. Do you mind giving me an example command? (Sorry, I've never done that before, so I'm not sure how.)

You can try this; it should return documents from the last two days:

GET /xxx-development*/_search
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "now-2d",
        "lte": "now"
      }
    }
  }
}
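
If you want to double-check that the field behaves as a date, a min/max aggregation over it should also come back with readable timestamps (same index pattern, just a sketch):

GET /xxx-development*/_search
{
  "size": 0,
  "aggs": {
    "oldest": { "min": { "field": "@timestamp" } },
    "newest": { "max": { "field": "@timestamp" } }
  }
}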

Thanks, that returned results as expected, e.g.:

{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 4,
    "successful" : 4,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 10000,
      "relation" : "gte"
    },
    "max_score" : 1.0,
    "hits" : [
      {
....

I would expect more than 10 hits though; not sure what happened to the rest.

The first total you see (under _shards) is the number of shards used to get this response. The document count is under the hits.total.value part of the response, and with "relation" : "gte" it means you have at least 10,000 matching results.

This confirms your @timestamp is valid and can be used as a date. I cannot tell why Kibana refuses to pick it up as a time field during index pattern creation. Have you tried creating the pattern as xxx-development* instead of xxx-development-*? Maybe the ILM rollover alias confuses Kibana.

Note: if you meant that you only see 10 results, that's the default size of query responses; you can change it by adding a size parameter next to the query object, as in the sketch below.
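
For example, something like this (the same range query, just asking for more hits back):

GET /xxx-development*/_search
{
  "size": 50,
  "query": {
    "range": {
      "@timestamp": {
        "gte": "now-2d",
        "lte": "now"
      }
    }
  }
}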

I just tried xxx-development* (which picks up the alias as well), but I still get "The indices which match this index pattern don't contain any time fields." on step 2 :(.

A few things that I have not mentioned:

  • I previously had an index with this name but I removed it and started again (i.e. I cleaned up the indices, index template and ILM policy, roughly as sketched below).
  • the reason I deleted everything is that I was getting conflicting fields that I could not fix by refreshing. When inspecting the index pattern, it was reporting conflicts on indices that had been cleaned up months ago!
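
The cleanup was roughly along these lines (from memory, so the exact calls may have differed slightly):

DELETE /xxx-development-*
DELETE _template/xxx-development
DELETE _ilm/policy/xxx-development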

Note: if I try to continue after step 2 I just get a blank page, so I don't do that anymore until I can fix the timestamp issue.

What do you mean by:

if I try to continue after step 2 I just get a blank page

Even if you don't choose a time field, Kibana's Discover should show you the documents.

So, when I say I proceed past step 2, this is what I see (nothing):

If I try to reload or select it from the index pattern page, same thing.

If I bring up the browser dev console (F12), I see:

Something is completely screwed up, not sure what to check next. Any suggestions for diagnostics?

Hey there,

can you post the result for:

GET _cat/templates/your-template-name
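
If that only shows the pattern, order and version, the full template body would also be useful; depending on whether it is a legacy or a composable template, one of these should return it:

GET _template/your-template-name
GET _index_template/your-template-name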

This is what I get:

xxx-development [xxx-development-*] 0 20211223

I cannot think of a way to diagnose this issue any further, I am sorry. But the Kibana error you got when creating the index pattern is concerning; maybe try to solve that problem first.