Convert field containing a timestamp from string to a date / time or timestamp?

But I'm confused about where to define this.

Logstash? Elasticsearch? If Elasticsearch, where?

etc

In Kibana or using Curl. Below is a Curl example.

curl -X PUT "localhost:9200/my-index-000001/_mapping?pretty" -H 'Content-Type: application/json' -d'
{
  "properties": {
    "email": {
      "type": "keyword"
    }
  }
}
'
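
For the timestamp case it is the same call, just with the date type. A minimal sketch, where the field name is only a placeholder: note that this only adds new fields to a mapping, it cannot change the type of a field that already exists in the index.

# "my_timestamp_field" is a placeholder for your actual field name
curl -X PUT "localhost:9200/my-index-000001/_mapping?pretty" -H 'Content-Type: application/json' -d'
{
  "properties": {
    "my_timestamp_field": {
      "type": "date"
    }
  }
}
'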

It would be Kibana.

When I set up a Kibana index, it automatically detects it. Are you saying I need to "force it"?

You shouldn't have to force it if your data is coming in as you showed above. But it seems something in your index or index pattern is still setting it as text/keyword.

The best practice is to set the mapping yourself so the data is mapped the way you want. If you don't, Elastic will guess the type, and the guess isn't always what you want.

OK, but I want to make this clear: when you say I should set the mapping myself, where exactly do I set that up? That is what I'm confused about.

When I set up a Kibana index pattern, I set it to watch "myindex-data-*", which picks up myindex-data-20210302, myindex-data-20210301, etc.

All my data is received by Logstash and THEN gets sent to Elasticsearch. Elasticsearch doesn't receive data directly.

I apologize. I can see where I am confusing you, since you are using dynamically created indices rather than a static one.

You would create an index template in this case and include the pattern. Then any incoming index that matches that pattern will use those mappings.

PUT _index_template/template_name
{
  "index_patterns": ["myindex-data-*"],
  "template": {
    "mappings": {
      "properties": {
        "TimeOfConnection": {
          "type": "date"
        }
      }
    }
  }
}

I'm sorry I wasn't clear; there are some concepts in the Elastic Stack that I don't understand yet....

Is there a way to keep dynamically created indices but "force" some fields to a static type?

Currently, all my data comes in using a generic index template (honestly, I don't even know which one).

You probably don't have an index template in use right now. When one doesn't exist, Elastic will guess the mapping.

If you create the index template I put above, clear out your old mapping/index, and then ingest the data again, it should all work. You can add to the mapping if any other fields you are ingesting have the incorrect type, or you can leave it as is and Elastic will map them for you.
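
For example, clearing out the old data could look like this in Dev Tools (index name assumed from your pattern above; deleting indices is destructive, and if wildcard deletes are disabled on your cluster you will need to list the indices by name):

# Removes the old daily indices so they can be re-ingested with the new mapping
DELETE myindex-data-*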

I'm going to attempt to create one today and see if I can get it to work that way.

I'll try to post back updates.

OK, so I've attempted a few times to create an index template but seem to be failing.

In Kibana, I went to Stack Management, Data, Index Management, and then Index Templates. I created a template there with the mapping like you mentioned. But when I tried to make my Logstash conf file use that template, it says it cannot be found.

I'm even trying to find a way to see which index template is being applied to an index...

You can run GET _index_template to see all the templates. But if you never created one for this index, I doubt you will find it.
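
As a sketch, in Dev Tools that looks like this (the second request is optional; the index name in it is just an example based on your naming, and it shows which template, if any, would be applied to a new index with that name):

# List all composable index templates
GET _index_template

# Show which template would apply to a new index with this (example) name
POST _index_template/_simulate_index/myindex-data-20210302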

In Dev Tools run the below and it will create it for you. Change the template_name to something more unique.

PUT _index_template/template_name
{
  "index_patterns": ["myindex-data-*"],
  "template": {
    "mappings": {
      "properties": {
        "TimeOfConnection": {
          "type": "date"
        }
      }
    }
  }
}

So maybe:

PUT _index_template/myindex_template
{
  "index_patterns": ["myindex-data-*"],
  "template": {
    "mappings": {
      "properties": {
        "TimeOfConnection": {
          "type": "date"
        }
      }
    }
  }
}

And you are saying to paste it here?

This works without credentials?

In Kibana there is Dev Tools

I apologize for my blindness and stupidity.


Well ACK true.....Let me see....

Still getting it as text....

Did you delete your index and index pattern, run the template API above, run Logstash, and create a new index pattern? In that order.

Where are you seeing it as a string?
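
One way to check is the field mapping API in Dev Tools (index pattern and field name taken from earlier in the thread; adjust if yours differ):

GET myindex-data-*/_mapping/field/TimeOfConnection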

OK, so maybe I'm doing something wrong. Let's confirm the steps.

1: Stop Logstash

systemctl stop logstash

2: Delete data index:

3: Delete Kibana Index Pattern:

4: Run the template API:

5: Start Logstash again:

systemctl start logstash

6: That's it?

I do not have to change my Logstash conf?

Step 4 has to be done in Kibana Dev Tools Console, not the browser developer console.
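
For reference, steps 2 and 4 might look roughly like this in the Console (index and template names assumed from earlier in the thread); step 3 is done in Kibana under Stack Management, Index Patterns:

# Step 2: delete the existing data indices (destructive; only if you can re-ingest the data)
DELETE myindex-data-*

# Step 4: create the index template so new indices map TimeOfConnection as a date
PUT _index_template/myindex_template
{
  "index_patterns": ["myindex-data-*"],
  "template": {
    "mappings": {
      "properties": {
        "TimeOfConnection": {
          "type": "date"
        }
      }
    }
  }
}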