Problem with JSON date types

I am generating custom logs formatted as single-line JSON with date fields and sending them to ES with Filebeat. However, when the index and index pattern are created, the date fields are mapped as strings and numbers instead of dates.

datetest.ps1

@{ Date = (Get-Date) } |  ConvertTo-Json -Compress | Out-File -Encoding utf8 -Force -Append file.json
@{ ToFileTimeUtc = (Get-Date).ToFileTimeUtc() } |  ConvertTo-Json -Compress | Out-File -Encoding utf8 -Force -Append file.json
@{ DateTime = (Get-Date).DateTime } |  ConvertTo-Json -Compress | Out-File -Encoding utf8 -Force -Append file.json

file.json

{"Date":{"value":"\/Date(1565580667972)\/","DisplayHint":2,"DateTime":"Monday, August 12, 2019 1:31:07 PM"}}
{"ToFileTimeUtc":132100542679805620}
{"DateTime":"Monday, August 12, 2019 1:31:07 PM"}

How can I get date types into ES?


I was able to get this partly working if I PUT a mapping into ES before sending any data.
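For illustration, an explicit mapping PUT might look like the following (the index name `my-logs` and field name `timestamp` are assumptions, not taken from the original post):

```json
PUT my-logs
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_millis"
      }
    }
  }
}
```

With this in place, documents whose `timestamp` field is either an ISO 8601 string or epoch milliseconds are indexed as dates.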

How can I format my data in the first place and let dynamic/auto mapping occur, rather than manually creating the mapping beforehand?

For timestamps you generally need a mapping, as JSON has no notion of time. Without a mapping, Elasticsearch treats the value as a string.
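That said, Elasticsearch's dynamic date detection is enabled by default and recognizes strings matching `strict_date_optional_time` (ISO 8601), so emitting the timestamp in that shape can let auto-mapping pick it up as a date. A minimal sketch against the same `file.json` pipeline (the field name `Timestamp` is an assumption):

```powershell
# Emit an ISO 8601 round-trip timestamp (.NET "o" format), which matches
# Elasticsearch's default dynamic date detection (strict_date_optional_time).
@{ Timestamp = (Get-Date).ToUniversalTime().ToString("o") } |
    ConvertTo-Json -Compress |
    Out-File -Encoding utf8 -Force -Append file.json
# e.g. {"Timestamp":"2019-08-12T17:31:07.9720000Z"}
```

Note this only helps for string dates; a raw `ToFileTimeUtc()` value is a .NET file time, not epoch milliseconds, so it will still be indexed as a plain number.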

One normally creates a template. Beats generate templates dynamically (adjusting settings to the Elasticsearch version being used).

Using setup.template.append_fields (see: Load the Elasticsearch index template), one can add extra mappings directly from within the Beats config.
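A hedged sketch of what that could look like in filebeat.yml (the field name `Timestamp` is assumed to match whatever your script emits):

```yaml
# filebeat.yml -- append an explicit date mapping to the generated template
setup.template.append_fields:
  - name: Timestamp
    type: date
```

Since the template is applied at index creation, this takes effect for new indices; existing indices keep their old mapping and would need to be reindexed.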