I'm sorry if this is covered elsewhere, but I have been having trouble getting this to work.
I have a field in a log that Filebeat sends to Elasticsearch. The log is NDJSON formatted.
The field is called time and contains an epoch timestamp with fractional seconds, formatted like so: "1679266923.890941"
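For context, a single log line looks roughly like this (other fields trimmed; the field names besides time are just placeholders):
{"time": "1679266923.890941", "level": "info", "msg": "example event"}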
I first tried doing the following:
PUT _ingest/pipeline/id
{
  "processors": [
    {
      "date": {
        "field": "time",
        "formats": [ "epoch_second" ]
      }
    }
  ]
}
This does not produce the correct date; the parsed date ends up several years before the actual date of the timestamp.
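For reference, the behaviour should be reproducible without Filebeat by running a sample document through the pipeline with the simulate API in Kibana Dev Tools (the test document below only carries the time field):
POST _ingest/pipeline/id/_simulate
{
  "docs": [
    {
      "_source": {
        "time": "1679266923.890941"
      }
    }
  ]
}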
So I tried this:
PUT _ingest/pipeline/id
{
  "processors": [
    {
      "convert": {
        "field": "time",
        "type": "long"
      }
    },
    {
      "date": {
        "field": "time",
        "formats": [ "epoch_second" ]
      }
    }
  ]
}
This version produces an error on the Filebeat side.
Any suggestions would be very helpful.
Thanks.