Hello,
I was using a scripted field to derive a date from the system.uptime.duration.ms field value, with the Painless script below.
String val = "system.uptime.duration.ms";
// Guard against the field being absent from the mapping or the document;
// doc[val] never returns null, so containsKey() is the correct existence check
if (doc.containsKey(val) && doc[val].size() != 0) {
    long now = doc["@timestamp"].value.toInstant().toEpochMilli();
    // Subtracting the uptime duration from "now" gives the boot time in epoch millis
    long elapsedTime = now - doc[val].value;
    return elapsedTime;
}
I wanted to migrate away from this scripted-field solution without being forced to remember to re-add extra fields every time I update Beats.
I tried two Painless scripts in the index's default ingest pipeline.
1.) This one sets the value from a ZonedDateTime object, but Elasticsearch dynamically maps it as a keyword, which breaks my dashboard.
if (ctx?.system?.uptime?.duration?.ms != null) {
    // Event timestamp as epoch millis
    ZonedDateTime zdt = ZonedDateTime.parse(ctx['@timestamp']);
    long now = zdt.toInstant().toEpochMilli();
    // Subtracting the uptime duration gives the boot time
    long elapsedTime = now - ctx.system.uptime.duration.ms;
    Instant instant = Instant.ofEpochMilli(elapsedTime);
    // A second variable is needed here; re-declaring zdt would not compile
    ZonedDateTime since = ZonedDateTime.ofInstant(instant, ZoneId.of('Z'));
    ctx['system.uptime.since'] = since;
}
2.) This one stores the value as a long, but I cannot set a date format for it in the index pattern, so the date column just shows a big number.
if (ctx?.system?.uptime?.duration?.ms != null) {
    ZonedDateTime zdt = ZonedDateTime.parse(ctx['@timestamp']);
    long now = zdt.toInstant().toEpochMilli();
    // Boot time in epoch millis, stored as a long
    long elapsedTime = now - ctx.system.uptime.duration.ms;
    ctx['system.uptime.since'] = elapsedTime;
}
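For reference, this is roughly how I register the second script as an ingest pipeline (a minimal sketch; the pipeline id `uptime-since` is a placeholder, not my actual setup):

```
PUT _ingest/pipeline/uptime-since
{
  "description": "Derive system.uptime.since from system.uptime.duration.ms",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": "if (ctx?.system?.uptime?.duration?.ms != null) { ZonedDateTime zdt = ZonedDateTime.parse(ctx['@timestamp']); long now = zdt.toInstant().toEpochMilli(); ctx['system.uptime.since'] = now - ctx.system.uptime.duration.ms; }"
      }
    }
  ]
}
```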
Is there a way, other than manually adding a mapping to the index template, to tell Elasticsearch that this field should be recognized as a date rather than a long or keyword?
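For context, the manual workaround I am trying to avoid would look something like this (a sketch only; the template name and index pattern are placeholders):

```
PUT _index_template/metricbeat-uptime
{
  "index_patterns": ["metricbeat-*"],
  "template": {
    "mappings": {
      "properties": {
        "system.uptime.since": { "type": "date" }
      }
    }
  }
}
```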
Elasticsearch/Kibana 7.17.5