Painless script to calculate elapsed time between two fields

Hi All,

I have two fields:
up: "2021-06-03T07:01:56.159Z"
down: "2021-06-03T10:01:29.978Z"

Following the above link and its solution, I tried:

doc['down'].value.millis - doc['up'].value.millis

but it shows "Script is invalid. View script preview for details" in the Painless scripted field tab.

Type : date
Format : Date

Please advise.

View script preview for details

What does it show in script preview?

There's nothing in the script preview. FYI, I am running 7.13.1 and it shows that

Scripted fields are deprecated

Attached is a screenshot of the script window.

You can see the script preview by clicking "Get help with the syntax and preview the results of your script". Could you click that and do a screenshot of the flyout that opens?

There's an error in your script

{
 "root_cause": [
  {
   "type": "script_exception",
   "reason": "runtime error",
   "script_stack": [
    "org.elasticsearch.index.fielddata.ScriptDocValues$Dates.get(ScriptDocValues.java:149)",
    "org.elasticsearch.index.fielddata.ScriptDocValues$Dates.getValue(ScriptDocValues.java:143)",
    "doc['down'].value.millis - doc['up'].value.millis",
    "                                    ^---- HERE"
   ],
   "script": "doc['down'].value.millis - doc['up'].value.millis",
   "lang": "painless",
   "position": {
    "offset": 54,
    "start": 0,
    "end": 67
   }
  }
 ],
 "type": "search_phase_execution_exception",
 "reason": "all shards failed",
 "phase": "query",
 "grouped": true,
 "failed_shards": [
  {
   "shard": 0,
   "index": "durations",
   "node": "WnIzR55T5iBqhPS2vP9lw",
   "reason": {
    "type": "script_exception",
    "reason": "runtime error",
    "script_stack": [
     "org.elasticsearch.index.fielddata.ScriptDocValues$Dates.get(ScriptDocValues.java:149)",
     "org.elasticsearch.index.fielddata.ScriptDocValues$Dates.getValue(ScriptDocValues.java:143)",
     "doc['down'].value.millis - doc['up'].value.millis",
     "                                    ^---- HERE"
    ],
    "script": "doc['down'].value.millis - doc['up'].value.millis",
    "lang": "painless",
    "position": {
     "offset": 54,
     "start": 0,
     "end": 67
    },
    "caused_by": {
     "type": "illegal_state_exception",
     "reason": "A document doesn't have a value for a field! Use doc[<field>].size()==0 to check if a document is missing a field!"
    }
   }
  }
 ]
}

The "reason" under caused_by tells you what's going on:

A document doesn't have a value for a field! Use doc[<field>].size()==0 to check if a document is missing a field!

Can't put that into better words.
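As the message itself suggests, a defensive version of the script would guard against missing values before reading them (a sketch; returning 0 for incomplete documents is an arbitrary choice, pick whatever sentinel suits your visualization):

```painless
// Skip documents that are missing either field, as the error message advises
if (doc['up'].size() == 0 || doc['down'].size() == 0) {
    return 0;
}
return doc['down'].value.millis - doc['up'].value.millis;
```

This only silences the runtime error; it does not by itself produce a meaningful duration if the two timestamps never occur on the same document.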

I doubt that, as the "up" and "down" fields have timestamps in them.

A scripted field is executed once per document. As you can see in your screenshot, every document either has up or down, but never both. A scripted field can't access values from different documents.

oh, i see now!

I was trying to calculate the time difference between two timestamps. Isn't this the proper way to do it using a Painless script?

These fields already have timestamps in them. If this isn't possible, can you please guide me to another way to obtain this?

Much appreciated!

Anyone?

You need to merge the documents so up and down are fields on the same document. One way to do this is using transforms with the latest method: Transforming data | Elasticsearch Guide [7.13] | Elastic
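For illustration, a latest transform collapses documents sharing a unique_key down to the most recent one, ordered by a sort field. A sketch against the durations index from this thread (the session_id key and @timestamp sort field are hypothetical; this index would need some such fields to exist):

```json
PUT _transform/merge_up_down
{
  "source": { "index": "durations" },
  "dest":   { "index": "durations_merged" },
  "latest": {
    "unique_key": [ "session_id" ],
    "sort": "@timestamp"
  }
}
```

Note that latest keeps whole documents; to end up with both up and down on a single document you may instead need a pivot transform that groups by the key and takes a max aggregation over each timestamp field.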

Thanks for the suggestion.

The example given in the docs for transforms is about grouping. In my case, I want to merge or group up and down (which are timestamps) into a single document.

Right, good point. Is there any field in your documents that could be used for grouping?

Unfortunately, for this particular index, I don't.

I am still trying to use the elapsed filter on these, but the problem is with running Logstash with a single pipeline worker. With worker set to 1, elapsed joins start and end tags from different files in different directories, which gives me false time differences between start and end.

Is there a way to make Logstash reset the elapsed filter once a file is done being read and start fresh on a new file, without joining the start tag from a previous directory/file?

If we can do this, then I could get the desired differences. This looks more like a Logstash issue than a Kibana one to me!
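One direction worth trying (a sketch, not tested here): the elapsed filter pairs start and end events by the value of its unique_id_field, so deriving that field from the source file path should prevent events from different files from being joined. The tag names and the [log][file][path] field below are assumptions about your pipeline; the classic file input may expose the path as a top-level path field instead:

```
filter {
  mutate {
    # hypothetical: use the source file path as the correlation key
    add_field => { "file_key" => "%{[log][file][path]}" }
  }
  elapsed {
    start_tag       => "process_started"
    end_tag         => "process_ended"
    unique_id_field => "file_key"
    timeout         => 600
  }
}
```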