Get Month From Date/Time Field [Painless]

Hello,
Following this example:

When using doc['timestamp'].value.getMonth(); I'm getting an error
[Possibly causing a node crash, still monitoring]
doc['timestamp'].value.getMonthValue(); works as expected and returns the month number.

What is the way to get month names, without using SWITCH / CASE?

I tried some other functions, like getMonths():

if (!doc.containsKey('timestamp') || doc['timestamp'].empty)
    return 'None';
else
    return doc['timestamp'].value.getMonths();

I also tried getShortMonths();

I'm getting errors:

    "type": "illegal_argument_exception",
     "reason": "dynamic method [org.elasticsearch.script.JodaCompatibleZonedDateTime, getMonths/0] not found"

Cheers!

I believe the answer is what you already tried, or at least that's what works for me: getMonth()

How do you know it's causing a crash? Can you test this in Painless Lab to see if it works there for you?

Painless Lab

ZonedDateTime zdt = ZonedDateTime.parse(params.date); 

return zdt.getMonth();

Parameters

{
  "date": "2019-04-10T18:07:50.022Z"
}

Result

APRIL
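
getMonth() returns a java.time.Month enum, which is why the result shows up as the uppercase constant name. If you want it as a plain string instead, calling toString() on it should work (a small sketch of the same Lab script):

// same params as above; toString() turns the Month enum into a String like "APRIL"
ZonedDateTime zdt = ZonedDateTime.parse(params.date);
return zdt.getMonth().toString();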

@aaron-nimocks
Sorry for my ignorance. What is Painless Lab?

Cheers!

OK, found it :slight_smile:
It works!

Can I use painless lab with data from my index?

Can I use painless lab with data from my index?

I don't believe you can yet. Would be nice though!

But you should be able to use getMonth() in your script. If it is causing a crash we would be very interested in trying to replicate that error.

And how is this possible?
The month is January, but it's returning December?

What is the exact syntax?
Should I use ZonedDateTime?

Using this script

return doc['closed_at'].value.getMonth();

Returns this error

{
 "root_cause": [
  {
   "type": "node_disconnected_exception",
   "reason": "[Node1][IP:9300][indices:data/read/search[phase/query]] disconnected"
  }
 ],
 "type": "search_phase_execution_exception",
 "reason": "all shards failed",
 "phase": "query",
 "grouped": true,
 "failed_shards": [
  {
   "shard": 0,
   "index": "my-index",
   "node": "_RdeNIWkQqKnpzSVXDKhug",
   "reason": {
    "type": "node_disconnected_exception",
    "reason": "[Node1][IP:9300][indices:data/read/search[phase/query]] disconnected"
   }
  }
 ]
}

And the nodes are definitely not disconnected; I have 5 nodes up and running.

return doc['timestamp'].value.getMonth(); gives me the correct output and works in a Scripted Field. You don't need to set the ZonedDateTime type since it is reading the timestamp field directly. I had to in my example, otherwise Painless Lab would treat it as a String.

I am not sure why that is returning that error for you, but that's the correct way to do the query. Hopefully someone else who has seen that error before can jump in.
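
If it helps, here is a minimal sketch of the whole Scripted Field with the same empty-field guard you used earlier in the thread (assuming timestamp is mapped as a date; toString() keeps the return type a String, matching the 'None' branch):

// guard against documents that don't have the field
if (!doc.containsKey('timestamp') || doc['timestamp'].empty) {
    return 'None';
}
// doc values of a date field come back as a ZonedDateTime
return doc['timestamp'].value.getMonth().toString();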

Thanks @aaron-nimocks

Any idea about the wrong numbers being returned?
Should I convert the date/time field to the local timezone? How?
It is showing the correct date/time, but when extracting the month it returns the previous one. As I am in UTC+11, Elasticsearch might think it is the previous day?!

A bit confused here...

Cheers!

It could be timezone related, but I checked and couldn't replicate what you are seeing. As far as I can tell, the getMonth() function doesn't look at timezones at all.

Are you able to share a screenshot of your entire Scripted Field?

Also can you give me an example of data in your timestamp field?

This is the script at the moment (ideally I will want to return the month name):

doc['closed_at'].value.getMonthValue();

Sample data is like the screenshot above.
And here it is again:

Also, when I use the GET API (look at closed_at):

{
  "_index" : "my-index",
  "_type" : "_doc",
  "_id" : "__ENC__NDQ1YTAwNDIxYjE1MjhkMDg1NDZjYmZmMWQ0YmNiMWM=-ZmI0YWNjMDIxYjE1MjhkMDg1NDZjYmZmMWQ0YmNiYjM=",
  "_version" : 105,
  "_seq_no" : 95276,
  "_primary_term" : 1,
  "found" : true,
  "_source" : {
    "closed_at" : "2021-01-01T00:00:31.000+11:00",
    "inc_u_incident_type" : "TYPE",
    "made_sla" : "true",
    "sla" : "SLA",
    "priority" : "Priority",
    "inc_number" : "INC000000",
    "sys_id" : "__ENC__NDQ1YTAwNDIxYjE1MjhkMDg1NDZjYmZmMWQ0YmNiMWM=-ZmI0YWNjMDIxYjE1MjhkMDg1NDZjYmZmMWQ0YmNiYjM=",
    "inc_contact_type" : "Event",
    "contact_type" : "Event",
    "sla_progress" : 0.0,
    "stage" : "Completed",
    "company" : "Company",
    "state" : "Closed",
    "has_breached" : "HAS_BREACHED",
    "inc_parent_incident" : "",
    "timestamp" : "2021-02-02T12:54:41.394+11:00"
  }
}

What is your mapping for closed_at?

Also, try this in your Scripted Field:

ZonedDateTime zdt = ZonedDateTime.parse(doc['closed_at'].value); 

return zdt.getMonthValue();
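
(If closed_at turns out to be mapped as a date, doc['closed_at'].value should already come back as a ZonedDateTime rather than a String, so the parse step can probably be dropped, along these lines:)

// assuming closed_at is a date field, .value is already a ZonedDateTime
return doc['closed_at'].value.getMonthValue();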

This works well with numbers:

ZonedDateTime doc_closed_at_tz = doc['closed_at'].value.withZoneSameInstant(ZoneId.of('Australia/Sydney'));//AEDT
return doc_closed_at_tz.getMonthValue();

So YES, I need to convert to the local TZ.

But I am still getting errors with getMonth().
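
Since getMonth() keeps erroring in the Scripted Field, one possible route to the month name (untested, and assuming Month, TextStyle and Locale are whitelisted in Painless) is to go through the month number on the timezone-converted value:

// convert to the local TZ first, then map the month number to its English name, e.g. "January"
ZonedDateTime zdt = doc['closed_at'].value.withZoneSameInstant(ZoneId.of('Australia/Sydney'));
return Month.of(zdt.getMonthValue()).getDisplayName(TextStyle.FULL, Locale.ENGLISH);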

Cheers!

"mappings" : {
      "properties" : {
        "closed_at" : {
          "type" : "date",
          "fields" : {
            "keyword" : {
              "type" : "keyword"
            }
          }
        },
...
...

@aaron-nimocks

I got a reproduction of the node crashes.

The following script works in Painless Lab but fails in the scripted fields module.

DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
LocalDateTime localDateTime = LocalDateTime.parse(doc['closed_at.keyword'].value, formatter);
return localDateTime;

The first run shows an error:

{
 "root_cause": [
  {
   "type": "node_disconnected_exception",
   "reason": "[Node4][IP:9300][indices:data/read/search[phase/query]] disconnected"
  }
 ],
 "type": "search_phase_execution_exception",
 "reason": "all shards failed",
 "phase": "query",
 "grouped": true,
 "failed_shards": [
  {
   "shard": 0,
   "index": "my-index",
   "node": "wKXfVA24TgugQ_bCVs8wZA",
   "reason": {
    "type": "node_disconnected_exception",
    "reason": "[Node4][IP:9300][indices:data/read/search[phase/query]] disconnected"
   }
  }
 ]
}

The second run crashes 2 nodes.

Any advice?

I think I would create a new topic specifically about the failure; you might get someone who has a better idea to look at it.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.