How to parse an array of objects into separate fields

I have some log data that looks like this:

########2023-08-12#########

{‘crewrosters’: [ { ‘crew_roster’ : ‘det1’, 'empno': 1} , {‘crew_roster’ : ‘det2’, 'empno': 2} , {‘crew_roster’ : ‘det3’, 'empno': 3} ] }

I need to parse the data to get the values of the 'empno' and 'crew_roster' fields from each object separately.
Please suggest.

Thanks in advance!

I would suggest

    mutate { gsub => [ "message", "‘", '"', "message", "’", '"', "message", "'", '"' ] }
    json { source => "message" remove_field => [ "message" ] }
    split { field => "crewrosters" }

which results in three events like

{
 "@timestamp" => 2023-08-13T00:50:45.486366051Z,
   "@version" => "1",
"crewrosters" => {
    "crew_roster" => "det3",
          "empno" => 3
    }
}

Hi Badger,
Thanks for your reply. I tried the same.
I see that you applied the single-quote-to-double-quote conversion three times. The number of objects in my data is dynamic; it is not always three.

There is also a date_time field in my data, and a few date_time values are empty. When I replace ' (single quote) with " (double quote), the date_time field is not converted, and I get the error message below:
"Can not be converted from type [date] to [text]"

Thank you!

I am not converting it three times; I am globally converting three different characters: the left curly single quote, the right curly single quote, and the straight single quote. The number of objects in the array does not matter.
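Equivalently, since gsub patterns are regular expressions, the three substitutions can be collapsed into a single [field, pattern, replacement] triple using a character class:

    mutate {
      # gsub patterns are regexes, so one character class
      # matches all three quote characters at once
      gsub => [ "message", "[‘’']", '"' ]
    }

Either form works; the three-triple version just spells out each character separately.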

Is that a mapping exception from Elasticsearch? If the field is empty you will need to modify it in Logstash, either deleting the field or setting a default date.
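As a sketch of the delete-the-field option (assuming the field is called [date_time] and "empty" means an empty string; adjust the field reference to wherever the field actually sits in your events):

    if [date_time] == "" {
      mutate { remove_field => [ "date_time" ] }
    }

With the empty value removed, Elasticsearch no longer tries to index an empty string into a date-mapped field. Alternatively, a mutate+replace could set a sentinel default date instead of removing the field.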

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.