Filter to add fields from nested JSON Name/Value fields

Hi,
In Logstash I'm trying to create new fields from nested JSON, where the value of "Name" becomes part of the field name and "Value" becomes the value of the new field.

For example, here is part of an audit log.

data.DeviceProperties
[
  {
    "Name": "OS",
    "Value": "Windows 10"
  },
  {
    "Name": "BrowserType",
    "Value": "Firefox"
  },
  {
    "Name": "IsCompliantAndManaged",
    "Value": "False"
  }
]

Now from this I wish to see 3 new fields being present in Kibana, namely:
data.DeviceProperties.OS: "Windows 10"
data.DeviceProperties.BrowserType: "Firefox"
data.DeviceProperties.IsCompliantAndManaged: False

I hope someone has a solution or a suggestion for how I can achieve this in Logstash.

You could do something like this.
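The snippet from this reply is not preserved in the thread; below is a minimal sketch of one such approach, assuming `DeviceProperties` already arrives as an array of hashes with `Name` and `Value` keys (as in the sample above):

```
filter {
  ruby {
    code => '
      props = event.get("DeviceProperties")
      if props
        # Remove the array first so the nested fields set below
        # are not deleted afterwards.
        event.remove("DeviceProperties")
        props.each { |x|
          # Bracket notation creates nested fields, which Kibana
          # displays as DeviceProperties.OS, DeviceProperties.BrowserType, etc.
          event.set("[DeviceProperties][#{x["Name"]}]", x["Value"])
        }
      end
    '
  }
}
```

Note the use of Logstash's `[outer][inner]` field-reference syntax rather than a field name containing a literal dot.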

Thank you, it looks like it might be exactly what I was looking for. I'll try it and let you know if it works.

Hi, I'm getting an unexpected error I can't seem to resolve.

[2022-03-30T10:58:44,651][ERROR][logstash.codecs.json     ][beats-server][219d0691fbdb75c864a5b62548deb7f17646341a375b877ee0e35512e7ef3280] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected end-of-input within/between Object entries
 at [Source: (String)"{"CreationTime": "2021-11-08T10:10:14", "Id": "x", "Operation": "UserLoggedIn", "OrganizationId": "x", "RecordType": 15, "ResultStatus": "Success", "UserKey": "x", "UserType": 0, "Version": 1, "Workload": "AzureActiveDirectory", "ClientIP": "x", "ObjectId": "x", "UserId": "x", "AzureActiveDirectoryEventType": "[truncated 24 chars]; line: 1, column: 1049]>, :data=>"{\"CreationTime\": \"2021-11-08T10:10:14\", \"Id\": \"x\", \"Operation\": \"UserLoggedIn\", \"OrganizationId\": \"x\", \"RecordType\": 15, \"ResultStatus\": \"Success\", \"UserKey\": \"x\", \"UserType\": 0, \"Version\": 1, \"Workload\": \"AzureActiveDirectory\", \"ClientIP\": \"x\", \"ObjectId\": \"x\", \"UserId\": \"x\", \"AzureActiveDirectoryEventType\": 1, \"ExtendedProperties\":"}

The full log it is trying to parse:

{"CreationTime": "2021-11-08T10:10:14", "Id": "x", "Operation": "UserLoggedIn", "OrganizationId": "x", "RecordType": 15, "ResultStatus": "Success", "UserKey": "x", "UserType": 0, "Version": 1, "Workload": "AzureActiveDirectory", "ClientIP": "x", "ObjectId": "x", "UserId": "x", "AzureActiveDirectoryEventType": 1, "ExtendedProperties":
[{"Name": "ResultStatusDetail", "Value": "Success"}, {"Name": "UserAgent", "Value": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.0 Safari/605.1.15"}, {"Name": "RequestType", "Value": "OAuth2:Authorize"}], "ModifiedProperties": [], "Actor": [{"ID": "x", "Type": 0}, {"ID": "x", "Type": 5}], "ActorContextId": "x", "ActorIpAddress": "x", "InterSystemsId": "x", "IntraSystemId": "x", "SupportTicketId": "", "Target": [{"ID": "x", "Type": 0}], "TargetContextId": "x", "ApplicationId": "x", "DeviceProperties": [{"Name": "OS", "Value": "MacOs"}, {"Name": "BrowserType", "Value": "Safari"}, {"Name": "IsCompliantAndManaged", "Value": "False"}, {"Name": "SessionId", "Value": "x"}], "ErrorNumber": "0", "customerName": "x"}

The content of the logstash.conf as of now:

input {
  pipeline { address => "office365" }
}

filter {
   json {
     source => "DeviceProperties"
   }
   ruby {
     code => '
       props = event.get("DeviceProperties")
       if props
         # Remove the array first so the nested fields set below
         # are not deleted afterwards.
         event.remove("DeviceProperties")
         props.each { |x|
           # The sample entries use "Name"/"Value" as keys, and bracket
           # notation creates nested fields rather than a field name
           # containing a literal dot.
           event.set("[DeviceProperties][#{x["Name"]}]", x["Value"])
         }
       end
     '
   }
}


output {
  elasticsearch {
  }
}

And pipelines.yml

- pipeline.id: beats-server
  config.string:
    input {
      beats {
        port => 5046
        codec => "json"
      }
    }

    output {
      pipeline {
        send_to => "office365"
      }
    }


- pipeline.id: office365
  path.config: "/etc/logstash/conf.d/logstash.conf"

What I've tried so far is changing the input codec from "json" to "json_lines", and I've also tried it without specifying the json source in the filter, but unfortunately to no avail.

Hope you can see what I'm doing wrong here!

That is two lines. The json codec parses each line independently, so it sees an incomplete object and fails with "Unexpected end-of-input". You are going to need to combine the lines into a single document before using a json filter or codec. You will probably want to solve that upstream in the beat.
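If the events are shipped by Filebeat reading a log file, one way to join the split lines upstream is Filebeat's multiline settings. A sketch, assuming every event starts with `{"CreationTime"` (the path and pattern here are assumptions, not from the thread):

```yaml
# filebeat.yml (sketch; adjust path and pattern to your setup)
filebeat.inputs:
  - type: log
    paths:
      - /var/log/o365/audit.log   # hypothetical path
    # Any line that does NOT start a new JSON object is treated as a
    # continuation of the previous line and appended to it.
    multiline.pattern: '^\{"CreationTime"'
    multiline.negate: true
    multiline.match: after
```

With the lines recombined, each event reaches Logstash as one complete JSON document and the json codec can parse it.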

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.