How can I extract a sub-field from a field and print it as a separate field in filebeat?

"msg":"{"appName":"abc","eventCategory":"Authentication event","eventType":"Operator record change","id":"12345","ipAddress":"0.0.1.1","nodeID":"nodeabc","operation":"update","operatorID":"admin","operatorRecID":"DATAADMIN","operatorRecName":"Administrator Copy","requestorIdentity":"121212","tenantID":"shared","timeStamp":"Tue 2023 May 23, 10:17:47:593"}"

I need to extract the "timeStamp" field from this whole msg object, and print it as an independent field.

Have you tried decode_json_fields processor?

Hi @varunsingla and welcome to the community!

Additional context would be helpful so that the community can best provide insight.

How are you ingesting these logs?

As @kcreddy suggested, these appear to be JSON logs, so you may want to look into that processor's documentation.

I tried:

  - decode_json_fields:
      fields: ["msg.timeStamp"]
      target: ""
      max_depth: 1
      overwrite_keys: false
      process_array: false
      add_error_key: true

but got this error:

  "log.logger":"truncate_fields","log.origin":{"file.name":"actions/decode_json_fields.go","file.line":109},"message":"Error trying to GetValue for field : msg.timeStamp in event : &{2023-05-24 10:46:27.702919726 -0500 CDT m=+14.072969149 {} decode_json_fields=msg.timeStamp}: expected map but type is string","service.name":"filebeat","ecs.version":"1.6.0"}

Hello @eMitch, I am not using Logstash or Elasticsearch; I am just ingesting logs through filebeat.inputs:

filebeat.inputs:
  - type: filestream
    id: my-filestream-id
    enabled: true
    paths:
      - /opt/tomcat/logs/SecurityEvent.log

Try fields: ["msg"] instead, then see if you can access timeStamp as a field.
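A sketch of the corrected configuration based on this suggestion (the option values are carried over from the earlier attempt):

```yaml
processors:
  - decode_json_fields:
      fields: ["msg"]       # decode the whole string field, not a sub-path inside it
      target: ""            # merge the decoded keys into the event root
      max_depth: 1
      overwrite_keys: false
      process_array: false
      add_error_key: true
```

The earlier error ("expected map but type is string") happened because `msg` is a JSON-encoded string, so `msg.timeStamp` cannot be addressed until `msg` itself is decoded. With `target: ""`, the decoded keys, including `timeStamp`, land at the top level of the event.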

Thanks @eMitch! Following your suggestion, I used ["msg"] under fields, and it gave me the desired result. 🙂

Now I am looking for help with converting the timeStamp to epoch format, and the ipAddress to a particular format as well.

Could you please suggest how that can be done?

I tried using the convert processor to convert ipAddress to the "long" type, but it started returning a blank value instead:

  - convert:
      fields:
        - {from: "ipAddress", to: "device.ip4", type: "long"}

Please suggest.

What is your expected output for the IP if converting to "long"? For example, if the IP is 192.168.1.1, what do you expect the final output to be?

If you're not moving this data into Elasticsearch, then you may want to check out the script processor and write some custom JavaScript to do those conversions.

192.168.1.1 should be converted to 3232235777
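The script processor idea above can be sketched in plain ES5 JavaScript, the dialect Filebeat's script processor runs. This is only a sketch: the target field names `device.ip4` and `epochTimeStamp` are illustrative, and the timestamp is assumed to be UTC because the log format carries no timezone.

```javascript
// Helpers intended for a Filebeat script processor `source` block.
// Only ES5 syntax is used, since that is what the processor supports.

// "192.168.1.1" -> 3232235777 (unsigned 32-bit value of the IPv4 address)
function ipToLong(ip) {
  var parts = ip.split(".");
  if (parts.length !== 4) return null;
  var n = 0;
  for (var i = 0; i < 4; i++) {
    var octet = parseInt(parts[i], 10);
    if (isNaN(octet) || octet < 0 || octet > 255) return null;
    n = n * 256 + octet;
  }
  return n;
}

// "Tue 2023 May 23, 10:17:47:593" -> epoch milliseconds.
// The log format has no timezone, so UTC is assumed here.
var MONTHS = { Jan: 0, Feb: 1, Mar: 2, Apr: 3, May: 4, Jun: 5,
               Jul: 6, Aug: 7, Sep: 8, Oct: 9, Nov: 10, Dec: 11 };
function tsToEpochMs(ts) {
  var t = ts.split(" "); // ["Tue", "2023", "May", "23,", "10:17:47:593"]
  if (t.length !== 5) return null;
  var year = parseInt(t[1], 10);
  var month = MONTHS[t[2]];
  var day = parseInt(t[3].replace(",", ""), 10);
  var hms = t[4].split(":"); // [hh, mm, ss, ms]
  if (month === undefined || hms.length !== 4) return null;
  return Date.UTC(year, month, day,
                  parseInt(hms[0], 10), parseInt(hms[1], 10),
                  parseInt(hms[2], 10), parseInt(hms[3], 10));
}

// Entry point the script processor calls for every event.
function process(event) {
  var ip = event.Get("ipAddress");
  if (ip) {
    var n = ipToLong(ip);
    if (n !== null) event.Put("device.ip4", n);
  }
  var ts = event.Get("timeStamp");
  if (ts) {
    var ms = tsToEpochMs(ts);
    if (ms !== null) event.Put("epochTimeStamp", ms);
  }
}
```

To use it, paste these functions into the `source` option of a `- script:` processor (with `lang: javascript`); `process(event)` is the entry point the processor calls for each event, and `event.Get`/`event.Put` read and write event fields.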

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.