Parse JSON in log field to get individual fields for visualization

Hello,
I am new to Kibana and am having difficulty parsing the logs for visualization.
I have pasted a sample of my log field below.
I need to read the values inside the log to create visualizations in Kibana, e.g. where env is DEV, where transactionID is xyz, or where payload contains a certain value. Could someone help me with how this is done? Thank you!

Pasting sample log field here:

2021-12-08T14:02:49.899 INFO [bwEngThread:In-Memory Process Worker-1] c.t.b.p.g.L.E.LogMessage - {"env":"DEV","appName":"GenLogs","transactionID":"449d8241-392f-4877-a5ea-dddeebed3c29","timestamp":"1638972169775","srcApplication":"EPIC","operation":"testLog","type":"INFO","message":"New message logged.","payload":"<timer:TimerOutputSchema xmlns:timer=\"http://tns.tibco.com/bw/activity/timer/xsd/output\"><Now>1638972169328</Now><Hour>2</Hour><Minute>2</Minute><Second>49</Second><Week>50</Week><Month>12</Month><Year>2021</Year><Date>2021-12-08</Date><Time>2:02:49 PM</Time><DayOfMonth>8</DayOfMonth></timer:TimerOutputSchema>"}

You will need to use ingest processors, such as grok or dissect followed by json, to parse it all out. If you are ingesting through Logstash, you can also do it there.
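For reference, with a line shaped like the sample above, a dissect processor pattern could be as simple as this (the field names are just suggestions):

```
%{timestamp} %{log_level} [%{worker}] %{message_type} - %{json_message}
```

followed by a json processor on the json_message field.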

If you can paste your entire log message in text format, someone might be able to help you configure it. It should be text, not a screenshot.

Thank you for replying. I have updated my post with the log details now.

I am using fluent-bit and logstash to ingest.

Try this for a grok pattern:
%{DATE}T%{TIME} %{NOTSPACE:log_level} \[%{DATA:worker}\] %{DATA:message_type} - %{GREEDYDATA:json_message}

You should end up with something like this:

{
  "DATE": [
    [
      "21-12-08"
    ]
  ],
  "DATE_US": [
    [
      null
    ]
  ],
  "MONTHNUM": [
    [
      null,
      "12"
    ]
  ],
  "MONTHDAY": [
    [
      null,
      "21"
    ]
  ],
  "YEAR": [
    [
      null,
      "08"
    ]
  ],
  "DATE_EU": [
    [
      "21-12-08"
    ]
  ],
  "TIME": [
    [
      "14:02:49.899"
    ]
  ],
  "HOUR": [
    [
      "14"
    ]
  ],
  "MINUTE": [
    [
      "02"
    ]
  ],
  "SECOND": [
    [
      "49.899"
    ]
  ],
  "log_level": [
    [
      "INFO"
    ]
  ],
  "worker": [
    [
      "bwEngThread:In-Memory Process Worker-1"
    ]
  ],
  "message_type": [
    [
      "c.t.b.p.g.L.E.LogMessage"
    ]
  ],
  "json_message": [
    [
      "{"env":"DEV","appName":"GenLogs","transactionID":"449d8241-392f-4877-a5ea-dddeebed3c29","timestamp":"1638972169775","srcApplication":"EPIC","operation":"testLog","type":"INFO","message":"New message logged.","payload":"<timer:TimerOutputSchema xmlns:timer=\\"http://tns.tibco.com/bw/activity/timer/xsd/output\\"><Now>1638972169328</Now><Hour>2</Hour><Minute>2</Minute><Second>49</Second><Week>50</Week><Month>12</Month><Year>2021</Year><Date>2021-12-08</Date><Time>2:02:49 PM</Time><DayOfMonth>8</DayOfMonth></timer:TimerOutputSchema>"}"
    ]
  ]
}

Then you can send the json_message field through a JSON filter and break that apart even further so that you can get individual fields in the root of your json object.
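If you are doing this in Logstash, a minimal filter sketch using the pattern above could look like this (untested; adjust field names to taste):

```
filter {
  # Split the line into level, worker, logger name and the raw JSON tail
  grok {
    match => { "message" => "%{DATE}T%{TIME} %{NOTSPACE:log_level} \[%{DATA:worker}\] %{DATA:message_type} - %{GREEDYDATA:json_message}" }
  }
  # Parse the JSON tail; without a target, the fields land at the root of the event
  json {
    source => "json_message"
    remove_field => ["json_message"]
  }
}
```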

Hope this helps!

Thank you so much for taking time out for helping me. I will give this a try.

Hi Andres,

I had to remove Logstash from the stack as the customer wants only fluent-bit and Elastic.
I read that grok does not work with fluent-bit. Is there any other way to achieve what I want with just fluent-bit and Elastic?

Thank you

You can use ingest processors in Elasticsearch instead. Here is a simulation so you can see how it's done; run it in Dev Tools in Kibana.

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "...",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            """
              %{DATE}T%{TIME} %{NOTSPACE:log_level} \[%{DATA:worker}\] %{DATA:message_type} - %{GREEDYDATA:json_message}
            """
          ]
        }
      },
      {
        "json": {
          "field": "json_message",
          "add_to_root": true
        }
      },
      {
        "remove": {
          "field": [
            "message",
            "json_message"
          ]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": """
          2021-12-08T14:02:49.899 INFO [bwEngThread:In-Memory Process Worker-1] c.t.b.p.g.L.E.LogMessage - {"env":"DEV","appName":"GenLogs","transactionID":"449d8241-392f-4877-a5ea-dddeebed3c29","timestamp":"1638972169775","srcApplication":"EPIC","operation":"testLog","type":"INFO","message":"New message logged.","payload":"<timer:TimerOutputSchema xmlns:timer=\"http://tns.tibco.com/bw/activity/timer/xsd/output\"><Now>1638972169328</Now><Hour>2</Hour><Minute>2</Minute><Second>49</Second><Week>50</Week><Month>12</Month><Year>2021</Year><Date>2021-12-08</Date><Time>2:02:49 PM</Time><DayOfMonth>8</DayOfMonth></timer:TimerOutputSchema>"}
        """
      }
    }
  ]
}
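Once the simulation looks right, store it as a named pipeline (the name my-log-pipeline below is just an example) and point fluent-bit's es output at it via its Pipeline option. In Dev Tools:

```
PUT _ingest/pipeline/my-log-pipeline
{
  "description": "Parse bwEngThread log lines",
  "processors": [
    { "grok": { "field": "message", "patterns": ["%{DATE}T%{TIME} %{NOTSPACE:log_level} \\[%{DATA:worker}\\] %{DATA:message_type} - %{GREEDYDATA:json_message}"] } },
    { "json": { "field": "json_message", "add_to_root": true } },
    { "remove": { "field": ["message", "json_message"] } }
  ]
}
```

Then in the fluent-bit output config (host and port are placeholders):

```
[OUTPUT]
    Name      es
    Match     *
    Host      your-elasticsearch-host
    Port      9200
    Pipeline  my-log-pipeline
```

One caveat: the parsed JSON itself contains a message field ("New message logged."), and add_to_root will overwrite the original log line with it before the remove processor drops it. Rename one of the two in the pipeline if you want to keep that value.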

Thank you


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.