Grok filter not working with Filebeat log message

I'm using Filebeat as a log collector.

My log data:

2023-01-05T11:57:48.179Z [ERROR] ABC Company {"Pod": "7d45bf43lbr", "Service": "liveX", "failed to create patch:": "invalid JSON Document"}

I expected to get key/value pairs from the log line above, but everything arrived in a single field, so I added filters at the Logstash level. I think `message` is not being parsed by the filters, because in Kibana I can still see the entire log line in one field.
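To make the goal concrete, here is a minimal Python sketch (just an illustration, not part of my pipeline) of the key/value pairs I expect the filters to extract from the trailing JSON object:

```python
import json

# The raw log line as shipped by Filebeat (assumption: this is the exact text).
line = ('2023-01-05T11:57:48.179Z [ERROR] ABC Company '
        '{"Pod": "7d45bf43lbr", "Service": "liveX", '
        '"failed to create patch:": "invalid JSON Document"}')

# Split off the trailing JSON object and parse it, mirroring what I expect
# the grok + json filters to produce as separate fields.
prefix, json_part = line.split(' {', 1)
fields = json.loads('{' + json_part)

print(prefix)             # timestamp, level, and text portion
print(fields["Pod"])      # 7d45bf43lbr
print(fields["Service"])  # liveX
```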

I can't figure out my mistake. Please help me.

Filebeat config:

filebeat.inputs:
- type: filestream
  id: my-filestream-1
  enabled: true
  paths:
    - C:/beats.log

output.logstash:
  hosts: ["localhost:5044"]

Logstash config:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:time}\t+\[%{LOGLEVEL:loglevel}\]\t+%{DATA:textData}\t+%{GREEDYDATA:jsonMessage}" }
  }
  dissect {
    mapping => {
      "message" => "%{time}	[%{loglevel}]	%{textData}	%{jsonMessage}"
    }
  }
  json {
    source => "jsonMessage"
    remove_field => ["jsonMessage"]
  }
}


output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "stream1"
    user => "elastic"
    password => "YZwXX7fTYcCSw4K"
  }
  # stdout { codec => rubydebug }
  stdout { codec => plain { charset => 'UTF-16BE' } }
}

Thanks in advance.
