Logstash skips the beginning of an event and fails with a JSON parse error

[2024-06-27T07:38:58,131][ERROR][logstash.codecs.json     ][main][f606102f50879b5c431b527abd5f18d638386c850d0f511237fefcb5bd81e725] JSON parse error, original data now in message field {:message=>"incompatible json object type=java.lang.String , only hash map or arrays are supported", :exception=>LogStash::Json::ParserError, :data=>"\"EventStats\",\"Severity\":\"Medium\",\"autoguid\":\"1405da0b-e38c-4f5c-9336-f51b158c5321\",\"direction\":\"read\",\"pldseverity\":\"INFO\",\"***************\":\"04d12a23-b39e-4046-8a0f-0eef336ff22c\",\"failed\":\"no\",\"source\":\"[\\\"axl://okta-qa-china-1a-0.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-1.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-2.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092/alp.incident.raw/group0/0\\\",\\\"axl://okta-qa-china-1a-0.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-1.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-2.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092/alp.incident.nonopg/group0/0\\\"]\",\"tenant\":\"

These are the source logs being generated:


{ "@timestamp":"2024-06-25T05:27:59.693Z","log":{"level":"Information"}, "container":{"name":"uta","id":"alp-uta-1.2.500-***************"}, "labels": {"Description":"Statistical Information Per Event","TestClassID":"EventStats","Severity":"Medium","autoguid":"41e4a115-2270-4265-985a-25ce39e4434d","direction":"read","pldseverity":"INFO","***************":"23455-3186-49b4-8576-***********","failed":"no","source":"[\"axl://okta-qa-china-1a-0.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-1.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-2.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092/alp.incident.raw/group0/0\",\"axl://okta-qa-china-1a-0.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-1.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092,okta-qa-china-1a-2.okta-qa-china-1a-headdown.qa-china-databus.svc.cluster.local:9092/alp.incident.nonopg/group0/0\"]","tenant":"adfghh" ,"id":6963,"alpeventid":19108} }
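The error above is consistent with Logstash receiving only the tail of a log line that was cut partway through during rotation: the fragment in `:data=>` begins mid-string (`"EventStats",...`), not at the opening `{`. A quick sketch of why such a fragment fails, using Python's `json` module as a stand-in for the codec (the sample strings below are shortened for illustration):

```python
import json

# A complete log line like the one above parses to an object (a hash map),
# which the json codec accepts.
full_line = '{"@timestamp":"2024-06-25T05:27:59.693Z","log":{"level":"Information"}}'
event = json.loads(full_line)
print(type(event).__name__)  # dict

# A fragment cut mid-line is not valid JSON on its own: the parser first sees
# a bare string ("EventStats"), and the trailing data makes the input invalid.
# Logstash's json codec likewise rejects anything that is not a hash map or an
# array, which matches the "incompatible json object type=java.lang.String"
# message in the question.
fragment = '"EventStats","Severity":"Medium"'
try:
    json.loads(fragment)
except json.JSONDecodeError as e:
    print("parse error:", e.msg)
```

This suggests the problem is not with the log content itself but with where Logstash starts reading after the file is rotated.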

My pipeline configuration:

input {
  file {
    path => "test/test2/logs.*"
    start_position => "beginning"
    sincedb_path => "test/logs/.sincedb"
    type => "test2-stats"
    codec => json
  }
}

filter {
  if [type] == "test2-stats" {
    mutate {
      add_field => { "[@metadata][typenew]" => "%{[labels][TestClassID]}" }
    }
    mutate {
      lowercase => [ "[@metadata][typenew]" ]
    }
  }
}

output {
  if [type] == "test2-stats" {
    elasticsearch {
      hosts => ["************************"]
      template_overwrite => true
      ssl => true
      index => "test2-%{+YYYY.MM.dd}"
      document_id => "%{fingerprint}"
    }
  }
}

fingerprint.conf (note: Logstash field references use bracket notation, so nested fields should be written as [log][level] rather than log.level):
    filter {
      fingerprint {
        method => "SHA1"
        concatenate_sources => true
        source => ["@timestamp", "[log][level]", "[container][name]", "message"]
        target => "fingerprint"
      }
    }
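For reference, the fingerprint filter hashes the concatenated source values, so identical inputs always produce the same document_id (which deduplicates re-ingested events in Elasticsearch). A rough Python sketch of the idea; the exact concatenation format Logstash uses internally may differ, so this is illustrative only:

```python
import hashlib

def fingerprint(event, sources):
    # Join the selected field values and hash them with SHA1, mimicking
    # concatenate_sources => true (the "|" separator is an assumption here,
    # not Logstash's documented internal format).
    concatenated = "|".join(str(event.get(field, "")) for field in sources)
    return hashlib.sha1(concatenated.encode("utf-8")).hexdigest()

event = {"@timestamp": "2024-06-25T05:27:59.693Z", "message": "example"}
fp = fingerprint(event, ["@timestamp", "message"])
print(len(fp))  # 40-character hex digest; same event fields -> same id
```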

Sometimes I also receive the message below:

Rotation In Progress - inode change detected and original content is not fully read, file is closed and path points to new content
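That message means the file was rotated (renamed) and a new file was created at the same path before Logstash finished reading the old inode. On POSIX systems a rename keeps the inode, while the replacement file at the old path gets a new one; that inode change is what the file input detects. A small sketch of this (the file names are made up for the demo):

```python
import os
import tempfile

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "logs.current")  # hypothetical active log path
rotated = os.path.join(tmpdir, "logs.1")     # hypothetical rotated name

with open(path, "w") as f:
    f.write("old event\n")
old_inode = os.stat(path).st_ino

# Rotation by rename: the existing data keeps its inode under the new name...
os.rename(path, rotated)
print(os.stat(rotated).st_ino == old_inode)  # True

# ...while the freshly created file at the original path gets a different
# inode (the old one is still held by the rotated file), which is exactly
# the change Logstash's file input reports.
with open(path, "w") as f:
    f.write("new event\n")
print(os.stat(path).st_ino != old_inode)  # True
```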

I am not able to replicate this issue in the development environment, since I don't see inode changes (log rotation) happening there; Logstash parses the same logs without errors. Please help me out.