Issue parsing a huge JSON file

I have a huge JSON document in a log for one successful API call. I can see that response when no filter is applied over it (it appears as a string).

**Note:** small JSON documents are working as intended.

When I apply the json filter, the event never arrives at the Logstash output (I have tried writing the output to a file to confirm this).

Is there any Logstash configuration setting that needs to be taken care of?

My config:

input {
  kafka {
    bootstrap_servers => "server1:9092,server2:9092,server3:9092" # add the Kafka hosts here
    topics => ["xxxxxxxxxx"]
    codec => "json"
  }
}

filter {
  json {
    source => "message"
    skip_on_invalid_json => true
  }

  mutate {
    add_field => {
      "index_name" => "%{type}"
      "type_name"  => "%{type}"
    }
  }
}

output {
  if [type_name] == "xxxxxxx" {
    elasticsearch {
      hosts => ["xxxxxxxxxxx:80"]
      index => "%{[type_name]}-%{+YYYY.MM.dd}"
      document_type => "%{[type_name]}"
    }
    file {
      path => "/srv/logs/xxxxxxx.log"
      codec => line {
        format => "message: %{message}"
      }
    }
  }
}
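For confirming, I write events to a file as shown above. As a minimal sketch, a stdout output with the stock rubydebug codec would also print every event in full, tags included, which shows whether events reach the output stage at all:

output {
  # Print each event with all of its fields and tags, so it is easy to see
  # whether events make it past the filter stage and what they contain.
  stdout { codec => rubydebug }
}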

I'm confused. You have the json codec decoding the JSON messages from Kafka, then you also have the json filter to reparse the message field? Is there JSON inside of JSON here?
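To illustrate the point: if the Kafka payload is a single layer of JSON, the codec alone decodes it and there is normally no message field left for the json filter to parse; the filter only makes sense if the decoded event itself carries another JSON string in a message field. A minimal sketch of the two cases (servers and topic are the placeholders from the post above):

# Case 1: the payload is plain JSON once -- the codec decodes it,
# and no json filter is needed downstream.
input {
  kafka {
    bootstrap_servers => "server1:9092,server2:9092,server3:9092"
    topics => ["xxxxxxxxxx"]
    codec => "json"
  }
}

# Case 2: the decoded event carries another JSON string in [message] --
# keep the codec above and parse the inner document with the json filter.
filter {
  json {
    source => "message"
    skip_on_invalid_json => true
  }
}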
