JSON codec not sending data to Elasticsearch

INPUT: json

{"userid": 125,"type": "SELL"}
{"userid": 127,"type": "SELL"}

LOGSTASH CONF FILE:

input {
  kafka {
    bootstrap_servers => ""
    topics => ["topic1"]
    codec => "json"
  }
}

output {
  amazon_es {
    hosts => ["XXXX"]
    region => "XXXXX"
    aws_access_key_id => 'xxxxxxxxxxxx'
    aws_secret_access_key => 'xxxxxxxxxxx'
    index => "indexname"
  }

  stdout { codec => rubydebug }
}

stdout output:
{
        "userid" => 127,
      "@version" => "1",
    "@timestamp" => 2018-10-18T13:54:37.641Z,
          "type" => "SELL"
}

The output looks exactly like what I want, but the events never make it into the Elasticsearch index. If I do not use the json codec, the entire JSON document goes into ES as a single 'message' field.
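For reference, the same parsing can also be done with the json filter instead of the input codec. A minimal sketch, assuming the raw Kafka payload lands in the default message field:

input {
  kafka {
    bootstrap_servers => ""
    topics => ["topic1"]
    # no codec: the raw payload stays in the "message" field
  }
}

filter {
  # parse the JSON string in "message" into top-level event fields
  json {
    source => "message"
    # drop the raw string once it has been parsed
    remove_field => ["message"]
  }
}

Both approaches produce the same event structure shown above.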

Any help will be appreciated.

I figured out the problem. I was using a field called type in my JSON data, and Logstash used that field as the document _type in ES, which is no longer allowed in newer versions of Elasticsearch (an index can only have a single mapping type). That is why entries with type 'BUY' went in, while entries with type 'SELL' were rejected: the index had already been created with the first type.
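One way around this, sketched below, is to rename the conflicting field before the output stage so it is indexed as an ordinary field. The name trade_type is just an example; use whatever fits your mapping:

filter {
  # rename "type" so it is stored as a normal field
  # instead of being picked up as the document _type
  mutate {
    rename => { "type" => "trade_type" }
  }
}

Depending on the plugin version, explicitly setting document_type in the output block may be another option.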
