Hi there, I have a Logstash config like this:
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["xx"]
    client_id => "v1"
    group_id => "v1-g"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    id => "1"
    index => "xx"
    user => ""
    password => ""
  }
}
When I check the index, I see this mapping:
@timestamp: -------
message: "---"
My data is JSON like:
{
    "bss": 502,
    "psw": 40,
    "time": "2020-12-15T12:12:12",
    "name": "no name"
}
How can I decode the Kafka message as JSON (without Avro) and index its fields, instead of storing the whole message as a single string?
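For context, one common approach (a sketch based on the config above, not something tested against this setup) is to set the `json` codec on the kafka input so each message is parsed into top-level fields before it reaches the output:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["xx"]
    client_id => "v1"
    group_id => "v1-g"
    # Parse each Kafka message as JSON, so bss/psw/time/name
    # become individual fields in the event
    codec => "json"
  }
}
```

An alternative, if the codec cannot be changed, is a `json` filter with `source => "message"`, which parses the raw `message` field into fields on the event.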