Logstash pipeline for Kafka messages

Hi team, I need some guidance preparing a Logstash pipeline for the message below. I am receiving this message from a Kafka topic and need to stash it into Elasticsearch through Logstash. I am able to do that, but I want to insert the JSON fields as separate fields in Elasticsearch, not as one complete message.

Sample Kafka message:

{
  "request": {
    "cardnumber": "4545454545454554",
    "servicename": "ENQUIRY",
    "funcation": "VTSservice",
    "uniqueID": 124578545421,
    "application": "XYZ",
    "timestamp": "2020-05-21T15:13:18.853+04:00",
    "messagetrace": "tesing in progress",
    "status": "success"
  }
}

Config file:

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["mytopic"]
  }
}

output {
  elasticsearch {
    hosts => ["XX.XXX.0.8:9200"]
    index => "kafkalogs"
  }
  stdout { codec => rubydebug }
}


Elasticsearch snippet:

[screenshot: the whole message stored in a single field]

I want Logstash to stash my message as individual fields in Elasticsearch.

I would like help preparing the config file with the necessary filters.

Use a json filter.
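A minimal sketch of the filter block, assuming the whole Kafka payload arrives in the default `message` field (field names other than `message` would need adjusting to your event):

```
filter {
  json {
    source => "message"
    # optional: drop the raw string once it has been parsed
    remove_field => ["message"]
  }
}
```

With this in place, the parsed keys become event fields, so the nested values are addressable as [request][cardnumber], [request][status], and so on, and are indexed as separate fields in Elasticsearch.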

Hi @badger, let me ask another way.
I have created an index in Elasticsearch with 5 fields: field1, field2, ..., field5.
The input JSON also has 5 fields: fieldA, fieldB, fieldC, ..., fieldE.

Now I want to map fieldA to field1 in a particular index, and I also want to do some data conversion.

How can I map the input JSON fields to the index fields?

You can use a mutate filter to rename fields and to do type conversions.
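A hedged sketch of that, assuming the event has already been parsed by a json filter so fields like [request][cardnumber] exist; the target names field1/field2 here are placeholders for your index's field names:

```
filter {
  mutate {
    # rename runs before convert within a single mutate filter
    rename => {
      "[request][cardnumber]" => "field1"
      "[request][uniqueID]"   => "field2"
    }
    # example conversion: store the numeric uniqueID as a string
    convert => {
      "field2" => "string"
    }
  }
}
```

Note that mutate applies its operations in a fixed order (rename before convert, among others), so renaming and then converting the new name in the same block works; if you need a different order, split the work across two mutate filters.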