Add nested documents into Elasticsearch via Logstash

I have a problem: multiple events arrive from Kafka for the same message ID, and I want Logstash to push them into Elasticsearch as one document with a nested field. How can we index the data into Elasticsearch via Logstash as nested documents?

Can someone help with this?

Example:

  1. Msg_id - m1, event - msg_requested, timestamp - 1234
  2. Msg_id - m1, event - msg_received, timestamp - 1235
  3. Msg_id - m1, event - msg_delivered, timestamp - 1236

These events can arrive at different times; the first event for a given msg_id should create the parent document. I want the data inside Elasticsearch to look like:

"ts": "1234", "id": "m1", "sets": [ { "ts": "1234", "et": "msg_requested" }, { "ts": "1235", "et": "msg_received" }, { "ts": "1236", "et": "msg_delivered" } ] } `


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.