Kafka Sink Connector to Elasticsearch

I am trying to set up an ingestion pipeline to an Elasticsearch cluster via the Kafka sink connector. A question I have: if I have a doc that has multiple JSON objects like this:

[
{"name":"abc", "company":"123","dept":"test"},
{"name":"def", "company":"456","dept":"dev"},
{"name":"ghi", "company":"654","dept":"qa"},
{"name":"jkl", "company":"567","dept":"test"},
{"name":"mno", "company":"786","dept":"qa"}
]

would the Kafka connector be able to parse each log line and ingest it as a separate document, or would it ingest the entire array as one doc?
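To illustrate what I mean by "separate documents" (just a local sketch, no Kafka involved): parsing the file yields one array, but each element would need to become its own Elasticsearch document.

```python
import json

# The array from the file above.
raw = """
[
{"name":"abc", "company":"123","dept":"test"},
{"name":"def", "company":"456","dept":"dev"},
{"name":"ghi", "company":"654","dept":"qa"},
{"name":"jkl", "company":"567","dept":"test"},
{"name":"mno", "company":"786","dept":"qa"}
]
"""

# json.loads returns a single Python list; what I want is each
# element indexed as a standalone document.
records = json.loads(raw)
for rec in records:
    print(json.dumps(rec))  # one JSON object per line
```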

The example in this only covers one doc, so I'm curious about what happens when a file contains multiple JSON objects. If it is possible, what tweaks need to be done?

Please help!

That cannot be indexed into Elasticsearch as a single document, so if the connector did not transform the data, it would result in an error. I suspect you may need to reach out to the Kafka community with questions about this.
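One common workaround is to split the array upstream, before the data reaches Kafka, so each object becomes its own record and the sink indexes it as its own document. A minimal sketch of the splitting step (the topic name and producer wiring in the comment are hypothetical, not something the connector does for you):

```python
import json

def split_array(payload: str) -> list:
    """Split a JSON array payload into one serialized message per element."""
    return [json.dumps(obj) for obj in json.loads(payload)]

payload = (
    '[{"name":"abc","company":"123","dept":"test"},'
    '{"name":"def","company":"456","dept":"dev"}]'
)

messages = split_array(payload)
# Each message could then be produced individually, e.g. with a Kafka
# client library (hypothetical wiring):
#   producer.send("employees", value=msg.encode("utf-8"))
for msg in messages:
    print(msg)
```

With one object per Kafka record, the sink connector has nothing to split and each record maps cleanly to one Elasticsearch document.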

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.