I want to build an ingestion pipeline that takes my lidar data from a Kafka topic and writes it into the currently active index.
I have created an index template that matches my index name and contains the required mappings (timestamp conversion, etc.), along with the following settings:
{
  "index": {
    "lifecycle": {
      "name": "lidar_ilm_policy",
      "rollover_alias": "lidar_alias"
    }
  }
}
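For completeness, the full index template request looks roughly like this (the template name lidar_template is just a placeholder I'm using here):

```json
PUT _index_template/lidar_template
{
  "index_patterns": ["lidar-*"],
  "template": {
    "settings": {
      "index": {
        "lifecycle": {
          "name": "lidar_ilm_policy",
          "rollover_alias": "lidar_alias"
        }
      }
    }
  }
}
```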
I have also created the ILM policy that handles the rollover for my indices (i.e. lidar-*): the policy rolls over to a new index every 5 minutes and deletes old indices after 10 minutes.
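To make the timing concrete, the policy I created looks roughly like this:

```json
PUT _ilm/policy/lidar_ilm_policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_age": "5m"
          }
        }
      },
      "delete": {
        "min_age": "10m",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
```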
How do I configure the Kafka connector so that it always writes to the alias?
Let's assume the connector has the following initial config:
{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "tasks.max": "12",
  "topics": "lidar_object_events",
  "connection.url": "<url>",
  "type.name": "_doc",
  "schema.ignore": "true",
  "key.ignore": "true",
  "flush.size": "10",
  "flush.timeout.ms": "10000"
}
What needs to be added to the connector's config? Is this doable in the first place?
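For context, one idea I was considering is renaming the topic to the alias with a RegexRouter single message transform, since the sink connector derives the target index from the (transformed) topic name — but I'm not sure this is the right approach (the transform name routeToAlias is just something I picked):

```json
"transforms": "routeToAlias",
"transforms.routeToAlias.type": "org.apache.kafka.connect.transforms.RegexRouter",
"transforms.routeToAlias.regex": ".*",
"transforms.routeToAlias.replacement": "lidar_alias"
```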