Elasticsearch-river-kafka not creating daily indexes

Hello,

We are transferring data through the pipeline Logstash > Kafka > elasticsearch-river-kafka > Elasticsearch (visualized with Kibana). While this works, daily indexes are not being created. Is there a way to have daily indexes created while using elasticsearch-river-kafka?

We are using the following river: https://github.com/mariamhakobyan/elasticsearch-river-kafka

Below are the steps we used:

1) Create the index:
curl -XPUT 'http://elasticsearch.nvnnas.com:9200/test-app-prod-index/'

2) Create the timestamp mapping:
curl -XPUT 'http://elasticsearch.nvnnas.com:9200/test-app-prod-index/logs/_mapping' -d '
{
  "logs" : {
    "properties" : {
      "ts" : { "type" : "date", "index" : "not_analyzed", "format" : "yyyy-MM-dd HH:mm:ss.SSS" }
    }
  }
}'
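
The mapping can be verified by reading it back with the standard 1.x mapping API:

curl -XGET 'http://elasticsearch.nvnnas.com:9200/test-app-prod-index/_mapping/logs?pretty'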

3) Enable the _timestamp and _ttl meta fields:
curl -XPUT 'http://elasticsearch.nvnnas.com:9200/test-app-prod-index/logs/_mapping' -d '
{
  "logs" : {
    "_timestamp" : { "enabled" : true, "type" : "date", "format" : "yyyy-MM-dd HH:mm:ss.SSS", "time_zone" : "CST" },
    "_ttl" : { "enabled" : true, "default" : "15d" }
  }
}'

4) The last step: create the river and associate it with the index and type created in the steps above.
curl -XPUT 'elasticsearch.nvnnas.com:9200/_river/kafka-river-1/_meta' -d '
{
  "type" : "kafka",
  "kafka" : {
    "zookeeper.connect" : "nvnkaf01v:2181,nvnkaf02v:2181,nvnkaf03v:2181",
    "zookeeper.connection.timeout.ms" : 10000,
    "topic" : "test-app-prod",
    "message.type" : "json"
  },
  "index" : {
    "index" : "test-app-prod-index",
    "type" : "logs",
    "bulk.size" : 100,
    "concurrent.requests" : 5,
    "action.type" : "index"
  }
}'
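
Since a river's _meta is stored as a normal document in the _river index, it can be read back to confirm the river registered:

curl -XGET 'http://elasticsearch.nvnnas.com:9200/_river/kafka-river-1/_meta?pretty'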

Rivers are deprecated; you should move to using Logstash with the Kafka input.
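
For reference, a minimal sketch of that setup, assuming a 1.x-era kafka input plugin (option names such as zk_connect and topic_id vary by plugin version, so check the docs for yours):

input {
  kafka {
    # older plugin versions consume via ZooKeeper; newer ones use bootstrap_servers/topics
    zk_connect => "nvnkaf01v:2181,nvnkaf02v:2181,nvnkaf03v:2181"
    topic_id   => "test-app-prod"
    codec      => json
  }
}
output {
  elasticsearch {
    host  => "elasticsearch.nvnnas.com"
    # the date pattern in the index name is what produces one index per day
    index => "test-app-prod-%{+YYYY.MM.dd}"
  }
}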

Thanks for your reply, Mark; we can go that route if needed. However, I'm still interested in finding out whether daily indexes are possible with the river.

I don't think you can.
You could create a river instance per day using a cron job and remove the old one? Something like the sketch below.
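
A rough, untested sketch of that rotation (index and river names are illustrative, and it reuses the settings from your river definition above):

#!/bin/sh
# run once a day from cron: point a fresh river at a dated index
TODAY=$(date +%Y.%m.%d)
YESTERDAY=$(date -d yesterday +%Y.%m.%d)   # GNU date; adjust on BSD/macOS

# create today's index (mappings could instead come from an index template)
curl -XPUT "http://elasticsearch.nvnnas.com:9200/test-app-prod-index-${TODAY}/"

# register a new river consuming into today's index
curl -XPUT "http://elasticsearch.nvnnas.com:9200/_river/kafka-river-${TODAY}/_meta" -d '
{
  "type" : "kafka",
  "kafka" : {
    "zookeeper.connect" : "nvnkaf01v:2181,nvnkaf02v:2181,nvnkaf03v:2181",
    "topic" : "test-app-prod",
    "message.type" : "json"
  },
  "index" : {
    "index" : "test-app-prod-index-'"${TODAY}"'",
    "type" : "logs",
    "bulk.size" : 100
  }
}'

# remove yesterday's river so messages stop flowing into the old index
curl -XDELETE "http://elasticsearch.nvnnas.com:9200/_river/kafka-river-${YESTERDAY}"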

But as Mark said, I would not invest in that, as rivers are removed in 2.0 and you will throw away all of that code.

Got it, thanks David and Mark!