Redis-Logstash-Elasticsearch

Hi everybody,

Since I'm new to Elasticsearch and Logstash, I wanted to ask what some of you would consider best practice for getting data from Redis into Elasticsearch via Logstash.

We have a time-sorted set in Redis with a JSON string as the member value, e.g.

timeset1, datetimescore, {'name':'name1', 'attr1':'attrvalue1' ... 'attr16':'attrvalue16' }

We are using Redis as a kind of daily buffer for our measurement data, and there will be up to 5,000,000 records per day that should be transferred via Logstash.

We've tried a few things, and I just wanted to ask a few additional short questions:

  • Logstash is a continuously running service once started, right? So it constantly transfers data from Redis to Elasticsearch?
  • Since we have a JSON string, what is the best way to import it into Elasticsearch so that every attribute of the JSON string can be analysed?
  • What does an efficient Logstash config look like for such a task?

Thanks for your support and help!

Best, Andy


Logstash is a continuously running service once started, right? So it constantly transfers data from Redis to Elasticsearch?

Yes.

Since we have a JSON string, what is the best way to import it into Elasticsearch so that every attribute of the JSON string can be analysed?

If each message is a complete JSON object, you can use the json codec (codec => json) for the redis input (which is actually the default). Otherwise, you can use the json filter.
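
For the filter case, a minimal sketch might look like the following. It assumes the raw JSON string ends up in the event's message field (the usual place when no json codec is used); adjust the source field name to your setup:

filter {
  json {
    # Parse the JSON string in the "message" field into top-level
    # event fields (name, attr1 ... attr16), so each attribute can be
    # queried and aggregated on individually in Elasticsearch.
    source => "message"
  }
}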

What does an efficient Logstash config look like for such a task?

You don't need to do anything special to make it efficient, especially not for a mere 5M msg/day. Just add a redis input and an elasticsearch output, plus any filters that you might need.

input {
  redis {
    host => "redishost.example.com"
    # The redis input reads from a Redis list or channel; key and
    # data_type both need to be set (the values here are placeholders).
    data_type => "list"
    key => "logstash"
    # codec => json is already the default for this input.
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
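
Once the config file is in place, starting Logstash with bin/logstash -f <path-to-config> brings the pipeline up, and it keeps running and shipping events from Redis to Elasticsearch until you stop it.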

Thanks a lot!!! I'll give it a try tomorrow! Thanks again! Best, Andy