Since I'm new to Elasticsearch and Logstash, I wanted to ask what some of you would consider best practice for getting data from Redis to Elasticsearch via Logstash.
We have a sorted set in Redis, keyed by timestamp, with a JSON string as each member value, e.g.
We are using Redis as a kind of daily buffer for our measurement data, and there will be up to 5,000,000 records per day that should be transferred via Logstash.
We've tried a few things, and I just wanted to ask a few short additional questions:
Logstash is a continuously running service once started, right? So it constantly transfers data from Redis to Elasticsearch?
Since we have a JSON string, what is the best way to import it into Elasticsearch so that we can analyse every attribute in the JSON string?
What does an efficient Logstash config look like for such a task?
Logstash is a continuously running service once started, right? So it constantly transfers data from Redis to Elasticsearch?
Yes.
Since we have a JSON string, what is the best way to import it into Elasticsearch so that we can analyse every attribute in the JSON string?
If each message is a complete JSON object, you can use the json codec (codec => json) on the redis input, which is actually the default. Otherwise you can use the json filter.
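As an illustration, the two variants might look like this (the key name and host are placeholders, and this assumes the data is pushed onto a Redis list, since the redis input reads lists and channels rather than sorted sets):

```
# Variant 1: decode each message as JSON in the input (the default codec)
input {
  redis {
    host      => "127.0.0.1"      # placeholder Redis host
    data_type => "list"
    key       => "measurements"   # placeholder list key
    codec     => json             # one event per JSON message
  }
}

# Variant 2: if the message arrives as a plain string, parse it in a filter
filter {
  json {
    source => "message"   # field containing the raw JSON string
  }
}
```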
What does an efficient Logstash config look like for such a task?
You don't need to do anything special to make it efficient, especially not for a mere 5M msg/day. Just add a redis input and an elasticsearch output, plus any filters you might need, as in the sketch below.
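A minimal pipeline along those lines might look like this; the hosts, key, and index name are placeholder assumptions:

```
input {
  redis {
    host      => "127.0.0.1"      # placeholder Redis host
    data_type => "list"           # read events from a Redis list
    key       => "measurements"   # placeholder list key
    codec     => json             # default for this input, shown for clarity
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]             # placeholder Elasticsearch host
    index => "measurements-%{+YYYY.MM.dd}"  # optional: one index per day
  }
}
```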