Also, I want to ask: is it possible to use an Elasticsearch field value as the Redis key? If so, is it possible to save each document to Redis separately, one by one?
For example, I have 3 documents in Elasticsearch, like
```
name    type
apple   A
banana  B
orange  C
```
and I want to output them to Redis and save them like
```
key      value
apple    {"name":"apple","type":"A"}
banana   {"name":"banana","type":"B"}
orange   {"name":"orange","type":"C"}
```
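To make the idea concrete, here is a minimal pipeline sketch, assuming each document has a `name` field; the index name and Redis settings are placeholders. The redis output's `key` setting accepts Logstash's `%{fieldname}` interpolation and is evaluated per event, so each document would land under its own key. Note that the stock redis output writes to a Redis list or channel per key rather than doing a plain `SET`:

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "fruits"          # placeholder index name
  }
}
output {
  redis {
    host      => "localhost"
    data_type => "list"        # the plugin pushes to a list (or publishes to a channel)
    key       => "%{name}"     # per-event key taken from the document's "name" field
  }
}
```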
Does anyone know how to solve these problems?
Thanks
Kase
Hi @Badger,
thanks for your answer. After adding the plus sign, it changed to today's date.
I am still trying to send a document field value to the key. I tried key => "${+name}", but it didn't work.
Does anyone know how to do this, or is it simply not possible?
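In case it helps to spell out the syntax: Logstash interpolates event fields with `%{...}`, while `${...}` refers to environment variables, and a leading `+` (as in `%{+YYYY.MM.dd}`) is treated as a date-format pattern applied to the event's `@timestamp`, which is why adding the plus sign produced today's date. A hedged sketch of the key setting (Redis host and data type are placeholders):

```
output {
  redis {
    host      => "localhost"
    data_type => "list"
    # %{name}  interpolates the event's "name" field
    # ${NAME}  would look up an environment variable instead
    # %{+...}  applies a date format to @timestamp
    key => "%{name}"
  }
}
```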
Hi @Badger,
thanks for your help; that doc is really helpful.
But I am still stuck on outputting the field value. I have fields named flow.src_addr, flow.dst_addr, flow.src_port, and flow.dst_port. The doc says:
> If you are referring to a top-level field, you can omit the [] and simply use fieldname. To refer to a nested field, you specify the full path to that field: [top-level field][nested field]
so I tried key => "${[flow][src_addr]}#|${[flow][dst_addr]}#|${[flow][src_port]}#|${[flow][dst_addr]}", but it doesn't substitute the values. Is it because I have more than one document, so Logstash doesn't know which document's value to fill in?
What I really want to do is use these fields to save the documents to Redis separately.
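For reference, here is a sketch of that output using `%{...}` field interpolation instead of `${...}` for the nested fields; the `#|` delimiter is kept from above, and the last reference is written as `dst_port` on the assumption that repeating `dst_addr` was unintended:

```
output {
  redis {
    host      => "localhost"
    data_type => "list"
    # %{[flow][src_addr]} etc. interpolate nested event fields per event,
    # so each document is keyed by its own flow tuple
    key => "%{[flow][src_addr]}#|%{[flow][dst_addr]}#|%{[flow][src_port]}#|%{[flow][dst_port]}"
  }
}
```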