Logstash output to Redis with a dynamic value saved in the key

Hi there,
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-redis.html#plugins-outputs-redis-key
This doc says the key can take a dynamic name, like logstash-%{type}.
I have tried key => "netflow-%{YYYY.MM.dd}",
but when I check the Redis key it shows netflow-%{YYYY.MM.dd} literally; it doesn't change to netflow-2020.06.30.

Also, is it possible to use a field value from Elasticsearch as the key? If so, is it possible to save the documents to Redis one by one, separately?
For example, I have 3 documents in Elasticsearch, like

  name                       type
 apple                         A
 banana                        B
 orange                        C

and I want to output them to Redis and save them like this (a sketch of my current output config follows the table):

    key                       value
   apple                   {"name":"apple","type":"A"}
   banana                  {"name":"banana","type":"B"}
   orange                  {"name":"orange","type":"C"}
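
For reference, here is a rough sketch of the output section I am testing (the Redis host and data_type are just what I use locally):

 output {
   redis {
     host      => "127.0.0.1"              # local Redis, only for testing
     data_type => "list"                   # push each event onto a list
     key       => "netflow-%{YYYY.MM.dd}"  # this stays literal instead of becoming a date
   }
 }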

Does anyone know how to solve these problems?
Thanks
Kase

Does it work if you change that by adding a plus sign?

key => "netflow-%{+YYYY.MM.dd}"

Hi @Badger,
thanks for your answer. After adding the plus sign, it changed to today's date.
I am still trying to send a document field value into the key. I have tried key => "${+name}", but it didn't work.
Does anyone know how to do this, or is it not possible?

Read this.

If you want to set an output option to the value of a field [foo], then use

 option => "%{foo}"

If you want the output option to be set based on the [@timestamp] of an event, then use

option => "something-%{+YYYY.MM.dd}"

If you want an option to be set based on the value of an environment variable, then use

option => "${envVariable}"
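
For example, in a redis output those could look something like this (the field name and environment variable here are only placeholders):

 output {
   redis {
     host => "${REDIS_HOST}"                  # from an environment variable
     key  => "netflow-%{type}-%{+YYYY.MM.dd}" # a field value plus the event's @timestamp date
     data_type => "list"
   }
 }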

Hi @Badger,
thanks for your help, that doc is really helpful.
But I am still stuck on outputting a field value. I have fields named flow.src_addr, flow.dst_addr, flow.src_port, and flow.dst_port. I saw the doc says:

If you are referring to a top-level field, you can omit the [] and simply use fieldname. To refer to a nested field, you specify the full path to that field: [top-level field][nested field]

so I tried key => "${[flow][src_addr]}#|${[flow][dst_addr]}#|${[flow][src_port]}#|${[flow][dst_addr]}", but it doesn't change to the value. Is it because I have more than one document, so Logstash doesn't know which document's value to fill in?
What I really want to do is use these fields to save the documents to Redis separately.

Okay..... I found my mistake: I used "$" instead of "%". After I fixed this, it outputs the value.
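
For anyone who finds this later, the key line that works for me looks roughly like this (I use flow.dst_port as the last field, matching the field list above; the other redis options are unchanged):

 output {
   redis {
     host      => "127.0.0.1"
     data_type => "list"
     # %{...} is the sprintf field reference; ${...} is only for environment variables
     key       => "%{[flow][src_addr]}#|%{[flow][dst_addr]}#|%{[flow][src_port]}#|%{[flow][dst_port]}"
   }
 }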
