File output with data from field

I collect logs from various sources.
I replaced the Logstash timestamp with the timestamp from the machine logs so that events in Elasticsearch are time-based, and I added a "processed_at" field containing part of the Logstash timestamp.

This is my conf:

input {
  beats {
    port => 5044
  }
}

filter {
  mutate {
    add_field => { "processed_at" => "%{+YYYY-MM-dd_HH-mm}" }
    remove_field => ["beat", "tags"]
  }
  date {
    match => [ "sys_created_on", "dd/MM/yyyy HH:mm:ss" ]
  }
}

output {
  file {
    path => "/var/log/%{processed_at}-test.log"
  }
  elasticsearch {
    index => "test-%{+YYYY.MM.dd}"
    hosts => ["localhost:9200"]
  }
}

Now I want to create the Elasticsearch indices daily, but based on the "processed_at" date field instead of the replaced timestamp.
Can I use part of the "processed_at" field (without the hours and minutes), or do I need to create another field?

That (`%{+YYYY.MM.dd}`) will always reference @timestamp. If you want to build an index name from a different field, you could do it in a ruby filter using strftime.
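As a rough sketch of that suggestion: a ruby filter can format a date into a string field (here stored under `[@metadata]` so it is not indexed), which the elasticsearch output can then reference. The field name `[@metadata][index_day]` is just an example; this formats @timestamp, but you could call strftime on any parsed date field.

```
filter {
  ruby {
    # Format the event timestamp as e.g. "2017.05.18" and stash it in metadata
    code => "event.set('[@metadata][index_day]', event.get('@timestamp').time.strftime('%Y.%m.%d'))"
  }
}

output {
  elasticsearch {
    # Reference the formatted field instead of the %{+YYYY.MM.dd} sprintf syntax
    index => "test-%{[@metadata][index_day]}"
    hosts => ["localhost:9200"]
  }
}
```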

Thanks.

I thought of adding other fields like this:

mutate {
  add_field => { "processed_time" => "%{+YYYY-MM-dd_HH-mm}" }
  add_field => { "processed_data" => "%{+YYYY-MM-dd}" }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.