Can I substring an event field's value and use it in the Elasticsearch index name in my Logstash config?

The following is my config:

input {
  file {
    path => "/*.log"
    start_position => "beginning"
    sincedb_path => "/sincedb.log"
    exclude => ["*current.log", "*sincedb.log"]
  }
}

filter {
  json {
    source => "message"
    # add_field => { "new_field2" => "My name is %{controller}" }
    remove_field => [ "message" ]
  }
}

output {
  elasticsearch {
    hosts => ""
    index => "log_${???}"   # what I want to know
    codec => "json"
  }
}

  • input
    I'm reading multiple JSON lines from multiple files (read mode).
    Each JSON string includes a 'date' property (yyyy-MM-dd HH:mm:ss).

  • filter
    I parse the Logstash message field as the JSON source and then delete message.

  • output
    I want to insert each log into the 'log_2019.08.13' index in Elasticsearch.

I have been creating index names from the server date,
but sometimes the 'date' value in a log and the actual server time are not the same.

For example, a log's 'date' value is '2019-08-12 23:59:59',
but if the server time rolls over to the next day before that log is indexed,
the log ends up in 'log_2019.08.13', so my aggregation results are less accurate.

So I decided to build the date part of the index name ('yyyy.MM.dd' format) from the log's 'date' field.

How can I substring the value (dropping ' HH:mm:ss' to keep 'yyyy-MM-dd'),
join the parts (as "yyyy + '.' + MM + '.' + dd"),
and use the result in the index name in my Logstash config?

If you use a time reference in the index name, like "logstash-%{+YYYY.MM.dd}", then the YYYY, MM, and dd values are extracted from the [@timestamp] field of the event. So you just need a date filter to parse your date field into @timestamp.
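For example, a minimal sketch of the filter and output sections (the field name `date`, its `yyyy-MM-dd HH:mm:ss` format, and the `log_` index prefix are taken from the question; the empty `hosts` value and everything else about your setup are assumptions):

```
filter {
  json {
    source => "message"
    remove_field => [ "message" ]
  }
  date {
    # Parse the event's own 'date' field into [@timestamp].
    # If your timestamps are not UTC, also set the timezone option here.
    match => [ "date", "yyyy-MM-dd HH:mm:ss" ]
  }
}

output {
  elasticsearch {
    hosts => ""
    # YYYY, MM, dd are now taken from [@timestamp], i.e. from the log's
    # own date rather than the server clock at indexing time.
    index => "log_%{+YYYY.MM.dd}"
  }
}
```

With this, an event whose `date` is '2019-08-12 23:59:59' goes to 'log_2019.08.12' regardless of when the server actually indexes it.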


@Badger thx for your response.

I will test and reply later.

@Badger, your suggestion is working perfectly, as I intended. thx.
