Converting date from YYYY/MM/DD HH:MM:SS to YYYY-MM-DD

Hi,

I have nginx error logs of the form:

2015/09/30 22:19:38 [error] 32317#0: *23 [lua] responses.lua:61: handler(): Cassandra error: Error during UNIQUE check: Cassandra error: connection refused, client: 127.0.0.1, server: , request: "POST /consumers/ HTTP/1.1", host: "localhost:8001"

As mentioned here, I am able to parse these logs.

Can someone let me know how I can create a field containing the log timestamp in the format YYYY-MM-DD? Per the above example, I want a field log_day with the value 2015-09-30.

The filters that I am using are shown below:

filter {  
  grok {
      match => {
        "message" => [
          "%{DATESTAMP:mydate} \[%{DATA:severity}\] (%{NUMBER:pid:int}#%{NUMBER}: \*%{NUMBER}|\*%{NUMBER}) %{GREEDYDATA:mymessage}",
          "%{DATESTAMP:mydate} \[%{DATA:severity}\] %{GREEDYDATA:mymessage}",
          "%{DATESTAMP:mydate} %{GREEDYDATA:mymessage}"
        ]
      }
      add_tag => ["nginx_error_pattern"]
    }

    if ("nginx_error_pattern" in [tags]) {      
      grok {
        match => {
          "mymessage" => [
            "server: %{DATA:[request_server]},"
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "host: \"%{IPORHOST:[request_host]}:%{NUMBER:[port]}\""
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "request: \"%{WORD:[request_method]} %{DATA:[request_uri]} HTTP/%{NUMBER:[request_version]:float}\""
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "client: %{IPORHOST:[clientip]}",
            "client %{IP:[clientip]} "
          ]
        }        
      }

      grok {
        match => {
          "mymessage" => [
            "referrer: \"%{DATA:[request_referrer]}\""
          ]
        }       
      }      

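      # grab the leading date portion of mydate (still slash-separated) into app_log_time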
      grok {
        match => {
          "mydate" => [
            "^(?<app_log_time>%{DATE_EU})"
          ]
        }
      }
    }
}

What's the purpose of the log_day field? Just making sure we're attempting to solve the right problem.

Side notes:

  • I suspect server: %{DATA:[request_server]}, is matching more than you'd like it to.

  • Merging the current grok expressions into a single expression should improve performance (see the first sketch after this list).

  • You may want to consider using the kv filter for parsing this data (see the second sketch after this list).
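
For the merge, a minimal sketch — this assumes client, server, request and host always appear in that order and with that punctuation, which matches your sample line; the optional referrer would still need its own grok:

grok {
  match => {
    "mymessage" => [
      "client: %{IPORHOST:[clientip]}, server: %{DATA:[request_server]}, request: \"%{WORD:[request_method]} %{DATA:[request_uri]} HTTP/%{NUMBER:[request_version]:float}\", host: \"%{IPORHOST:[request_host]}:%{NUMBER:[port]}\""
    ]
  }
}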
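
And a rough kv sketch, since the tail of the message is already "key: value" pairs. Note that trim_key/trim_value are the option names in recent kv filter versions (older ones spell them trimkey/trim), and the free-text error portion before client: will produce junk keys, so you may want to feed kv only the tail of the message:

kv {
  source => "mymessage"
  field_split => ","
  value_split => ":"
  trim_key => " "
  trim_value => " "
}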

If you do not plan to use the date filter to replace @timestamp with the value of mydate (and then use the sprintf syntax %{+yyyy-MM-dd}), you might want to try out this filter, which is pending review for integration into logstash-plugins but is already published to rubygems: https://rubygems.org/gems/logstash-filter-date_formatter
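
For reference, the date-filter route would look roughly like this (a sketch assuming you are happy to overwrite @timestamp; the sprintf %{+...} syntax formats @timestamp with the given Joda pattern):

filter {
  date {
    # parse mydate and store the result in @timestamp
    match => [ "mydate", "yyyy/MM/dd HH:mm:ss" ]
  }
  mutate {
    # %{+yyyy-MM-dd} renders @timestamp as a date-only string
    add_field => { "log_day" => "%{+yyyy-MM-dd}" }
  }
}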

You first have to parse your date string into a date field, then use this date field in the date_formatter:

filter {
  date {
    match => [ "mydate", "yyyy/MM/dd HH:mm:ss" ]
    target => "my_date_as_date"
  }
  date_formatter {
    source => "my_date_as_date"
    target => "log_day"
    pattern => "yyyy-MM-dd"
  }
}
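
Since date_formatter is not bundled with Logstash, you would need to install the gem first (assuming a Logstash 1.5/2.x layout; adjust the path to your install):

bin/plugin install logstash-filter-date_formatter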

Thanks @wiibaa