Logstash: error when using the date filter

Hi, I'm setting up an ELK Stack to process some logs that are sent to us from AKAMAI. An example line:

2020-11-18 14:58:27 - - - - GET /unaurl.es/N3X3QDOPYNHGPO5R6ZYCEOWCNM.png - 200 1 36513 848 1 80 HTTP/1.1 "Mozilla/5.0 (X11; U; Linux x86_64; en-US) AkamaiImageServer VelocitudeMP/1.0;IM/1.0" "-" "-"

Config file:

input {
    file {
        path => "/var/data/logs/*"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    grok {
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match => {
            "message" => "%{TIMESTAMP_ISO8601:timestamp} %{IPV4:c-ip} %{USERNAME:cs-username} %{DATA:s-sitename} %{DATA:s-computername} %{CUSTOMIP:s-ip} %{WORD:cs-method} %{URIPATH:cs-uri-stem} %{CUSTOMURIPATH:cs-uri-query} %{NUMBER:sc-status} %{NUMBER:sc-win32-status} %{NUMBER:sc-bytes} %{NUMBER:cs-bytes} %{NUMBER:time-taken} %{CUSTOMPORT:s-port} %{DATA:cs-protocol} %{QS:cs-user-agent} %{DATA:cs-cookie} %{DATA:cs-referer}"
        }
    }
    date {
        match => ["timestamp", "YYYY-MM-dd HH:mm:ss"]
        target => "@timestamp"
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "index-%{+YYYY.MM.dd}"
    }
    stdout {
        codec => rubydebug
    }
}

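When a line fails to match, grok tags the event with `_grokparsefailure` (its default failure tag) and never creates the `timestamp` field, so the date filter has nothing to work on. A sketch of guarding the date filter with that tag, reusing the settings from the config above:

```
filter {
    # Only run the date filter on events grok parsed successfully;
    # "_grokparsefailure" is the grok filter's default failure tag.
    if "_grokparsefailure" not in [tags] {
        date {
            match => ["timestamp", "YYYY-MM-dd HH:mm:ss"]
            target => "@timestamp"
        }
    }
}
```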
Patterns file:

CUSTOMIP (?:%{IP}|-)
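The grok line also references CUSTOMURIPATH and CUSTOMPORT, which are not in the snippet above. Hypothetical definitions in the same "value-or-dash" style as CUSTOMIP (these are not from the original post) might look like:

```
CUSTOMURIPATH (?:%{URIPATH}|-)
CUSTOMPORT (?:%{POSINT}|-)
```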

The problem is that if I remove the date filter, all the fields are parsed, but then `@timestamp` is of course the time the file was read rather than the timestamp from the log line.

If I leave the date filter in, I get a grok parse failure, which I don't understand, because the lines parse fine when the date filter is removed.

Sorry, the actual problem is that some lines in the logs do not match the grok filter. The date filter is correct.
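For anyone hitting the same thing: one way to find the offending lines is to print only the events that grok could not parse, using a standard Logstash output conditional on the default failure tag:

```
output {
    # Print only events grok failed to parse, so the raw
    # unmatched lines are easy to spot in Logstash's output.
    if "_grokparsefailure" in [tags] {
        stdout { codec => rubydebug }
    }
}
```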

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.