Date Filter - _dateparsefailure [SOLVED]

Hi,

I'm new to the date filter.

I set up this configuration:

filter {
        if "mail" in [type] and "maillog" in [source] {
                grok {
                    match => ["message","%{CISCOTIMESTAMP:mail_date} %{DATA:server_name} %{DATA:service}/%{DATA:process}\[%{DATA:pid}\]:",
                              "message","%{CISCOTIMESTAMP:mail_date} %{DATA:server_name} %{DATA:service}\[%{DATA:pid}\]:"]
                    break_on_match => true
                }

                date {
                    match => ["mail_date", "MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
                }
        }
}

To match this timestamp:

Nov 2 10:40:49

I see a _dateparsefailure in the tags field. I don't know why this tag appears or how to fix the problem.

It seems to work fine with that date pattern:

% echo "Nov 2 10:40:49" | bin/logstash -e 'input { stdin {} } filter { date { match => [ "message", "MMM d HH:mm:ss", "MMM dd HH:mm:ss"] } }'
{
    "@timestamp" => 2016-11-02T10:40:49.000Z,
      "@version" => "1",
          "host" => "Joaos-MBP-5.lan",
       "message" => "Nov 2 10:40:49"
}

Are those events also tagged with _grokparsefailure, or just _dateparsefailure?


Just the _dateparsefailure.

I agree, it seems to work; I just have these tags...

Do you think remove_tag would be a good option?

If that tag is applied, then the date filter is failing; this should be investigated. Could you be running another configuration file alongside this one?

Yes, I have other configuration files alongside this one.

It could happen that the events are also passing through the other configuration files, which may also contain date filters.
If so, you need to isolate the flows using tags/fields and conditionals, as in the sketch below.
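
For example, every input plugin accepts a tags option, so you can mark events at the source and guard each filter file with a conditional. A minimal sketch; the "mail_flow" tag name is made up here:

# in the input conf file: tag events at the source
input {
        beats {
                port => 5045
                tags => ["mail_flow"]
        }
}

# in the filter conf file: only events carrying that tag reach these filters
filter {
        if "mail_flow" in [tags] {
                date {
                        match => ["mail_date", "MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
                }
        }
}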

I don't have any other date filter at the moment.

And I already use conditionals to route the logs to the correct filter.

Can you post the full configuration and some logging at debug level?
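
To get debug-level logging (assuming Logstash 5.x, as in this thread), start Logstash with the log level raised; the config path here is just an example:

% bin/logstash -f /etc/logstash/conf.d --log.level=debug

A temporary stdout { codec => rubydebug } output, like in the stdin test above, also helps to see which tags each event ends up with.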

Here is my main conf file:

input {
    redis {
           host => "127.0.0.1"
           port => 6379
           data_type => "list"
           type => "redis-input"
           key => "logstash"
    }

    beats {
            port => 5045
            type => "filebeat-input"
            congestion_threshold => 99999999
    }
}

output {
        if "_grokparsefailure" not in [tags] {
                if "tango" in [tags] {
                        elasticsearch {
                                hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                                index => "logstash-tango-%{+YYYY.MM.dd}"
                        }
                }
                if "nginx" in [tags] {
                        elasticsearch {
                                hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                                index => "logstash-isg-%{+YYYY.MM.dd}"
                        }
                }
                if "scarlette" in [tags] {
                        elasticsearch {
                                hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                                index => "logstash-scarlette-%{+YYYY.MM.dd}"
                        }
                }
                if "serveur_owncloud" in [tags] {
                        elasticsearch {
                                hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                                index => "logstash-owncloud-%{+YYYY.MM.dd}"
                        }
                }
                if "brouette" in [tags] or "poussette" in [tags] {
                        elasticsearch {
                                hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                                index => "logstash-mta-%{+YYYY.MM.dd}"
                        }
                }
                if "serveur_proxy" in [tags] or "serveur_dns" in [tags] {
                        elasticsearch {
                                hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                                index => "logstash-proxydns-%{+YYYY.MM.dd}"
                        }
                }
        }
 }

Here is the conf file with the date filter:

filter {
        if "mail" in [type] and "maillog" in [source] {
                grok {
                        match => ["message","%{CISCOTIMESTAMP:mail_date} %{DATA:server_name} %{DATA:service}/%{DATA:process}\[%{DATA:pid}\]:",
                                  "message","%{CISCOTIMESTAMP:mail_date} %{DATA:server_name} %{DATA:service}\[%{DATA:pid}\]:"]
                        break_on_match => true
                }

                date {
                        match => ["mail_date", "MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
                        timezone => "Europe/Paris"
                }
        }
}

The logs that match this date filter also go through another filter. I can't show that filter here because it contains too many characters...

Try a different order of the values in the date match list, e.g.

["mail_date", "MMM dd HH:mm:ss", "MMM d HH:mm:ss"]

Do you get _dateparsefailure in this case?

Yes, I also get _dateparsefailure in this case.

Maybe the error is just because the date field doesn't match the pattern MMM dd HH:mm:ss.

But when the date is Nov 11 10:40:49 instead of Nov 2 10:40:49, will the date field still match the pattern MMM d HH:mm:ss?

Also, according to RFC 3164 (The BSD syslog Protocol), a single-digit day (1-9) should be padded with a space, e.g. Nov  2 10:10:10.

Your pattern should be MMM<space><space>d HH:mm:ss
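
Applied to the filter above, that gives (note the two spaces between MMM and d):

date {
        match => ["mail_date", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"]
        timezone => "Europe/Paris"
}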


This double-space issue fixed my problem, thanks!

@Clement_Ros
The date filter will try to match each pattern in turn.
Unfortunately it uses a slow mechanism to "see" the failure of an earlier pattern before trying the later ones.
This means you should put the likeliest pattern first: because more days of a month fall in the 10-31 range, you will get better performance checking MMM<space>dd<space>HH:mm:ss first, since most syslog messages are from days 10 through 28|29|30|31 of any month.
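
In match-list form, that ordering would be:

["mail_date", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss"]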

Thanks for the help, I understand now :grinning:

You are welcome.

I have a similar issue. Help is much appreciated - new here :grin:

Full config is:

input {
        stdin {}
}

filter {
        grok {
                match => { "message" => "%{TIMESTAMP_ISO8601:@timestamp}" }
        }

        date {
                match => { "@timestamp" => "yyyy MMM dd HH:mm:ss" }
        }
}

output {
        stdout { codec => rubydebug }

        elasticsearch {
                hosts => ["localhost:9200"]
                index => "test1-%{+YYYY.MM.dd}"
        }
}

And the input is:

2017 Jun 13 01:56:51

The logs are as below:

> [2017-06-22T05:31:52,846][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
> [2017-06-22T05:31:52,881][DEBUG][logstash.agent           ] Agent: Configuring metric collection
> [2017-06-22T05:31:52,883][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
> [2017-06-22T05:31:52,899][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
> [2017-06-22T05:31:52,930][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
> [2017-06-22T05:31:52,946][DEBUG][logstash.agent           ] Reading config file {:config_file=>"C:/elastic/logstash/bin/test1.conf"}
> [2017-06-22T05:31:53,084][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdin", :type=>"input", :class=>LogStash::Inputs::Stdin}
> [2017-06-22T05:31:53,115][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"line", :type=>"codec", :class=>LogStash::Codecs::Line}
> [2017-06-22T05:31:53,184][DEBUG][logstash.inputs.stdin    ] config LogStash::Inputs::Stdin/@add_field = {}
> [2017-06-22T05:31:53,221][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"grok", :type=>"filter", :class=>LogStash::Filters::Grok}
> [2017-06-22T05:31:53,230][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@match = {"message"=>"%{TIMESTAMP_ISO8601:@timestamp}"}
> [2017-06-22T05:31:53,236][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@id = "a37ca75f76b51ef1203771d08a1b35b3a67fe7b1-2"
> [2017-06-22T05:31:53,262][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@periodic_flush = false
> [2017-06-22T05:31:53,267][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@patterns_dir = []
> [2017-06-22T05:31:53,275][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@pattern_definitions = {}
> [2017-06-22T05:31:53,281][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
> [2017-06-22T05:31:53,302][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"]
> [2017-06-22T05:31:53,306][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@timeout_millis = 30000
> [2017-06-22T05:31:53,310][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
> [2017-06-22T05:31:53,314][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@overwrite = []
> [2017-06-22T05:31:53,324][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"date", :type=>"filter", :class=>LogStash::Filters::Date}
> [2017-06-22T05:31:53,340][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@match = {"@timestamp"=>"yyyy MMM dd HH:mm:ss"}
> [2017-06-22T05:31:53,340][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@id = "a37ca75f76b51ef1203771d08a1b35b3a67fe7b1-3"
> [2017-06-22T05:31:53,355][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@enable_metric = true
> [2017-06-22T05:31:53,355][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@add_tag = []
> [2017-06-22T05:31:53,355][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@remove_tag = []
> [2017-06-22T05:31:53,355][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@add_field = {}
> [2017-06-22T05:31:53,355][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@remove_field = []
> [2017-06-22T05:31:53,355][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@periodic_flush = false
> [2017-06-22T05:31:53,371][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@target = "@timestamp"
> [2017-06-22T05:31:53,371][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
> [2017-06-22T05:31:53,387][ERROR][logstash.agent           ] Cannot create pipeline {:reason=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :backtrace=>["C:/elastic/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.1.5/lib/logstash/filters/date.rb:162:in `initialize'", "C:/elastic/logstash/logstash-core/lib/logstash/filter_delegator.rb:21:in `initialize'", "C:/elastic/logstash/logstash-core/lib/logstash/pipeline.rb:96:in `plugin'", "(eval):37:in `initialize'", "org/jruby/RubyKernel.java:1079:in `eval'", "C:/elastic/logstash/logstash-core/lib/logstash/pipeline.rb:63:in `initialize'", "C:/elastic/logstash/logstash-core/lib/logstash/pipeline.rb:145:in `initialize'", "C:/elastic/logstash/logstash-core/lib/logstash/agent.rb:286:in `create_pipeline'", "C:/elastic/logstash/logstash-core/lib/logstash/agent.rb:95:in `register_pipeline'", "C:/elastic/logstash/logstash-core/lib/logstash/runner.rb:274:in `execute'", "C:/elastic/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "C:/elastic/logstash/logstash-core/lib/logstash/runner.rb:185:in `run'", "C:/elastic/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "C:\\elastic\\logstash\\lib\\bootstrap\\environment.rb:71:in `(root)'"]}
> [2017-06-22T05:31:53,424][DEBUG][logstash.agent           ] starting agent
> [2017-06-22T05:31:53,440][DEBUG][logstash.agent           ] Starting puma
> [2017-06-22T05:31:53,440][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Stopping
> [2017-06-22T05:31:53,440][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
> [2017-06-22T05:31:53,440][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Stopping
> [2017-06-22T05:31:53,440][DEBUG][logstash.api.service     ] [api-service] start
> [2017-06-22T05:31:53,456][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Stopping
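
The ERROR line above points at the likely cause. A sketch of a fix; the log_timestamp field name is made up here. The date filter's match option takes an array (the field to parse, then one or more formats), not a hash, and the hash form is what makes registration fail at date.rb:162. Also, %{TIMESTAMP_ISO8601} will not match 2017 Jun 13 01:56:51 (that is not ISO 8601), and grok should capture into a scratch field rather than into @timestamp, leaving the date filter to set @timestamp:

filter {
        grok {
                # capture the whole "2017 Jun 13 01:56:51" stamp into a scratch field
                match => { "message" => "(?<log_timestamp>%{YEAR} %{MONTH} %{MONTHDAY} %{TIME})" }
        }

        date {
                # match is an array: field name first, then one or more formats
                match => ["log_timestamp", "yyyy MMM dd HH:mm:ss"]
        }
}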