Datetime conversion error in Logstash

Hi, I have the below configuration in my Logstash:

filter {
    xml {
        source      => "result"
        target      => "Incident"
        force_array => false
    }
    date {
        match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss" ]
        match => [ "closed_at", "yyyy-MM-dd HH:mm:ss" ]
        match => [ "opened_at", "yyyy-MM-dd HH:mm:ss" ]
        match => [ "resolved_at", "yyyy-MM-dd HH:mm:ss" ]
        match => [ "sys_updated_on", "yyyy-MM-dd HH:mm:ss" ]
    }
}

Log Error:

[2019-12-30T03:00:37,950][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
[2019-12-30T03:00:48,655][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.2.0"}
[2019-12-30T03:00:52,724][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:snow, :exception=>"Java::JavaLang::IllegalArgumentException", :message=>"Illegal pattern component: c", :backtrace=>["org.joda.time.format.DateTimeFormat.parsePatternTo(org/joda/time/format/DateTimeFormat.java:566)", "org.joda.time.format.DateTimeFormat.createFormatterForPattern(org/joda/time/format/DateTimeFormat.java:687)", "org.joda.time.format.DateTimeFormat.forPattern(org/joda/time/format/DateTimeFormat.java:177)", "org.logstash.filters.parser.JodaParser.<init>(org/logstash/filters/parser/JodaParser.java:58)", "org.logstash.filters.parser.TimestampParserFactory.makeParser(org/logstash/filters/parser/TimestampParserFactory.java:60)", "org.logstash.filters.parser.TimestampParserFactory.makeParser(org/logstash/filters/parser/TimestampParserFactory.java:69)", "org.logstash.filters.DateFilter.acceptFilterConfig(org/logstash/filters/DateFilter.java:66)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:485)", "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:340)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_filter_minus_date_minus_3_dot_1_dot_9.lib.logstash.filters.date.initialize(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-date-3.1.9/lib/logstash/filters/date.rb:185)", "org.jruby.RubyArray.collect(org/jruby/RubyArray.java:2563)", "org.jruby.RubyArray.map(org/jruby/RubyArray.java:2577)", "org.jruby.RubyArray$INVOKER$i$0$0$map19.call(org/jruby/RubyArray$INVOKER$i$0$0$map19.gen)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_filter_minus_date_minus_3_dot_1_dot_9.lib.logstash.filters.date.initialize(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-date-3.1.9/lib/logstash/filters/date.rb:184)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:894)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.RubyClass.finvoke(org/jruby/RubyClass.java:798)", "org.jruby.RubyBasicObject.callMethod(org/jruby/RubyBasicObject.java:363)", "org.logstash.plugins.PluginFactoryExt$Plugins.filter_delegator(org/logstash/plugins/PluginFactoryExt.java:81)", "org.logstash.plugins.PluginFactoryExt$Plugins.plugin(org/logstash/plugins/PluginFactoryExt.java:251)", "org.logstash.plugins.PluginFactoryExt$Plugins.buildFilter(org/logstash/plugins/PluginFactoryExt.java:160)", "org.logstash.config.ir.CompiledPipeline.setupFilters(org/logstash/config/ir/CompiledPipeline.java:133)", "org.logstash.config.ir.CompiledPipeline.<init>(org/logstash/config/ir/CompiledPipeline.java:81)", "org.logstash.execution.JavaBasePipelineExt.initialize(org/logstash/execution/JavaBasePipelineExt.java:50)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(org/logstash/execution/JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.initialize(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:24)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:915)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36)", 
"usr.share.logstash.logstash_minus_core.lib.logstash.agent.converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:295)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:274)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:270)", "java.lang.Thread.run(java/lang/Thread.java:748)"]}
[2019-12-30T03:00:53,295][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::IllegalArgumentException` for `PipelineAction::Create<snow>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:109:in `create'", "org/logstash/execution/ConvergeResultExt.java:37:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:338:in `block in converge_state'"]}

Time field in my xml:

<sys_created_on>2019-12-18 03:25:37</sys_created_on>

Kibana always recognizes my time fields as strings if I do not apply any filters.

Where am I going wrong?

Update:
The error cleared once I kept only one field in the date filter.

date {
    match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss" ]
}

Yet my data is still recognized as a string in Kibana.
Where am I going wrong?

When you have multiple instances of an option, the option is converted to an array. So

date {
    match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss" ]
    match => [ "closed_at", "yyyy-MM-dd HH:mm:ss" ]
    match => [ "opened_at", "yyyy-MM-dd HH:mm:ss" ]
    match => [ "resolved_at", "yyyy-MM-dd HH:mm:ss" ]
    match => [ "sys_updated_on", "yyyy-MM-dd HH:mm:ss" ]
}

is equivalent to

date {
    match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss", "closed_at", "yyyy-MM-dd HH:mm:ss", "opened_at", "yyyy-MM-dd HH:mm:ss", "resolved_at", "yyyy-MM-dd HH:mm:ss", "sys_updated_on", "yyyy-MM-dd HH:mm:ss" ]
}

A date filter takes the first entry in the array as a field name, and the rest of the entries as date patterns. But "closed_at" is not a valid date pattern; in fact, it starts with "c", which is exactly the "Illegal pattern component: c" in your error.

You need to use five separate date filters.
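For example, something along these lines (a sketch, one filter per field, with the same pattern throughout):

date { match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss" ] }
date { match => [ "closed_at",      "yyyy-MM-dd HH:mm:ss" ] }
date { match => [ "opened_at",      "yyyy-MM-dd HH:mm:ss" ] }
date { match => [ "resolved_at",    "yyyy-MM-dd HH:mm:ss" ] }
date { match => [ "sys_updated_on", "yyyy-MM-dd HH:mm:ss" ] }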

@Badger, thank you for your response.
I tried with only one match (sys_created_on), and it still isn't being recognized as a date.
In Kibana, when I import the index fresh, it shows as a string.

Also, I'm not able to fully understand the issue with closed_at. These are all columns in my incident table. Can you help me understand this better? How should I go about it?

By default, a date filter modifies the value of @timestamp. If you want it to convert sys_created_on to a timestamp then you must use the target option on the date filter.
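To illustrate the difference (a sketch):

# default: the parsed date overwrites @timestamp, and sys_created_on stays a string
date { match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss" ] }

# with target: the parsed date is written back to the field itself
date {
    match  => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss" ]
    target => "sys_created_on"
}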

Okay, so if I understand this right:
the parsed value of sys_created_on will get saved into @timestamp, unless I set the target to sys_created_on again.

Can I still use closed_at the same way? Why did the error single out closed_at? I still need to convert it. Will I get the same error if I use a separate date filter for each column with a target?
@Badger

date {
    match => [ "sys_created_on","yyyy-MM-dd HH:mm:ss", "closed_at","yyyy-MM-dd HH:mm:ss", "opened_at","yyyy-MM-dd HH:mm:ss", "resolved_at","yyyy-MM-dd HH:mm:ss", "sys_updated_on","yyyy-MM-dd HH:mm:ss"]
}

As I said, the date filter takes the value of [sys_created_on] and tries to match it against each of the rest of the array entries as a date pattern. During initialization it will build a parser for each of the patterns. It will succeed for "yyyy-MM-dd HH:mm:ss", but then it tries to interpret "closed_at" as a date pattern, not a field value, and it throws an exception because it is unable to do so.
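Annotated, the single filter above is interpreted like this (comments added for illustration):

date {
    match => [
        "sys_created_on",       # entry 1: the field whose value is parsed
        "yyyy-MM-dd HH:mm:ss",  # entry 2: a pattern; this parser builds fine
        "closed_at",            # entry 3: also treated as a pattern; "c" is an illegal Joda component
        "yyyy-MM-dd HH:mm:ss",  # never reached: initialization has already failed
        "opened_at",
        "yyyy-MM-dd HH:mm:ss",
        "resolved_at",
        "yyyy-MM-dd HH:mm:ss",
        "sys_updated_on",
        "yyyy-MM-dd HH:mm:ss"
    ]
}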

You need something like

date { match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss"] target => "sys_created_on" }
date { match => [ "closed_at", "yyyy-MM-dd HH:mm:ss"] target => "closed_at" }

etc.


Hi @Badger,
It still doesn't seem to work for me.
I tried the below:

date { match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss"] target => "sys_created_on" }
date { match => [ "sys_updated_on", "yyyy-MM-dd HH:mm:ss"] target => "sys_updated_on" }
date { match => [ "opened_at", "yyyy-MM-dd HH:mm:ss"] target => "opened_at" }
date { match => [ "closed_at", "yyyy-MM-dd HH:mm:ss"] target => "closed_at" }
date { match => [ "resolved_at", "yyyy-MM-dd HH:mm:ss"] target => "resolved_at" }

And I still see my columns as strings in Kibana.

Note: I also tried result.closed_at, etc.; still no result.

Update:
I tried giving the target a different name, like

date { match => [ "sys_created_on", "yyyy-MM-dd HH:mm:ss"] target => "created_on" }

and I see that this created_on column isn't even created in my Elasticsearch index.
Is there a problem with the date filter configuration?

Any suggestion/help is appreciated 🙂

Thanks!
Katara.

Looking at your xml filter, you have the target option set, so the sys_created_on field should not even exist at the top level; it would be [Incident][sys_created_on].
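If you were still using that xml filter, the date filter would need the nested field reference, something like:

date {
    match  => [ "[Incident][sys_created_on]", "yyyy-MM-dd HH:mm:ss" ]
    target => "[Incident][sys_created_on]"
}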

@Badger, I'm not using an xml filter anymore. The input is actually JSON, so I'm using

filter {
    json  { source => "result" }
    split { field => ["result"] }
    date  { ..... }
}

However, I can try [result][sys_created_on], based on how you are trying to map it.
Is that right?

If this does not create [created_on] and does not add a _dateparsefailure tag, that indicates that [sys_created_on] does not exist. I cannot see your data, so I do not know what fields exist.
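One way to see exactly what fields exist is to temporarily dump events to the console (a debugging sketch, assuming you can add an output section):

output {
    # prints each event's full structure, including nested fields and tags
    stdout { codec => rubydebug }
}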

@Badger,
I do get the _dateparsefailure error tag and I don't see the column getting created.
Here is my sample input:

{"result":[
{
"made_sla":"true",
"upon_reject":"Cancel all future Tasks",
"sys_created_on":"2019-12-23 05:00:00",
"number":"INC0010275",
"category":"Network"} ,
{
"made_sla":"true",
"upon_reject":"Cancel all future Tasks",
"sys_created_on":"2019-12-24 07:00:00",
"number":"INC0010567",
"category":"DB"}]}

My Elasticsearch index shows a result. prefix on all the columns arriving from the JSON result set.
I've included only a few columns here to keep it from getting messy. However, sys_updated_on, opened_at, closed_at, and resolved_at are the other date columns, just like sys_created_on in the input I've attached.

OK, so if you are parsing that JSON and then splitting the result, I would expect you to end up with [result][sys_created_on], which would indeed show up in Kibana as result.sys_created_on. Try using "[result][sys_created_on]" in the date filter.
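Applied to all five date fields, that would look something like this (a sketch, parsing each nested field in place):

date { match => [ "[result][sys_created_on]", "yyyy-MM-dd HH:mm:ss" ] target => "[result][sys_created_on]" }
date { match => [ "[result][sys_updated_on]", "yyyy-MM-dd HH:mm:ss" ] target => "[result][sys_updated_on]" }
date { match => [ "[result][opened_at]",      "yyyy-MM-dd HH:mm:ss" ] target => "[result][opened_at]" }
date { match => [ "[result][closed_at]",      "yyyy-MM-dd HH:mm:ss" ] target => "[result][closed_at]" }
date { match => [ "[result][resolved_at]",    "yyyy-MM-dd HH:mm:ss" ] target => "[result][resolved_at]" }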

@Badger ,

I tried

date {
    match  => [ "[result][sys_created_on]", "yyyy-MM-dd HH:mm:ss" ]
    target => "closed_at"
}

and it works perfectly.

Thanks much! 🙂
