Convert string to date

(Yoav Ben Moha) #1

Hi,
I am using ELK 7.
I have a CSV file with 3 columns.
I have successfully converted the first two columns from string to integer, but failed to convert
the 3rd column from string to date.
In Kibana the 3rd field is still a string.

The CSV file looks as follows:
CALLING_NUMBER,CALLED_NUMBER,ING_SIGNAL_START_TIME
61280037234,61262488358,2019-04-18+05:58:10.772
38580723144,61892967871,2019-04-18+05:58:08.966
18324621472,23598050505,2019-04-18+05:57:22.294

I have tried to convert the 3rd column as follows:

filter {
  csv {
    separator => ","
    columns => ["CALLING_NUMBER","CALLED_NUMBER","ING_SIGNAL_START_TIME"]
  }

  date {
    match => ["ING_SIGNAL_START_TIME", "ISO8601"]
  }

  mutate { convert => ["CALLING_NUMBER","integer"] }
  mutate { convert => ["CALLED_NUMBER","integer"] }
}

I have also tried the following option:

date {
  match => [ "ING_SIGNAL_START_TIME", "YYYY-MM-dd HH:mm:ss.SSS" ]
  target => "ING_SIGNAL_START_TIME"
}

None of the options above succeeded in converting the string to a date.

Please advise
Thanks

(Charlie) #2

There you go:

filter {
  csv {
    separator => ","
    columns => ["CALLING_NUMBER","CALLED_NUMBER","ING_SIGNAL_START_TIME"]
  }

  # 2019-04-18+05:58:10.772
  date {
    match => ["ING_SIGNAL_START_TIME", "yyyy-MM-dd'+'HH:mm:ss.SSS", "MMM dd HH:mm:ss", "ISO8601" ]
  }

  mutate { convert => ["CALLING_NUMBER","integer"] }
  mutate { convert => ["CALLED_NUMBER","integer"] }
}

Test data:
18324621472,23598050505,2019-04-18+05:57:22.294
{
           "CALLING_NUMBER" => 18324621472,
                 "@version" => "1",
               "@timestamp" => 2019-04-18T03:57:22.294Z,
                     "host" => "debugging_Date",
            "CALLED_NUMBER" => 23598050505,
    "ING_SIGNAL_START_TIME" => "2019-04-18+05:57:22.294",
                  "message" => "18324621472,23598050505,2019-04-18+05:57:22.294"
}

#3

Your field does not have a space between the date and time; it has a +, so your pattern has to match that:

match => [ "ING_SIGNAL_START_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
(Yoav Ben Moha) #4

Hi,
I have followed your suggestion, but it didn't help.
I have inserted the data into a new index, but the field still wasn't converted to a date.

input {
  file {
    path => "/home/logstash/cdr_files/tst1.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["CALLING_NUMBER","CALLED_NUMBER","ING_SIGNAL_START_TIME"]
  }

  date {
    match => ["ING_SIGNAL_START_TIME", "yyyy-MM-dd'+'HH:mm:ss.SSS", "MMM dd HH:mm:ss", "ISO8601" ]
  }

  mutate { convert => ["CALLING_NUMBER","integer"] }
  mutate { convert => ["CALLED_NUMBER","integer"] }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "tst100"
    document_type => "test100_logs"
  }
  stdout {}
}
#5

On stdout, do you see ING_SIGNAL_START_TIME as a string, surrounded by quotes, or as a timestamp, like 2019-05-15T14:04:53.879Z? Does the event have a _dateparsefailure tag?
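
For reference, a minimal way to check is a stdout output with the rubydebug codec (just a debugging sketch, separate from your elasticsearch output):

output {
  # rubydebug prints each event as a Ruby-style hash, so field types are visible
  stdout { codec => rubydebug }
}

A successfully parsed date prints without quotes (2019-05-15T14:04:53.879Z), while a plain string prints quoted.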

(Yoav Ben Moha) #6

Hi,
As you can see below, ING_SIGNAL_START_TIME appears as a string, surrounded by quotes.

{
                 "@version" => "1",
                  "message" => "423115,140407,2019-04-18+05:58:25.209\r",
            "CALLED_NUMBER" => 140407,
                     "path" => "/home/logstash/cdr_files/tst1.csv",
                     "host" => "elk7-lab",
           "CALLING_NUMBER" => 423115,
    "ING_SIGNAL_START_TIME" => "2019-04-18+05:58:25.209",
               "@timestamp" => 2019-04-18T02:58:25.209Z
} 

Thanks

(Yoav Ben Moha) #7

I have solved the problem as follows:

date {
    match => [ "ING_SIGNAL_START_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdatetime" ]
    remove_field => [ "ING_SIGNAL_START_TIME" ]
}

And stdout looks OK:
{
              "host" => "elk7-lab",
    "CALLING_NUMBER" => 32470,
           "message" => "32470,20087,2019-04-18+05:58:39.291\r",
        "@timestamp" => 2019-05-16T11:54:49.262Z,
              "path" => "/home/logstash/cdr_files/tst1.csv",
     "CALLED_NUMBER" => 20087,
          "@version" => "1",
       "newdatetime" => 2019-04-18T02:58:39.291Z
}

Then I added a few more date fields, but this time Logstash failed to run:
date {
    match => [ "ING_SIGNAL_START_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdate_1" ]
    remove_field => [ "ING_SIGNAL_START_TIME" ]

    match => [ "ING_ADDRESS_COMPLETE_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdate_2" ]
    remove_field => [ "ING_ADDRESS_COMPLETE_TIME" ]

    .....

    match => [ "EGR_RELEASE_COMPLETE_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdate_9" ]
    remove_field => [ "EGR_RELEASE_COMPLETE_TIME" ]
}

The error is:
[INFO ] 2019-05-16 15:02:43.870 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.0.1"}
[ERROR] 2019-05-16 15:02:51.331 [Converge PipelineAction::Create] date - Invalid setting for date filter plugin:

filter {
date {
# This setting must be a string
# Expected string, got ["newdate_1", "newdate_2", "newdate_3", "newdate_4", "newdate_5", "newdate_6", "newdate_7", "newdate_8", "newdate_9"]
target => ["newdate_1", "newdate_2", "newdate_3", "newdate_4", "newdate_5", "newdate_6", "newdate_7", "newdate_8", "newdate_9"]
...
}
}
[ERROR] 2019-05-16 15:02:51.336 [Converge PipelineAction::Create] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:86:in `config_init'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:126:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-date-3.1.9/lib/logstash/filters/date.rb:158:in `initialize'", "org/logstash/plugins/PluginFactoryExt.java:78:in `filter_delegator'", "org/logstash/plugins/PluginFactoryExt.java:248:in `plugin'", "org/logstash/execution/JavaBasePipelineExt.java:50:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:23:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
[INFO ] 2019-05-16 15:02:51.692 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2019-05-16 15:02:56.630 [LogStash::Runner] runner - Logstash shut down.

Please advise

#8

Right. You did not set target, so the date filter parsed the value into @timestamp.

In your other case, you cannot have multiple match+target pairs in the same date filter. They should be separate filters.
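
If you want the parsed value somewhere other than @timestamp, set target explicitly. A minimal sketch (overwriting the original field in place; target takes a single string, not an array):

date {
  match  => [ "ING_SIGNAL_START_TIME", "yyyy-MM-dd'+'HH:mm:ss.SSS" ]
  # target names the field that receives the parsed date;
  # without it the date filter writes to @timestamp
  target => "ING_SIGNAL_START_TIME"
}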

(Yoav Ben Moha) #9

In your other case, you cannot have multiple match+target pairs in the same date filter. They
should be separate filters.

I am sorry, but I didn't understand what I should do if I have several date columns.
Can you please share a simple test case showing how to handle such a scenario?

Thanks!!!!

#10
date {
    match => [ "ING_SIGNAL_START_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdate_1" ]
    remove_field => [ "ING_SIGNAL_START_TIME" ]
}
date {
    match => [ "ING_ADDRESS_COMPLETE_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdate_2" ]
    remove_field => [ "ING_ADDRESS_COMPLETE_TIME" ]
}
    .....
(Yoav Ben Moha) #11

I have followed the recommended steps.

date {
    match => [ "ING_SIGNAL_START_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdate_1" ]
    remove_field => [ "ING_SIGNAL_START_TIME" ]
}
date {
    match => [ "ING_ADDRESS_COMPLETE_TIME", "YYYY-MM-dd'+'HH:mm:ss.SSS" ]
    target => [ "newdate_2" ]
    remove_field => [ "ING_ADDRESS_COMPLETE_TIME" ]
}

None of the columns marked with remove_field => ... were removed, and all the date fields still appear as strings.
In the stdout I don't see the "newdate_xx" fields.

Example

....
            "CALLED_NUMBER" => 4478,
    "EGR_CALL_RELEASE_TIME" => "04/18/2019 05:58:38",
     "ING_CALL_ANSWER_TIME" => "04/18/2019 05:58:36",
"ING_RELEASE_COMPLETE_TIME" => "04/18/2019 05:58:38",
....
                 "@version" => "1",
}
{
    "tags" => [
        [0] "_dateparsefailure"
    ],

Any advice?

Thanks

#12

That does not match "YYYY-MM-dd'+'HH:mm:ss.SSS". Do your different filters have different formats? Did you change the format of the data being input?
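
If those fields really are in that MM/dd/yyyy format, each of their date filters would need a matching pattern. A sketch for one of them (the target field name here is just illustrative):

date {
  # matches values like "04/18/2019 05:58:36"
  match        => [ "ING_CALL_ANSWER_TIME", "MM/dd/yyyy HH:mm:ss" ]
  target       => "ing_call_answer_date"
  remove_field => [ "ING_CALL_ANSWER_TIME" ]
}

Also note that remove_field is only applied when the match succeeds, which is why the fields were not removed on the events tagged _dateparsefailure.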