Convert String to timestamp example

Hi all,

I want to convert a string into a timestamp that can be used in addition to @timestamp. Please help me with how to do it.
I used the "date" filter as mentioned in a lot of posts but was unable to get it working.

My message:

message: 1488479436.576,2017-03-02 10:02:35.788,867
@version: 1
@timestamp: March 30th 2017, 23:48:14.050
path: /home/abi/TOTAL_FD_mp.csv
host: abi
type: csv
time/s: 1488479436.576
date and time: 2017-03-02, 10:02:35.788
open FDs: 867
Date: 1488479436.576
Open: 2017-03-02 10:02:35.788
High: 867
_id: AVsjHy-8wPycLm9WrZ7d
_type: csv
_index: tester
_score:

I want to query based on "2017-03-02 10:02:35.788" etc., which I assume means converting that string into a new timestamp field.

I tried this, but it doesn't help.

input {
  file {
    path => "/home/abi/TOTAL_FD_mp.csv"
    type => "csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["time/s","date and time","open FDs"]
  }

  mutate {
    convert => ["open FDs", "integer"]
  }

  date {
    match => ["date and time", "yyyy-MM-dd HH::mm:ss.SSS"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    action => "index"
    workers => 1
    index => "in7"
  }
  stdout { codec => rubydebug }
}

You have two ":" characters separating hour and minute: HH::mm.

Does changing from
yyyy-MM-dd HH::mm:ss.SSS to
yyyy-MM-dd HH:mm:ss.SSS solve the issue?
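For reference, with that single-colon pattern the date filter block would look like this (everything else in the config unchanged):

date {
  match => ["date and time", "yyyy-MM-dd HH:mm:ss.SSS"]
}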

Thanks Joao. That overcame the date conversion issues that I was seeing, but now the logs are totally messed up.

I just want a new timestamp with which I can visualize data.

If I remove my "date" filter, I get the logs I expect to see again. Please let me know what I am missing.

Can someone please help?

My log:

message:1488479436.576,2017-03-02 10:02:35.788,867

I want to create a new timestamp based on "2017-03-02 10:02:35.788" from my message. How can I do that? I have tried a lot of ways looking at different threads, but it doesn't help.

--- Config file ---

input {
  file {
    path => "/home/abi/TOTAL_FD_mp.csv"
    type => "csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["time/s","date and time","open FDs"]
  }

  mutate {
    convert => ["open FDs", "integer"]
  }

  date {
    match => ["message", "YYYY-MM-dd HH:mm:ss.SSS"]
    target => "customlog"
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    action => "index"
    workers => 1
    index => "soak10"
  }
  stdout { codec => rubydebug }
}

I see this:

{
"message" => "1488466595.398,2017-03-02 06:02:34.699,852",
"@version" => "1",
"@timestamp" => "2017-03-31T19:23:39.775Z",
"path" => "/home/abi/TOTAL_FD_mp.csv",
"host" => "abi",
"type" => "csv",
"time/s" => "1488466595.398",
"date and time" => "2017-03-02 06:02:34.699",
"open FDs" => 852,
"tags" => [
[0] "_dateparsefailure"
]
}

And in logs,

Failed parsing date from field {:field=>"message", :value=>"1488454654.538,2017-03-02 03:02:33.769,856", :exception=>"Invalid format: "1488454654.538,2017-03-02 03:02:33.769,856" is malformed at "4.538,2017-03-02 03:02:33.769,856"", :config_parsers=>"YYYY-MM-dd HH:mm:ss.SSS", :config_locale=>"default=en_US", :level=>:warn}
Failed parsing date from field {:field=>"message", :value=>"1488411330.648,2017-03-01 15:01:30.216,822", :exception=>"Invalid format: "1488411330.648,2017-03-01 15:01:30.216,822" is malformed at "0.648,2017-03-01 15:01:30.216,822"", :config_parsers=>"YYYY-MM-dd HH:mm:ss.SSS", :config_locale=>"default=en_US", :level=>:warn}

How do I get around this? Please help.

You first must use grok to break the message field into subfields, like field1, field2, or message_timestamp.

You can read https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#_grok_basics to learn how to do this.
Then you can apply the date filter to the new fields created by grok. The date filter must target a field that contains only the date string and nothing else.
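A rough sketch of what that could look like for a message such as "1488466595.398,2017-03-02 06:02:34.699,852" (the field names epoch and fds here are only illustrative; also note that your csv filter already splits out a "date and time" column, so pointing the date filter at that field would work too):

grok {
  match => { "message" => "%{NUMBER:epoch},%{TIMESTAMP_ISO8601:message_timestamp},%{NUMBER:fds}" }
}
date {
  match => ["message_timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
  target => "customlog"
}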

Thank you. I played around with the syntax and it works. Thanks for the help.

{
"message" => "1488466415.404,2017-03-02 06:02:34.702,854",
"@version" => "1",
"@timestamp" => "2017-03-31T20:32:57.008Z",
"path" => "/home/abi/TOTAL_FD_mp.csv",
"host" => "abi",
"type" => "csv",
"time/s" => "1488466415.404",
"date and time" => "2017-03-02 06:02:34.702",
"open FDs" => 854,
"customlog" => "2017-03-02T14:02:34.702Z"
}


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.