Logstash Date Type

I am using Logstash to parse my logs stored on S3 (gzipped).
My log has entries like:
172.31.0.14 - - [15/May/2016:06:49:02 +0000] "GET /ottsale/youthstars.html?color=112&manufacturer=116&ram=384&utm_campaign=mi_2704&utm_medium=post&utm_source=facebook HTTP/1.1" 200 38513 "-" "Mozilla/5.0 (Windows NT 5.1; rv:6.0.2) Gecko/20100101 Firefox/6.0.2"

I am using this grok pattern:
%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:log_timestamp}\] (?:"%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}" %{NUMBER:response:int} %{NUMBER:bytes} "%{USERNAME}" %{GREEDYDATA:responseMessage})
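If it helps, the same pattern can also be applied inline in a grok filter instead of via a patterns file; this is only a sketch using the field names from the pattern above:

```
filter {
  grok {
    match => {
      "message" => '%{IP:client} %{USERNAME} %{USERNAME} \[%{HTTPDATE:log_timestamp}\] "%{WORD:request} %{URIPATHPARAM:path} HTTP/%{NUMBER:version}" %{NUMBER:response:int} %{NUMBER:bytes} "%{USERNAME}" %{GREEDYDATA:responseMessage}'
    }
  }
}
```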

When I visualize my log in Kibana, the log_timestamp field shows up as a string. I would like to use it as the timestamp. Please help.

I have also used the following filter in my Logstash conf file, but it's not helping:
filter {
    if [type] == "s3" {
        grok {
            match => { "message" => "%{NGINXACCESS}" }
            patterns_dir => ["/opt/logstash/pattterns"]
        }

        date {
            match => ["log_timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
        }
    }
}

Unless told otherwise, the date filter stores the parsed timestamp in the @timestamp field. Does that field contain the correct data? If yes, just use that field and remove log_timestamp after you've parsed the date.
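A minimal sketch of that approach (assuming the date format used elsewhere in this thread):

```
date {
  match => ["log_timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
  # remove_field only runs when the date parse succeeds
  remove_field => ["log_timestamp"]
}
```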

You want me to remove the date filter from the conf here?

No. Keep the date filter. Make sure it successfully parses log_timestamp into @timestamp. Delete the log_timestamp field.

I'm a little confused. It is creating @timestamp with the current time. I want my log time to be of timestamp type; it is currently showing as a string.

Is this what you are talking about?
date {
    match => ["@timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
}

It is creating @timestamp with the current time. I want my log time to be of timestamp type.

If @timestamp contains the current time rather than the time parsed from the log_timestamp field the date filter isn't working. Your event will have a _dateparsefailure tag and the Logstash log will contain details about the failure.

It is currently showing as a string.

How do you reach that conclusion?

Is this what you are talking about?

That's the date filter, yes.

I can see in Kibana that my log_timestamp field is a string, while the built-in timestamp that Logstash pushes is the only timestamp field. Can I have log_timestamp as a timestamp?
Snapshot from Kibana.

This is what I am getting. I want log_timestamp to be a datetime rather than a string.

Can I have log_timestamp as a timestamp?

Yes, but I suggest you use the standard @timestamp field instead. As a beginner, stick to the defaults.

But that will not work, because I want to plot log_timestamp vs. response, not @timestamp vs. response.
That will give a clearer picture.

One more time: fix your date filter. When your date filter works as intended, the timestamp in @timestamp will be the same as log_timestamp, and you won't need log_timestamp anymore.

Here is my filter:

filter {
    if [type] == "s3" {
        grok {
            match => { "message" => "%{NGINXACCESS}" }
            patterns_dir => ["/opt/logstash/pattterns"]
        }

        date {
            match => ["@timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
        }
    }
}

match => ["@timestamp" ,"dd/MMM/yyyy:HH:mm:ss Z"]

Not @timestamp. You want log_timestamp here. The field listed here is the name of the field you want to parse.

That is what I did initially, and it is not working.

If the date filter fails it will add a _dateparsefailure tag to the event and the Logstash log will contain details about the failure.
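One way to make such failures visible is to route tagged events to stdout, e.g.:

```
output {
  if "_dateparsefailure" in [tags] {
    # Print events whose timestamp could not be parsed
    stdout { codec => rubydebug }
  }
}
```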

It is not failing.

Then it's either working as expected (which doesn't seem to be the case) or your date filter isn't actually being used. Starting Logstash with --log.level debug and --config.debug may give additional clues about what's going on.

Running in debug mode, I am still not getting an error, and data is being dumped into Elasticsearch. I don't know what I am doing wrong; I dropped and recreated the index and am still getting the same result.

Event from debug:

output received {:event=>{"message"=>"172.31.26.87 - - [08/May/2016:09:05:51 +0000] "GET /accessories/type/speakers-docks.html?dir=asc&limit=20&manufacturer=116&mode=list&order=name HTTP/1.1" 200 32450 "-" "Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"\n", "@version"=>"1", "@timestamp"=>"2016-05-08T09:05:51.000Z", "type"=>"s3", "source"=>"gzfiles", "client"=>"172.31.26.87", "log_timestamp"=>"08/May/2016:09:05:51 +0000", "request"=>"GET", "path"=>"/accessories/type/speakers-docks.html?dir=asc&limit=20&manufacturer=116&mode=list&order=name", "version"=>1, "response"=>200, "bytes"=>"32450", "responseMessage"=>""Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)"\n"}, :level=>:debug, :file=>"(eval)", :line=>"73", :method=>"output_func"}

Okay, so the date filter is successful after all. log_timestamp has successfully been parsed into @timestamp:

08/May/2016:09:05:51 +0000 <=> 2016-05-08T09:05:51.000Z