DATESTAMP_OTHER does not seem to parse properly

Hello!

I started playing with Logstash and Kibana and got stuck with my logfile. I have tried different approaches, but I just can't parse the timestamp properly; Elasticsearch keeps mapping that field as a string...

Can anybody help, please?

The example input is:

Tue May 17 00:13:43 UTC 2016 OrderType: ASK Current_Top_Bid: 453.02 minimum_to_ask: 459.616175 Current_StopLoss: 0 Aborting.

My configuration looks like:

input {
  file {
    path => "C:\logs\logs.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP_OTHER:logdate} OrderType: %{WORD:orderType} %{WORD:current_top}: %{BASE10NUM:top_value:float} (maximum_to_bid:|minimum_to_ask:) %{BASE10NUM:value_cash:float} Current_StopLoss: %{BASE10NUM:stoploss:float} %{WORD:action}" }
  }
  
  mutate {
    add_field => [ "cust_time", "%{YEAR} %{MONTH} %{MONTHDAY} %{TIME}"]
  }

  date {
    match => [ "cust_time","yyyy MMM dd HH:mm:ss" ]
    remove_field => [ "cust_time" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

You need to use the date filter on the logdate field, if that is what you want mapped. In your current config the mutate builds cust_time from %{YEAR}, %{MONTH}, %{MONTHDAY} and %{TIME}, but grok captured the whole timestamp into logdate and never created those individual fields, so the sprintf references don't resolve, the date filter fails, and logdate stays a plain string in Elasticsearch.
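
As a minimal sketch (assuming every line carries the timestamp in the same form as your sample, e.g. "Tue May 17 00:13:43 UTC 2016"), you could parse logdate directly and drop the mutate/cust_time step entirely:

filter {
  date {
    # Joda-style pattern matching e.g. "Tue May 17 00:13:43 UTC 2016"
    match => [ "logdate", "EEE MMM dd HH:mm:ss zzz yyyy" ]
    # the parsed value goes into @timestamp by default;
    # remove the raw string field once it has been parsed
    remove_field => [ "logdate" ]
  }
}

With that in place Kibana will see the event time as a proper date via @timestamp instead of a string field.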