How to convert a date string into datetime format

Hello Folks,

I have searched through all the similar posts on elastic.co and googled around, but did not find an answer. I have rows of log data that look like the following:

my-ldap-server;952282;28/Jul/2019:06:30;172.18.200.170;28/Jul/2019:06:30;dn="cn=directory manager"
my-ldap-server;952283;28/Jul/2019:06:30;172.18.200.170;28/Jul/2019:06:30;dn="cn=directory manager"
my-ldap-server;952697;28/Jul/2019:13:13;10.208.48.22;28/Jul/2019:13:13;dn="uid=ldapbind1,ou=groups,o=example.com"
my-ldap-server;952697;28/Jul/2019:13:13;10.208.48.22;28/Jul/2019:13:13;dn="uid=portal-r,ou=people,o=example.com"

Below is my current Logstash pipeline, where I'm trying to convert connect_string and bind_string into datetime format:

input {
  file {
    path =>  [ "/data/vlsj-opendj-testcorpm1_ldap_connections-29072019" ]
    start_position => beginning
    sincedb_path => "/data/metadata"
    max_open_files => 64000
    type => "test-ldaplog"
  }
}
filter {
  if [type] == "test-ldaplog" {
    grok {
      match => { "message" => "%{HOSTNAME:hostname};%{INT:number};%{MONTHDAY:cd}/%{MONTH:cm}/%{YEAR:cy}:%{HOUR:chr}:%{MINUTE:cmm};%{IPV4:ip_address};%{MONTHDAY:bd}/%{MONTH:bm}/%{YEAR:by}:%{HOUR:bhr}:%{MINUTE:bmm};%{GREEDYDATA:data}" }
    }
    mutate {
      add_field => {
        "connect_string" => "%{cd}/%{cm}/%{cy} %{chr}:%{cmm}"
        "bind_string" => "%{bd}/%{bm}/%{by} %{bhr}:%{bmm}"
      }
      remove_field => ["cd", "bd", "cm", "bm", "cy", "by", "chr", "bhr", "cmm", "bmm"]
    }
    date {
      match => ["connect_string", "dd/MMM/YYYY hh:mm"]
      target => "connect_string"
    }
    date {
      match => ["bind_string", "dd/MMM/YYYY hh:mm"]
      target => "bind_string"
    }
  }
}
output {
  if [type] == "test-ldaplog" {
    elasticsearch {
      hosts => ["sj-elastic01:9200"]
      manage_template => false
      index => "ldap-log-%{+YYYY.MM.dd}"
    }
  }
}
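(For reference, each timestamp could also be captured in one piece, which avoids reassembling the fields with mutate. A sketch against the sample lines above; the pattern has only been checked against those four lines:

grok {
  match => { "message" => "%{HOSTNAME:hostname};%{INT:number};%{DATA:connect_string};%{IPV4:ip_address};%{DATA:bind_string};%{GREEDYDATA:data}" }
}
date {
  match => ["connect_string", "dd/MMM/yyyy:HH:mm"]
  target => "connect_string"
}
date {
  match => ["bind_string", "dd/MMM/yyyy:HH:mm"]
  target => "bind_string"
}

Here DATA matches everything up to the next ";", so the raw 28/Jul/2019:06:30 value lands in one field and a single date pattern can parse it.)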

Please suggest.

What is the question? Those filters appear to work just fine...

   "bind_string" => 2019-07-28T06:30:00.000Z,
        "number" => "952282",
    "ip_address" => "172.18.200.170",
"connect_string" => 2019-07-28T06:30:00.000Z,
      "hostname" => "my-ldap-server",
    "@timestamp" => 2019-07-30T16:25:28.022Z,

Thanks @Badger for the reply, as always. I want bind_string and connect_string to be of type date, but I still see them as strings, and I don't know why.

Also, when plotting them in Kibana I still see them as e.g. 29/Jul/2019 22:00. Does anything need to be changed in the Logstash filter?

There are no quotes around the value of bind_string, so in logstash it is a LogStash::Timestamp. Normally that would get mapped to a date type in elasticsearch.

Is it possible you ingested some documents where bind_string was a string? If it gets mapped as a string then everything will get converted to a string.

Is it possible you have a mapping template that forces field names that end in string to be strings?

Sorry, I didn't fully understand your question. However, the data is exactly as I posted above. Would you be able to provide some hints, or an example config showing the Logstash correction you are referring to?

Apologies for the inconvenience.

I don't think you need to change anything in logstash. I think the issue is on the elasticsearch side.

What does the mapping of the index look like?

Can you change the index name to something else so that you are writing to a newly created index and see if you have the same problem?
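For example, in Kibana Dev Tools (index names as used in this thread; adjust as needed):

GET ldap-log-*/_mapping
GET _template

The first request shows how bind_string and connect_string were actually mapped; the second shows whether any index template is forcing a mapping on those fields.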

@Badger Sure, let me try that. Good thought.

After changing the index name, I now see bind_string and connect_string with type date. However, the value is now displayed as July 29th 2019, 12:40:00.000.

However, in the Logstash logs I now see the warning below:

[2019-07-30T10:15:57,284][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test-ldap", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x37e0604b>], :response=>{"index"=>{"_index"=>"test-ldap", "_type"=>"doc", "_id"=>"0u7hQ2wBDKIipSvWpecr", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [bind_string] of type [date] in document with id '0u7hQ2wBDKIipSvWpecr'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"29/Jul/2019 20:46\" is malformed at \"/Jul/2019 20:46\""}}}}}

Looks like a date-parsing format issue.
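One likely culprit is the hour pattern: in the Joda-style patterns used by the date filter, hh is the 12-hour clock (1-12), while HH is the 24-hour clock (0-23). With dd/MMM/YYYY hh:mm, a timestamp like 06:30 parses, but 20:46 (the value in the error message above) fails with _dateparsefailure, so the field stays a plain string that Elasticsearch then cannot index into a date-mapped field. Changing both date filters to dd/MMM/yyyy HH:mm should fix this (yyyy rather than YYYY, since YYYY is the week-based year). The distinction can be sketched with Python's strptime equivalents (%I is roughly Joda's hh, %H is roughly HH):

```python
from datetime import datetime

s = "29/Jul/2019 20:46"  # the value from the error message above

# 12-hour pattern (the analogue of Joda's "hh"): hour 20 is out of range
try:
    datetime.strptime(s, "%d/%b/%Y %I:%M")
    parsed_with_12h = True
except ValueError:
    parsed_with_12h = False

# 24-hour pattern (the analogue of Joda's "HH"): parses fine
parsed = datetime.strptime(s, "%d/%b/%Y %H:%M")

print(parsed_with_12h)  # False
print(parsed.hour)      # 20
```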

The index looks like this:

green open test-ldap lDLxpMzVSJm2jbeln3Srzw 5 1 2322 0 2.4mb 1.2mb

I cannot explain that.

No problem, thanks for all the help. Maybe someone from the Elastic team can take a look when they get time.

Can anyone help me understand this, please?