Another timestamp parse question

I'm trying to parse a log with lines like this:

[03:03:2016:20:18:56.131400] 0.1 ServerName HostName !ERROR [24667-4107126528] DNS resolution error ...

I want to parse the first field into a Kibana timestamp but have not been able to.

I've tried various filters, including one like this:

filter {

  grok {
    match => { "message" => "\[%{GREEDYDATA:my_timestamp}\]" }
  }

  date {
    match => [ "my_timestamp", "MM:dd:yyyy:HH:mm:ss.SSSSSS" ]
  }
}

But Logstash throws errors like this:

Failed parsing date from field {:field=>"my_timestamp", :value=>"03:03:2016:20:18:56.131400] ServerName HostName  !ERROR [24667-4107126528] DNS resolution error ... SkyController whoman-vm1 !...\" is malformed at \"] 0.1 SkyController whoman-vm1 !...\"", :config_parsers=>"MM:dd:yyyy:HH:mm:ss.SSSSSS", :config_locale=>"default=en_US", :level=>:warn}
{
         "message" => "[03:03:2016:20:18:56.131400] ServerName HostName  !ERROR [24667-4107126528] DNS resolution error ...",
        "@version" => "1",
      "@timestamp" => "2016-07-27T04:18:27.654Z",
            "path" => "/data/logstash/input/tmp.log",
            "host" => "0.0.0.0",
    "my_timestamp" => "03:03:2016:20:18:56.131400] ServerName HostName  !ERROR [24667-4107126528] DNS resolution error ...",
            "tags" => [
        [0] "_dateparsefailure"
    ]
}

Any ideas?

Don't use GREEDYDATA here. As its name implies, it's greedy and will match as much as it can. At the very least use the non-greedy DATA pattern, but (?<my_timestamp>[^\]]+) is better (meaning: match everything up to the next ]).
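To see the difference, here's a quick sketch in plain Python regex, using .* as a stand-in for GREEDYDATA (which is what that grok pattern expands to):

```python
import re

line = ("[03:03:2016:20:18:56.131400] 0.1 ServerName HostName "
        "!ERROR [24667-4107126528] DNS resolution error")

# Greedy .* runs past the first ] and stops at the LAST one on the line,
# swallowing the rest of the message into the capture.
greedy = re.search(r"\[(.*)\]", line).group(1)

# A negated character class [^\]]+ cannot cross a ], so it stops at the
# first closing bracket and captures only the timestamp.
exact = re.search(r"\[([^\]]+)\]", line).group(1)

print(greedy)  # everything up to "...[24667-4107126528"
print(exact)   # 03:03:2016:20:18:56.131400
```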

Thanks, that helped. I eventually got the date parser working with this:

filter {

  grok {
    match => { "message" => "(?<my_timestamp>[^\s]+)" }
  }

  date {
    match => [ "my_timestamp", "[MM:dd:yyyy:HH:mm:ss.SSSSSS]" ]
    target => "@timestamp"
    timezone => "America/Los_Angeles"
  }
}

Since the log file has no timezone, I needed to specify one here to get the conversion to UTC right.
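For anyone wanting to sanity-check the conversion, here's a rough equivalent in Python (strptime/zoneinfo, not Logstash itself): a March 3 timestamp falls before US daylight saving starts, so America/Los_Angeles is UTC-8 and the UTC result rolls over to the next day.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

raw = "[03:03:2016:20:18:56.131400]"

# Parse the bracketed local timestamp, then attach the source timezone,
# mirroring what the date filter's timezone option does.
local = datetime.strptime(raw, "[%m:%d:%Y:%H:%M:%S.%f]").replace(
    tzinfo=ZoneInfo("America/Los_Angeles"))

# Convert to UTC, which is how @timestamp is stored.
utc = local.astimezone(ZoneInfo("UTC"))
print(utc.isoformat())  # 2016-03-04T04:18:56.131400+00:00
```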

Now, on to the next fields!