Logstash Date Filter, UNIX date format parse error, date is malformed

I am trying to parse the Unix timestamp from Squid access logs with the date plugin, using the UNIX date format. The date is not converted, and there are errors in the Logstash log stating the date is malformed at the last digit of the timestamp string. I have verified that the timestamp is correct and that it converts properly with external web time converters. I have also tested the message against my Grok pattern in the Grok Debugger, and everything works as expected.

Any help on this would be greatly appreciated.
Thank you,
Rick

This is the error message:
{:timestamp=>"2016-10-21T12:33:07.780000-0700", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"1477078382", :exception=>"Invalid format: \"1477078382\" is malformed at \"2\"", :config_parsers=>"yyyy-MM-dd'T'HH:mm:ss.SSSSSS+HH:mm", :config_locale=>"default=en_US", :level=>:warn}

This is the message from the Squid Log:
1477078382.596 103 192.168.5.184 TCP_REFRESH_UNMODIFIED/304 350 GET http://data.cnn.com/jsonp/breaking_news/domestic.json? - HIER_DIRECT/23.59.189.98 application/javascript

This is my Filter config:
#208 Squid Proxy filter
filter {
  if [type] == "squid_log" {
    grok {
      match => { "message" => "%{POSINT:timestamp}\.%{POSINT:timestamp_ms}\s+%{NUMBER:response_time}\s+%{IPORHOST:user} %{WORD:result}/%{NUMBER:status_codes} %{NUMBER:transfer_size} %{WORD:request_method} (?=%{NOTSPACE:request_url})((?:(http://)%{IPORHOST:domain}(?::%{POSINT:port})?)(?:|%{URIPATH:url_path})(?:|%{URIPARAM:url_querystring})) (?:-|%{NOTSPACE:client_identity}) %{WORD:peering_code}/(?:-|%{NOTSPACE:peerhost}) (?:-|%{NOTSPACE:content_type})" }
    }
    if "_grokparsefailure" in [tags] {
      drop { }
    }
  }
  date {
    match => [ "timestamp", "UNIX" ]
  }
}
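
As an aside, the date filter's UNIX pattern also accepts a fractional epoch such as 1477078382.596, so a variant along these lines could keep the millisecond precision instead of discarding it. This is an untested sketch, and timestamp_full is just an illustrative field name, not something in my config:

filter {
  # Untested sketch: recombine the two grok captures into one field and
  # let the UNIX pattern parse the fractional epoch, so @timestamp keeps
  # the milliseconds. "timestamp_full" is an illustrative name.
  mutate {
    add_field => { "timestamp_full" => "%{timestamp}.%{timestamp_ms}" }
  }
  date {
    match => [ "timestamp_full", "UNIX" ]
    remove_field => [ "timestamp_full" ]
  }
}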

I tested the timestamp and, as shown below, it converts properly. I cannot recreate the malformed date error outside of Logstash.

Timestamp Converter - http://www.unixtimestamp.com/index.php

1477078382.596 is equivalent to:

10/21/2016 @ 7:33pm (UTC)
2016-10-21T19:33:02+00:00 in ISO 8601
Fri, 21 Oct 2016 19:33:02 +0000 in RFC 822, 1036, 1123, 2822
Friday, 21-Oct-16 19:33:02 UTC in RFC 2822
2016-10-21T19:33:02+00:00 in RFC 3339

Works fine for me:

$ cat test.config 
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => {
      "message" => "%{POSINT:timestamp}.%{POSINT:timestamp_ms}"
    }
  }
  date {
    match => [ "timestamp", "UNIX" ]
  }
}
$ echo '1477078382.596 103 192.168.5.184 TCP_REFRESH_UNMODIFIED/304 350 GET http://data.cnn.com/jsonp/breaking_news/domestic.json? - HIER_DIRECT/23.59.189.98 application/javascript' | logstash -f test.config
Settings: Default pipeline workers: 8
Pipeline main started
{
         "message" => "1477078382.596 103 192.168.5.184 TCP_REFRESH_UNMODIFIED/304 350 GET http://data.cnn.com/jsonp/breaking_news/domestic.json? - HIER_DIRECT/23.59.189.98 application/javascript",
        "@version" => "1",
      "@timestamp" => "2016-10-21T19:33:02.000Z",
            "host" => "bertie",
       "timestamp" => "1477078382",
    "timestamp_ms" => "596"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}

Magnus, thank you for validating this.

What I do not understand is why, if this works, the logstash.log file is filling up with date parsing errors and a "_dateparsefailure" tag is being added to the events. I posted the error message at the beginning of this thread. The interesting thing is that the error says the UNIX timestamp is malformed at its last character, yet I cannot find anything malformed in the date data being parsed.
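
One detail in that warning that I cannot square with my config: :config_parsers shows yyyy-MM-dd'T'HH:mm:ss.SSSSSS+HH:mm rather than UNIX, which suggests the failing match may be coming from a date filter in a different config file, since Logstash concatenates every file in its config directory into a single pipeline. As a sketch, assuming a second filter file exists somewhere in the config directory, the date filter could be scoped to the event type the same way the grok filter is:

filter {
  # Sketch: keep the date filter inside the type conditional so only
  # squid_log events are parsed as UNIX epochs here, and events from
  # other inputs never reach this filter.
  if [type] == "squid_log" {
    date {
      match => [ "timestamp", "UNIX" ]
    }
  }
}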

I doubt this matters, but I am using Filebeat as the shipper sending data to Logstash.

Thank you,
Rick