Epoch time in milliseconds with decimal

Elasticsearch doesn't seem to recognize epoch time in milliseconds with a decimal (e.g. 1413680667.549). When importing the JSON logs with Logstash, the dynamic field created is "double". I've set up an index template forcing this field to "date", but I get invalid-format warnings.

:response=>{"create"=>{"_index"=>"aaa", "_type"=>"json", "_id"=>"AVUsTuWp2AOatQ9Jkzgr", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [meta.timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"1413680667.549\" is malformed at \"680667.549\""}}}}, :level=>:warn}

Has anyone run into this before?

Epoch is in milliseconds, AFAIK? Oh no, it's seconds, sorry!
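A quick sanity check (an illustrative sketch, not from the original posts) confirms the value is epoch seconds with a fractional millisecond part, not epoch milliseconds:

```python
from datetime import datetime, timezone

ts = 1413680667.549  # the rejected value from the log above

# Interpreted as epoch *seconds*, this is a plausible 2014 timestamp:
as_seconds = datetime.fromtimestamp(ts, tz=timezone.utc)

# Interpreted as epoch *milliseconds*, it would land in January 1970,
# so the integer part must be seconds and the fraction milliseconds.
as_millis = datetime.fromtimestamp(ts / 1000.0, tz=timezone.utc)

print(as_seconds.isoformat())  # 2014-10-19T01:04:27.549000+00:00
print(as_millis.year)          # 1970
```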

Date fields are internally represented as long values (milliseconds since the epoch). Elasticsearch currently does not support decimal values for millisecond dates.
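In the meantime, one workaround (an editor's sketch, not from this thread) is to convert the fractional epoch seconds to integer epoch milliseconds before indexing, e.g. in a preprocessing step or a Logstash `ruby` filter, so the value matches what a `date` field stores internally:

```python
def to_epoch_millis(ts_seconds: float) -> int:
    """Convert fractional epoch seconds to integer epoch milliseconds."""
    # round() guards against float error such as 1413680667548.9999...
    return int(round(ts_seconds * 1000))

print(to_epoch_millis(1413680667.549))  # 1413680667549
```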

It's coming in v5 though, thanks to bigint.