S3 Input - No Time Information Error

I'm hitting an issue when trying to import log files from an S3 bucket into Elasticsearch. To be clear, I do have credentials for the bucket: I can copy the files down and process them with a file input, but that workaround keeps us from simply pointing Logstash at the bucket and letting it run all day.

The error seems to be date/time related:

A plugin had an unrecoverable error. Will restart this plugin.
 Plugin: <LogStash::Inputs::S3 bucket=>"XXXXXXXXXX", region=>"us-east-1", aws_credentials_file=>"XXXXXXXXXX", prefix=>"XXXXXXXXXX", temporary_directory=>"/tmp", type=>"XXXXXXXXXX", sincedb_path=>"/dev/null", debug=>false, codec=><LogStash::Codecs::Plain charset=>"UTF-8">, use_ssl=>true, delete=>false, interval=>60>
  Error: no time information in "" {:level=>:error}
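For what it's worth, that message looks like the stock ArgumentError raised by Ruby's Time.parse when it is handed a string with nothing in it, which would explain why the quoted value in the log is empty. A quick irb-style check:

```ruby
require 'time'

# Time.parse on an empty string raises the exact error seen in the log above.
begin
  Time.parse("")
rescue ArgumentError => e
  puts e.message  # => no time information in ""
end
```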

The error seems to indicate that I am not specifying a time field, but as you can see from the config below, I am.

input {
    s3 {
        bucket => "XXXXXXXXXX"
        region => "us-east-1"
        aws_credentials_file => "XXXXXXXXXX"
        prefix => "XXXXXXXXXX"
        temporary_directory => "/tmp"
        type => "XXXXXXXXXX"
        sincedb_path => "/dev/null"
    }
}

filter {
        json {
            source => "message"
            add_field => [ "urlstring", "XXXXXXXXXX" ]
            add_field => [ "hardware_type", "XXXXXXXXXX" ]
            add_field => [ "duration", "XXXXXXXXXX" ]
            add_field => [ "time_occurred", "XXXXXXXXXX" ]
        }
        date {
            match => [ "time_occurred", "UNIX_MS" ]
            target => "time_occurred"
        }
}

output {
    stdout { codec => rubydebug }
}
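In case it matters, my understanding is that UNIX_MS expects the matched field to hold epoch milliseconds, i.e. in Ruby terms something like (the value here is an arbitrary illustration, not from my data):

```ruby
require 'time'

# UNIX_MS-style parsing: epoch milliseconds -> Time (illustrative value).
ms = 1700000000000
t = Time.at(ms / 1000.0).utc
puts t.iso8601(3)  # => 2023-11-14T22:13:20.000Z
```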

Any ideas on what I'm doing wrong?

I believe it's complaining about this:

It can't read a time from the sincedb. With sincedb_path => "/dev/null", the plugin gets back an empty string when it tries to load the last-run timestamp, and parsing "" as a time fails with exactly the error you're seeing.
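If that is the cause, one thing worth trying (hedged, since sincedb handling in the S3 input has changed across plugin versions) is pointing sincedb_path at a real writable file instead of /dev/null, e.g.:

```
        sincedb_path => "/var/lib/logstash/s3_sincedb"
```

The path here is just an example; any location the Logstash user can write to should do.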