Docker S3 log error

I'm trying to analyze an access log for assets in an AWS S3 bucket. I have a Docker container with the ELK stack integrated, and my Logstash config file is this one:

input {
  file {
    path => "/home/*"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{S3_ACCESS_LOG}" }
  }

  date {
    locale => "en"
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}

I have multiple logs downloaded to my computer, which is why I use the path "/home/*". My log entries look like this one:

aac4092fe3b00a5f6f84224f4694f4c46c9f2d147d9d97ddbe7aed04dfd9e99d assets.quefilo.com [21/Jul/2016:22:28:16 +0000] 10.219.249.29 arn:aws:iam::038493190737:user/josue.murillo 9D5242A3E8ABC2C4 REST.GET.NOTIFICATION - "GET /?notification HTTP/1.1" 200 - 115 - 23 - "-" "aws-internal/3" -

and I'm using this command to run the Logstash config: /opt/logstash/bin/logstash -f /etc/logstash/conf.d/

but Logstash is not creating the Elasticsearch index and is not printing any results to the terminal. So my question is: why isn't the index being created? Thanks for the help!

If the files are older than 24 hours you need to adjust the file input's ignore_older option. Apart from that I'm pretty sure there are clues in the logs. You may have to increase the logging verbosity by starting Logstash with --verbose or even --debug.
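For example, a minimal sketch of the file input with the age check adjusted (assuming a Logstash 2.x file input, where ignore_older defaults to 86400 seconds, i.e. 24 hours; the sincedb_path line is an optional extra for repeated testing, not something your setup requires):

```
input {
  file {
    path => "/home/*"
    start_position => "beginning"
    # The default of 86400 (24 h) silently skips older files;
    # setting this to 0 makes Logstash pick them up regardless of age.
    ignore_older => 0
    # Optional while testing: forget read positions between runs,
    # so files are re-read from the start every time.
    sincedb_path => "/dev/null"
  }
}
```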

Thanks a lot @magnusbaeck! Using the ignore_older option, the logs are now being mapped. I've improved my configuration file to this one:

input {
  file {
    path => "/home/*"
    start_position => "beginning"
    ignore_older => 0
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOG5424SD:timestamp}.%{IP} arn:aws:iam%{IP}.:user%{URIPATHPARAM} .* %{JAVACLASS:method}" }
  }

  date {
    match => [ "timestamp", "[dd/MMM/YYYY:HH:mm:ss Z]" ]
    target => "@timestamp"
  }
}

output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}

but I can't extract @timestamp from the message. What am I doing wrong?

As discussed in S3 assets log, your grok filter isn't working.

What do you have so far that isn't working?

I can't get the timestamp from the log message. The logs are being mapped and I can see them in Kibana using the configuration mentioned above. You told me my grok filter wasn't working, so what would the proper grok expression be for the log line above, to extract the timestamp from the message?
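One option, sketched here but not verified against your exact log line, is to go back to the built-in S3_ACCESS_LOG pattern from your first config. That pattern ships with Logstash's core grok patterns and already captures the bracketed date into a timestamp field, which the date filter can then parse (grok strips the brackets, so the date pattern has none):

```
filter {
  grok {
    # S3_ACCESS_LOG covers the whole AWS S3 server access log line,
    # including [21/Jul/2016:22:28:16 +0000] as "timestamp".
    match => { "message" => "%{S3_ACCESS_LOG}" }
  }

  date {
    locale => "en"
    # Joda-Time format: lowercase yyyy for the calendar year,
    # matching the captured value without surrounding brackets.
    match  => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    target => "@timestamp"
  }
}
```

If grok still fails, check the rubydebug output for a _grokparsefailure tag to confirm whether the pattern matched at all.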