Missing and disordered messages when using logstash elasticsearch input and file output

I'm trying to query Elasticsearch with the Logstash elasticsearch input plugin and save the query results to files. The messages in the files appear out of order, even though I've included a sort clause in the query string, and not all messages for the time window are present. I'm not sure whether this is expected or whether something is wrong with my configuration.
Elasticsearch/Logstash version: 2.4
My Logstash configuration is below:

input {
  elasticsearch {
    hosts => ["serverA", "serverB", "serverC"]
    query => '{ "query": { "match_all": {} }, "filter": { "range": { "@timestamp": { "gte": "now-5m", "lt": "now" } } }, "sort": [ { "@timestamp": { "order": "asc" } } ] }'
    index => "logstash-*"
    scroll => "5m"
  }
}

filter {
  ruby {
    code => "
      tstamp_ticks = event['@timestamp'].to_i
      tstamp_gmt = Time.at(tstamp_ticks)
      tstamp_local = tstamp_gmt.getlocal
      event['time_str'] = tstamp_local.strftime('%Y-%m-%d-%H')
      event['min_index'] = tstamp_local.min / 5
    "
  }
}
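For anyone who wants to check the bucketing logic outside Logstash, here is a minimal standalone Ruby sketch of what that filter computes. The example timestamp is made up, and I pin the time to UTC so the result is deterministic (the config above uses `getlocal`, so the actual hour string depends on the server's timezone):

```ruby
require 'time'

# Standalone sketch of the ruby filter's bucketing logic.
# The timestamp below is a hypothetical example; in Logstash,
# event['@timestamp'].to_i yields epoch seconds the same way.
tstamp_ticks = Time.utc(2016, 10, 3, 14, 23, 7).to_i
tstamp_gmt   = Time.at(tstamp_ticks).utc  # pinned to UTC for a deterministic result

time_str  = tstamp_gmt.strftime('%Y-%m-%d-%H')  # hour-level bucket, e.g. "2016-10-03-14"
min_index = tstamp_gmt.min / 5                  # integer division: minute 23 -> bucket 4
```

So an event at 14:23 UTC lands in file `logstash-2016-10-03-14-4.txt`.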

output {
  file {
    codec => line { format => "%{@timestamp},%{message}" }
    path => "/tmp/data/logstash-%{time_str}-%{min_index}.txt"
  }
}

To add on: when I use an exact time frame in the range filter (using epoch time), the number of records returned is correct.
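For reference, the exact-time variant of the query looks like the sketch below. The epoch-millisecond values are illustrative placeholders, not the actual values I used:

```
query => '{ "query": { "match_all": {} }, "filter": { "range": { "@timestamp": { "gte": 1475496000000, "lt": 1475496300000 } } }, "sort": [ { "@timestamp": { "order": "asc" } } ] }'
```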

I've realised that if a record doesn't contain one of the fields referenced in the file output's format string, that record isn't written to the file at all. I still have no clue why the order is incorrect; does anyone have any ideas?
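As a possible workaround for the dropped records (a sketch, not tested in my pipeline): give events that lack a `message` field a placeholder value before the output stage, so the line codec's format string can always be rendered.

```
filter {
  # hypothetical guard: default the message field so the file output
  # never skips an event for a missing field
  if ![message] {
    mutate { add_field => { "message" => "-" } }
  }
}
```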

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.