RemoteTransportException/SearchContextMissingException

Hello,

I'm currently using Logstash (2.1.1) to extract data from Elasticsearch (1.4.4) to a CSV.

However, I keep running into errors like this one:

{:timestamp=>"2018-10-02T12:37:24.368000+0200", :message=>"A plugin had an unrecoverable error. Will restart this plugin.\n Plugin: <LogStash::Inputs::Elasticsearch hosts=>[\"host\"], index=>\"index\", codec=><LogStash::Codecs::JSON charset=>\"UTF-8\">, query=>\"{\\\"query\\\": { \\\"match_all\\\": {} } }\", scan=>true, size=>1000, scroll=>\"1m\", docinfo=>false, docinfo_target=>\"@metadata\", docinfo_fields=>[\"_index\", \"_type\", \"_id\"], ssl=>false>\n Error: [404] {\"_scroll_id\":\"c2NhbjswOzE7dG90YWxfaGl0czozMDgyMzc1Ow==\",\"took\":3,\"timed_out\":false,\"_shards\":{\"total\":5,\"successful\":0,\"failed\":5,\"failures\":[{\"status\":404,\"reason\":\"SearchContextMissingException[No search context found for id [244806435]]\"},{\"status\":404,\"reason\":\"RemoteTransportException[[machine][inet[ip:port]][indices:data/read/search[phase/scan/scroll]]]; nested: SearchContextMissingException[No search context found for id [380979448]]; \"},{\"status\":404,\"reason\":\"RemoteTransportException[[machine][inet[/ip:port]][indices:data/read/search[phase/scan/scroll]]]; nested: SearchContextMissingException[No search context found for id [380979446]]; \"},{\"status\":404,\"reason\":\"SearchContextMissingException[No search context found for id [244806436]]\"},{\"status\":404,\"reason\":\"RemoteTransportException[[machine][inet[/ip:port]][indices:data/read/search[phase/scan/scroll]]]; nested: SearchContextMissingException[No search context found for id [380979447]]; \"}]},\"hits\":{\"total\":3082375,\"max_score\":0.0,\"hits\":[]}}", :level=>:error}

I think that whenever this happens the plugin restarts and the scroll starts over from the beginning, so I end up with duplicated data in the CSV. The SearchContextMissingException seems to mean that the scroll's search context expired on the server before the next scroll request arrived.

My conf file is the following:

input {
  elasticsearch {
    hosts => "${VUG_OIDX1_HOST}"
    index => "index"
  }
}

filter {
  if [field1] == "2001-01-01T00:00:00" {
    drop { }
  }
  if [field2] {
    ruby {
      code => "event['field2'] = event['field2'].to_i + 2000"
    }
  }
}

output {
  csv {
    path   => "${VUG_OIDX1_FILE}"
    fields => [multiple fields]
  }
}
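From the plugin dump in the error message, the input is effectively running with scan => true, size => 1000 and scroll => "1m". I'm not setting any of those, so they look like the plugin defaults. If I read that right, each page of 1000 documents has to make it through the filter and the csv output within one minute, or the server drops the search context and the next scroll request gets the 404 above. A sketch of what the input is effectively doing (host and index are placeholders for my environment):

input {
  elasticsearch {
    hosts  => "${VUG_OIDX1_HOST}"
    index  => "index"
    scan   => true    # scan/scroll search, per the plugin dump in the error
    size   => 1000    # documents fetched per scroll page
    scroll => "1m"    # server-side keep-alive for the search context
  }
}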

Do you have any idea how I can prevent these exceptions?

Apparently I've solved this issue by raising the scroll keep-alive to 30 minutes:

scroll => "30m"
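For anyone hitting the same problem, this is a sketch of the input block I'm running now (the 30m value is just what worked for me; host and index are placeholders):

input {
  elasticsearch {
    hosts  => "${VUG_OIDX1_HOST}"
    index  => "index"
    scroll => "30m"   # each scroll request renews the context for another 30 minutes
  }
}

As far as I understand, the keep-alive only has to cover the processing of a single page, not the whole export, since every scroll request renews the context.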
