Errors using Elasticsearch as an input

Hopefully someone will be able to help with this as it is driving me mad at the moment!

I have successfully loaded 2.8m records into an Elasticsearch cluster using SQL Server as the source. I am now trying to copy the data from one index to a new index. If I try to use the elasticsearch input I get a generic error of:

Error: [400] {"error":{"root_cause":[{"type":"action_request_validation_exception","reason":"Validation Failed: 1: scrollId is missing;"}],"type":"action_request_validation_exception","reason":"Validation Failed: 1: scrollId is missing;"}

I have tried various settings, but I have been unable to get an input working for Logstash using Elasticsearch as the input, even with a match_all query. The input I am using at the moment is:

input {
  elasticsearch {
    hosts => "localhost"
    query => '{
      "fields": [
        "screendataid",
        "accountid",
        "feedtypeid",
        "sourceid",
        "externalfeedid",
        "url",
        "title",
        "description",
        "screenhtml",
        "screentext",
        "articledate",
        "createddate",
        "rowversion",
        "translatorlanguagecodeid",
        "tweetdataid",
        "articleimageid",
        "urlhash",
        "externalfeedidhash",
        "expirydate",
        "removaldate",
        "displaydomain"
      ],
      "query": {
        "match": {
          "_index": "screendata"
        }
      },
      "filter": {
        "term": {
          "accountid": "3"
        }
      }
    }'
  }
}

Am I just missing something?
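For what it's worth, I did rule out a plain syntax problem: the whole string passed to `query` has to parse as a single JSON object, and mine does (field list shortened here for readability):

```python
import json

# Abbreviated version of the body passed to the elasticsearch input's `query` option.
query = '''{
  "fields": ["screendataid", "accountid", "url", "title"],
  "query": { "match": { "_index": "screendata" } },
  "filter": { "term": { "accountid": "3" } }
}'''

body = json.loads(query)  # raises ValueError if the string is not valid JSON
print(sorted(body))       # -> ['fields', 'filter', 'query']
```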

Hi Andy,

this happens because your index does not exist. In that case Elasticsearch does not return a scroll ID. Just create the index and run it again; it should work then.
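You can confirm this quickly with curl (assuming Elasticsearch is listening on localhost:9200; substitute your own index name for `screendata`):

```shell
# returns 200 if the index exists, 404 if it does not
curl -s -o /dev/null -w "%{http_code}\n" "http://localhost:9200/screendata"

# if it is missing, create it before starting Logstash
curl -XPUT "http://localhost:9200/screendata"
```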

b

Hi,

I am working with Logstash and Elasticsearch, both on version 2.4.

I also encountered the same error as above and fixed it the same way: the index did not yet exist for my other config below, which pulls data using Elasticsearch as the input.

input {
  # Read all documents from Elasticsearch matching the given query
  elasticsearch {
    hosts => "localhost"
    index => "proxy_new_hash_gz*"
    query => '{ "fields": [ "Fld1", "Fld2", "Fld3", "Fld4" ], "query": { "match_all" : {} } }'
  }
}

output {
  file {
    path => "/logs/Hashlogs-%{+YYYY-MM-dd-H}.gz"
    gzip => true
  }
}

However, I now get the error below in the Logstash logs:

{:timestamp=>"2016-11-01T18:28:29.237000+0530", :message=>"A plugin had an unrecoverable error. Will restart this plugin.\n Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost"], index=>"proxy_new_hash_gz*", query=>"{ \"fields\": [ \"Fld1\", \"Fld2\", \"Fld3\", \"Fld4\" ], \"query\": { \"match_all\" : {} } }", codec=><LogStash::Codecs::JSON charset=>"UTF-8">, scan=>true, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>\n Error: undefined method `[]' for nil:NilClass", :level=>:error}

Could anyone suggest how to fix this please?

Thanks