Elasticsearch to Elasticsearch index transfer

I am trying to move an index from my development cluster to our production cluster via Logstash. I am getting the response below in the logs, which looks like a web response. The development cluster has no security enabled, so I'm not really sure what is answering. Any advice would be appreciated.

input {
  elasticsearch {
    hosts => "135.89.18.199:9200"
    index => "apigw-example"
    query => '{ "query": { "match_all": { } } }'
  }
}

filter {
}

output {
  stdout { codec => dots }
  elasticsearch {
    hosts => "roacamu01.gcsc.att.com:9200"
    user => "elastic"
    password => "xx"
    index => "goss-example-incident"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
  }
}
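
Side note: the output references %{[@metadata][_type]} and %{[@metadata][_id]}, but the plugin dump in the log below shows the input running with docinfo=>false (the default), so those @metadata fields are never populated and the documents end up indexed under the literal text %{[@metadata][_id]}. A sketch of the input with docinfo enabled, everything else unchanged:

  input {
    elasticsearch {
      hosts => "135.89.18.199:9200"
      index => "apigw-example"
      query => '{ "query": { "match_all": { } } }'
      docinfo => true   # populates [@metadata][_index], [@metadata][_type], [@metadata][_id]
    }
  }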

[2018-05-15T08:52:41,550][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-05-15T08:52:44,425][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2018-05-15T08:54:00,276][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@roacamu01.gcsc.att.com:9200/]}}
[2018-05-15T08:54:00,284][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@roacamu01.gcsc.att.com:9200/, :path=>"/"}
[2018-05-15T08:54:00,694][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x326b5725>}
[2018-05-15T08:54:00,697][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-05-15T08:54:00,804][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-05-15T08:54:00,864][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>[#<Java::JavaNet::URI:0x45b05ea>]}
[2018-05-15T08:54:00,869][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2018-05-15T08:54:00,916][INFO ][logstash.pipeline ] Pipeline main started
[2018-05-15T08:54:01,058][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-05-15T08:54:01,105][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:9200"], index=>"apigw-example", query=>"{ \"query\": { \"match_all\": { } } }", id=>"a33a4d8243e09141da783c0bd6b0f87a777f45d9-1", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_c9b1a58f-2e22-4704-91aa-9f2497dc6d3e", enable_metric=>true, charset=>"UTF-8">, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>
Error: [407]

Access Denied
Access Denied (authentication_failed)

Your credentials could not be authenticated: "Credentials are missing.". You will not be permitted access until your credentials can be verified.
This is typically caused by an incorrect username and/or password, but could also be caused by network problems.
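
Worth noting: HTTP 407 is "Proxy Authentication Required", and an "Access Denied (authentication_failed)" page is the sort of thing a corporate web proxy serves, not Elasticsearch. A quick way to check who is actually answering, plus a hedged workaround if a proxy is being picked up from the environment (hostnames are just the ones from the config above):

  # Inspect the response headers; a proxy usually identifies itself
  # via "Via:" or "Proxy-Authenticate:" headers.
  curl -v http://135.89.18.199:9200/

  # If http_proxy/https_proxy are set in the shell that launches
  # Logstash, exempt the internal hosts before retrying:
  export no_proxy="135.89.18.199,roacamu01.gcsc.att.com,localhost"

Whether Logstash's HTTP client honors these variables depends on the version and plugin, so treat this as a diagnostic step rather than a guaranteed fix.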

The error message says it is talking to localhost:9200, which does not match the configuration you posted. Which is correct?

Actually both. I tried with localhost and then with the IP address; both behave the same. I just got my paste mixed up. Sorry.

I restarted with the conf and it ran for a bit. It uploaded around 260,000 of roughly 6 million documents. At that point the authorization errors started again.

[2018-05-15T13:23:09,830][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Elasticsearch hosts=>["135.89.18.199:9200"], index=>"apigw-example", query=>"{ \"query\": { \"match_all\": { } } }", size=>500, scroll=>"30s", id=>"98c338c98a064ea1057c8a48eac8166a04e36ba0-1", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_faaa54ff-fa52-49ec-8e9e-c9dd9565714e", enable_metric=>true, charset=>"UTF-8">, docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>
Error: [407]

Access Denied
Access Denied (authentication_failed)

Your credentials could not be authenticated: "Credentials are missing.". You will not be permitted access until your credentials can be verified.
This is typically caused by an incorrect username and/or password, but could also be caused by network problems.

I did modify the conf file with a size and scroll.

input {
  elasticsearch {
    hosts => "135.89.18.199:9200"
    index => "apigw-example"
    query => '{ "query": { "match_all": { } } }'
    size => 500
    scroll => "30s"
  }
}

filter {
}

output {
  stdout { codec => dots }
  elasticsearch {
    hosts => "roacamu01.gcsc.att.com:9200"
    user => "elastic"
    password => "xx"
    index => "goss-example-incident"
  }
}
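
One thing worth knowing about those two settings: scroll => "30s" is not a total time budget, it is how long Elasticsearch keeps the scroll context alive between fetches. If the pipeline stalls for longer than that between batches (for example, while the output is being challenged by something in the network path), the scroll expires and the input dies. A more forgiving sketch, with nothing else changed:

  input {
    elasticsearch {
      hosts => "135.89.18.199:9200"
      index => "apigw-example"
      query => '{ "query": { "match_all": { } } }'
      size => 500
      scroll => "5m"   # keep-alive per batch, renewed on every fetch, not an overall limit
    }
  }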

Shouldn't you add user and password to the input as well? It seems like it's going wrong there...

The input is off my development machine and it does not have security active. However, I believe I have found the problem. I am pulling the data across a subnet and I think the internal security is challenging me.
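
That would be consistent with the 407s: something between the two subnets is demanding proxy credentials. If it turns out to be a web proxy that the JVM picks up, one hedged way to exempt the two clusters is via a JVM flag (exact placement depends on your install; config/jvm.options or the LS_JAVA_OPTS environment variable both work for passing JVM options):

  # Tell the JVM not to route these hosts through any configured proxy:
  -Dhttp.nonProxyHosts=135.89.18.199|roacamu01.gcsc.att.com|localhost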

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.