Upgrade 1.0.3 to 6.3.1

Hi,
I am at a customer with an older version of Elasticsearch (1.0.3), one of those versions without documentation. I cannot seem to figure out the right syntax for this version. Does anyone have an idea about the best way to migrate the content from the old version to a new 6.3.1 cluster? I don't mind intermediate steps, such as dumping to a file-based format, and if necessary I might be able to upgrade to 1.7 first. Does anyone have experience with that?

Thanks for thinking along with me.

I'd reindex.

Maybe with Logstash, using an elasticsearch input and an elasticsearch output.
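
Something along these lines, as a minimal sketch; the hostnames and index are placeholders, and it assumes the input plugin can scroll a cluster that old:

```conf
input {
  elasticsearch {
    hosts   => ["old-cluster:9200"]      # the 1.0.3 cluster
    index   => "logstash-2017.05.24"
    docinfo => true                      # keep _index/_type/_id in @metadata
  }
}
output {
  elasticsearch {
    hosts       => ["new-cluster:9200"]  # the 6.3.1 cluster
    index       => "%{[@metadata][_index]}"
    document_id => "%{[@metadata][_id]}"
  }
}
```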

Using Logstash is of course possible, but it seems I am running into the same issue as with Python: the scroll_id gives me trouble. I see this in the Logstash log, and it seems the scan starts at the beginning each time:

Error: [400] {"error":"ElasticsearchIllegalArgumentException[Failed to decode scrollId]; nested: IOException[Bad Base64 input character decimal 123 in array position 0]; ","status":400}
  Exception: Elasticsearch::Transport::Transport::Errors::BadRequest
  Stack: /Users/jettrocoenradie/Development/elastic/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/elasticsearch-transport-5.0.4/lib/elasticsearch/transport/transport/base.rb:202:in `__raise_transport_error'
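
A side note on that error: decimal 123 is the `{` character, which suggests the client is sending a JSON scroll body while the 1.x server expects the raw base64 scroll id. Below is a minimal sketch of driving the 1.x scan/scroll API by hand from Python; the base URL, index name, and page size are placeholder assumptions:

```python
import json
import urllib.request


def start_scan(base, index, scroll="1m", size=500):
    """Open a scan+scroll on a 1.x cluster and return the scroll id."""
    url = f"{base}/{index}/_search?search_type=scan&scroll={scroll}&size={size}"
    body = json.dumps({"query": {"match_all": {}}}).encode()
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["_scroll_id"]


def next_page(base, scroll_id, scroll="1m"):
    """Fetch one scroll page; 1.x wants the *raw* scroll id as the body."""
    url = f"{base}/_search/scroll?scroll={scroll}"
    req = urllib.request.Request(url, data=scroll_id.encode())
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# decimal 123 is "{": a JSON body where the server expects bare base64
assert chr(123) == "{"
```

Looping on `next_page` until `hits.hits` comes back empty (refreshing the scroll id from each response) would walk the whole index.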

Any other tips?

I'm surprised by elasticsearch-transport-5.0.4. Does that mean you are using the transport layer?

No, I am now using the plain elasticsearch input as well as the elasticsearch output plugin.

I tried it with the latest Logstash version, though I can imagine it has problems with the old Elasticsearch version. This is part of the error I get now:

  Pipeline_id:main
  Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:29200"], index=>"logstash-2017.05.24", id=>"d380546ebd5a12c31600fd24d91c287dddbf8ac05c4e6de4fa37804724e28fdd", enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_d22276ad-e391-4322-9f52-23d60d0d90c9", enable_metric=>true, charset=>"UTF-8">, query=>"{ \"sort\": [ \"_doc\" ] }", size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false>
  Error: [400] {"error":"ElasticsearchIllegalArgumentException[Failed to decode scrollId]; nested: IOException[Bad Base64 input character decimal 123 in array position 0]; ","status":400}
  Exception: Elasticsearch::Transport::Transport::Errors::BadRequest

Could be related to the problem in this issue:

So maybe I need to use an older Logstash version.

Now that I search using the message from the error, I see more questions reporting this.

The Logstash path seems hard with this version. However, it turns out I made a mistake in the URL when using Python, so it now works using Python.
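
For reference, the Python route boils down to scrolling the old cluster and feeding each page into the new cluster's `_bulk` endpoint. Here is a sketch of the bulk-payload half; the single `doc` type is an assumption, adjust it to your target mapping:

```python
import json


def bulk_lines(hits, target_index):
    """Turn scroll hits into an NDJSON _bulk payload for the new cluster.

    6.x allows only a single mapping type, so everything is indexed as
    "doc" here (an assumption; change it to match your mapping).
    """
    lines = []
    for hit in hits:
        lines.append(json.dumps(
            {"index": {"_index": target_index,
                       "_type": "doc",
                       "_id": hit["_id"]}}))
        lines.append(json.dumps(hit["_source"]))
    return "\n".join(lines) + "\n"
```

Each payload can then be POSTed to the 6.3.1 cluster's `/_bulk` endpoint with `Content-Type: application/x-ndjson`.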

Thanks for thinking along with me.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.