Logstash stops logging after a few minutes / hours

We have been facing crashes since we set up a new ELK 2.0 stack.
We see errors like this:

{:timestamp=>"2015-11-19T18:59:59.000000+0100", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"https://elastic1.domain:443/\", \"https://elastic2.domain:443/\", \"https://elastic3.domain:443/\", \"https://elastic4.domain:443/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :client_config=>{:hosts=>["https://elastic1.domain:443/", "https://elastic2.domain:443/", "https://elastic3.domain:443/", "https://elastic4.domain:443/"], :ssl=>{:verify=>false}, :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil, :ssl=>{:verify=>false}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore, :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false, :reload_on_failure=>false, :randomize_hosts=>false}, :error_message=>"[400] {\"error\":{\"root_cause\":[{\"type\":\"illegal_argument_exception\",\"reason\":\"Malformed action/metadata line [613], expected a simple value for field [_type] but found [START_ARRAY]\"}],\"type\":\"illegal_argument_exception\",\"reason\":\"Malformed action/metadata line [613], expected a simple value for field [_type] but found [START_ARRAY]\"},\"status\":400}", :error_class=>"Elasticsearch::Transport::Transport::Errors::BadRequest", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.14/lib/elasticsearch/transport/transport/base.rb:136:in `__raise_transport_error'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.14/lib/elasticsearch/transport/transport/base.rb:228:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.14/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.14/lib/elasticsearch/transport/client.rb:119:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/actions/bulk.rb:87:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:56:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:353:in `submit'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:350:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:382:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:193:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:159:in `buffer_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:343:in `receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/outputs/base.rb:80:in `handle'", "(eval):403:in `output_func'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb:252:in `outputworker'", 
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb:169:in `start_outputs'"], :level=>:error}

The source of the problem seems to be:
Malformed action/metadata line [613], expected a simple value for field [_type] but found [START_ARRAY]
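For illustration only (the index and type values below are hypothetical, not taken from your logs): in the bulk API, each document is preceded by an action/metadata line, and Elasticsearch expects _type there to be a simple string. A line like the first one below would produce exactly that error, while the second is what Elasticsearch expects:

{"index":{"_index":"logstash-2015.11.19","_type":["apache","access_log"]}}

{"index":{"_index":"logstash-2015.11.19","_type":"apache"}}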

It looks like you're trying to index documents whose type field is an array instead of a string. The elasticsearch output uses that field as each document's _type in the bulk request, and Elasticsearch rejects the whole bulk request when it is not a simple value, which matches the 400 error above.
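A common cause is that type gets set in more than one place (for example once on the input and again in a filter via add_field), which turns the field into an array. If you can't track down where that happens, one workaround is to collapse the field back to a single string before the elasticsearch output. This is a minimal sketch using a ruby filter with the Logstash 2.x event API, assuming the offending field is called type:

filter {
  ruby {
    # If [type] has become an array, keep only the first value so the
    # elasticsearch output can use it as a simple _type string.
    code => "
      t = event['type']
      event['type'] = t.first if t.is_a?(Array)
    "
  }
}

The cleaner fix is still to audit every input and filter that sets type (or document_type on the elasticsearch output) and make sure it is set exactly once.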