Elasticsearch Unreachable error

Hi all!
I spent the whole of last night and the night before trying to solve this problem, but I couldn't find a solution. Importantly, the whole environment is configured in a Docker container. I want to send some raw data in .csv format from Logstash to Elasticsearch. When I try, after 20-30 minutes I get this warning:
elasticsearch - Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::Elasticsearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketException] Connection refused (Connection refused)"}
I can connect to Elasticsearch. What's strange is that some data does get uploaded: e.g. out of 1,000,000 rows, maybe half are loaded, and then suddenly this warning appears. Any ideas what's wrong?
I would be very grateful for your support, because at this point I'm completely stuck.

Below is my data_w.conf:

input {
  file {
    path => "/etc/logstash/conf.d/data_w.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ['timestamp', '1', '2', '3', '4', '5', '6', '7', '8', '9', '10', '11', '12', '13', '14', '15', '16', '17', '18', '19', '20', '21', '22', '23', '24', '25', '26', '27', '28', '29', '30', '31', '32', '33', '34', '35', '36', '37', '38', '39', '40', '41', '42', '43', '44', '45', '46', '47', '48', '49', '50', '51', '52', '53', '54', '55', '56', '57', '58', '59', '60', '61', '62', '63', '64', '65', '66', '67', '68', '69', '70', '71', '72', '73', '74', '75', '76', '77', '78', '79', '80', '81', '82', '83', '84', '85', '86', '87', '88', '89', '90', '91', '92', '93', '94', '95', '96', '97', '98', '99', '100', '101', '102', '103', '104', '105', '106', '107', '108', '109', '110', '111', '112', '113', '114', '115', '116', '117', '118', '119', '120', '121', '122', '123', '124', '125', '126', '127', '128', '129', '130', '131', '132', '133', '134', '135', '136', '137', '138', '139', '140', '141', '142', '143', '144', '145', '146', '147', '148', '149', '150', '151', '152', '153', '154', '155', '156', '157', '158', '159', '160', '161', '162', '163', '164', '165', '166', '167', '168', '169', '170', '171', '172', '173', '174', '175', '176', '177', '178', '179', '180', '181', '182', '183', '184', '185', '186', '187', '188', '189', '190', '191', '192', '193', '194', '195', '196', '197', '198', '199', '200', '201']
  }
}

output {
  stdout {}
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "data_w"
    document_type => "data"
  }
}

I tried changing the hosts field to ["id_container:9200"], but it didn't help. I can create an index and build visualizations from the web UI, so Elasticsearch, Kibana, and Logstash should all be configured correctly, I suppose.

Looking forward to your replies!

I solved this problem and now everything works perfectly! It turned out that my nginx was blocking uploads of files larger than 1 MB (its default configuration). I set client_max_body_size in the nginx.conf file -> https://stackoverflow.com/questions/26717013/how-to-edit-nginx-conf-to-increase-file-size-upload
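For anyone hitting the same thing, the change boils down to one directive in nginx.conf; a minimal sketch (the 100M value here is just an example, choose whatever fits your data):

http {
  # nginx's default client_max_body_size is 1m; requests with a larger
  # body are rejected with 413 Request Entity Too Large, which showed up
  # on my side as the connection errors above
  client_max_body_size 100M;
}

The directive can also go in a server or location block if you only want to raise the limit for a specific endpoint. Remember to reload nginx afterwards for the change to take effect.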
