Read timed out: "Attempted to send a bulk request to elasticsearch, but Elasticsearch appears to be unreachable or down!"

Hi, I've found thousands of forum posts on this topic, but not a single solution, so this is my desperate attempt to solve the problem by creating my own thread. Any help would be awesome; this is really difficult to set up.

I'm trying to import data from a PostgreSQL database into Kibana to create visualizations.

I successfully installed and configured Logstash on my database server, where my data is located. My Wazuh/Kibana server is on the same LAN, and everything is reachable using tools like curl and telnet.
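As a quick sanity check alongside curl and telnet, basic TCP reachability of the Elasticsearch port can be verified with a short script (a minimal sketch; the host and port 192.168.1.49:9200 are taken from the Logstash output below):

```python
import socket

def check_port(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 192.168.1.49:9200 is the Elasticsearch address used in the Logstash config
print("reachable" if check_port("192.168.1.49", 9200) else "unreachable")
```

Note that a successful TCP connect only proves the port is open; Elasticsearch can still reject bulk requests (e.g. authentication or TLS mismatches), so this only rules out basic network problems.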

Logstash starts and appears to be working, but later it complains that Elasticsearch is unreachable:

[2020-03-30T15:04:13,376][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.1.49:9200"]}
[2020-03-30T15:04:13,479][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-03-30T15:04:13,538][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-03-30T15:04:13,549][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>5, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>10, "pipeline.sources"=>["/etc/logstash/conf.d/db1.conf"], :thread=>"#<Thread:0x10b15a93 run>"}
[2020-03-30T15:04:13,641][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-03-30T15:04:15,084][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-03-30T15:04:15,216][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-03-30T15:04:15,772][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-30T15:04:17,726][INFO ][logstash.inputs.jdbc ][main] (0.089226s) SELECT * from packets limit 1000
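For reference, a pipeline matching these logs (the `/etc/logstash/conf.d/db1.conf` file, the `SELECT * from packets limit 1000` statement, and the `192.168.1.49:9200` output host) might look roughly like this minimal sketch; the driver path, database name, and credentials are placeholders, not my actual values:

```
# /etc/logstash/conf.d/db1.conf -- minimal sketch, placeholder values
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/postgresql.jar"        # path is an assumption
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"  # database name is a placeholder
    jdbc_user => "logstash"                                            # placeholder credentials
    jdbc_password => "changeme"
    statement => "SELECT * from packets limit 1000"
  }
}
output {
  elasticsearch {
    hosts => ["http://192.168.1.49:9200"]
  }
}
```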
