Logstash and Elasticsearch on different machines

I have Logstash and Elasticsearch on different machines. When I run Logstash on the same machine it all works fine (with 'localhost' in hosts), but when I specify the IP address in the hosts section of the conf file it does not create the index. The output from Logstash is as follows:

Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option UseConcMarkSweepGC; support was removed in 14.0
Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option CMSInitiatingOccupancyFraction; support was removed in 14.0
Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option UseCMSInitiatingOccupancyOnly; support was removed in 14.0
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/C:/Project/logstash-7.7.0/logstash-core/lib/jars/jruby-complete-9.2.11.1.jar) to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to C:/Project/logstash-7.7.0/logs which is now configured via log4j2.properties
[2020-05-19T19:45:01,169][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-05-19T19:45:01,279][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.7.0"}
[2020-05-19T19:45:02,516][INFO ][org.reflections.Reflections] Reflections took 47 ms to scan 1 urls, producing 21 keys and 41 values
[2020-05-19T19:45:03,723][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.51.100:9200/]}}
[2020-05-19T19:45:03,911][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://192.168.51.100:9200/"}
[2020-05-19T19:45:03,974][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-05-19T19:45:03,974][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-05-19T19:45:04,052][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://192.168.51.100:9200"]}
[2020-05-19T19:45:04,117][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-05-19T19:45:04,132][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-05-19T19:45:04,132][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["C:/Project/Log/sample.conf"], :thread=>"#<Thread:0x3bdb6c5e run>"}
[2020-05-19T19:45:04,210][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-05-19T19:45:05,271][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Project/logstash-7.7.0/data/plugins/inputs/file/.sincedb_8d9566297ac4987e711aafe4a88b2724", :path=>["C:/Project/Log/sample.txt"]}
[2020-05-19T19:45:05,302][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-05-19T19:45:05,346][INFO ][filewatch.observingtail  ][main][253b58041f339951f57d5a400fe9cbebb44b789526885e5c4061ea24665dc057] START, creating Discoverer, Watch with file and sincedb collections
[2020-05-19T19:45:05,348][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-05-19T19:45:05,602][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
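For context, the pipeline config referenced in the log above (`C:/Project/Log/sample.conf`) is roughly this shape — the `start_position` setting and the commented index line are assumptions, not taken from the actual file:

```
input {
  file {
    # path matches the one shown in the sincedb log line above
    path => "C:/Project/Log/sample.txt"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    # remote Elasticsearch node, as seen in the pool-URL log line
    hosts => ["http://192.168.51.100:9200"]
    # index => "logstash-%{+YYYY.MM.dd}"  # plugin default if omitted
  }
}
```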

Sharing your config would be helpful.

Thanks... I solved the issue by putting `DisablePing="true"` in the NLog.config file.
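For anyone hitting the same thing: assuming the events are shipped via the NLog.Targets.ElasticSearch target, the option sits on the target element. The target name, URI, and rule below are illustrative, not from the original config:

```
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <add assembly="NLog.Targets.ElasticSearch" />
  </extensions>
  <targets>
    <!-- disablePing skips the initial ping to the node before sending -->
    <target name="elastic" xsi:type="ElasticSearch"
            uri="http://192.168.51.100:9200"
            disablePing="true" />
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="elastic" />
  </rules>
</nlog>
```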
