Stuck at Successfully started Logstash API endpoint {:port=>9600} (can’t load data in Kibana/ElasticSearch)

Hello, I'm new to ELK and I'm having trouble loading my data.

Here is my input file (it's just for testing; it won't stay this empty):

input {
  file {
    codec => json
    path => "/home/Json_Data/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout {}
}
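While debugging, it can help to make the stdout output verbose so you can see in the terminal whether Logstash is emitting events at all. A minimal sketch using the standard rubydebug codec (swap it in for the plain stdout output above):

```conf
output {
  # Print every event in a readable key/value form, so you can
  # confirm in the terminal that events are flowing before
  # worrying about Elasticsearch or Kibana.
  stdout { codec => rubydebug }
}
```

If nothing is printed here, the problem is on the input/filter side, not in Kibana.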

Running Logstash gives the following output in the terminal:

sudo /usr/share/logstash/bin/logstash -f lucas.conf
Thread.exclusive is deprecated, use Thread::Mutex
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/ Using default config which logs errors to the console
[WARN ] 2019-07-10 15:49:58.620 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-07-10 15:49:58.638 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.2.0"}
[INFO ] 2019-07-10 15:50:07.015 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[WARN ] 2019-07-10 15:50:07.252 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://localhost:9200/"}
[INFO ] 2019-07-10 15:50:07.457 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2019-07-10 15:50:07.460 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[INFO ] 2019-07-10 15:50:07.491 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[INFO ] 2019-07-10 15:50:07.562 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[WARN ] 2019-07-10 15:50:07.616 [[main]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[INFO ] 2019-07-10 15:50:07.620 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x717123f3 run>"}
[INFO ] 2019-07-10 15:50:07.684 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ] 2019-07-10 15:50:08.053 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2019-07-10 15:50:08.146 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2019-07-10 15:50:08.146 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2019-07-10 15:50:08.522 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

I'm sure that it is reading the files; I checked with --log.level trace.

But nothing shows up when I try to "check for new data" in Kibana.
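One way to narrow this down is to ask Elasticsearch directly whether the index was created, bypassing Kibana entirely. A sketch, assuming Elasticsearch is reachable on localhost:9200 as in the config above:

```shell
# List any indices matching the pattern from the output section
curl -s 'localhost:9200/_cat/indices/test-*?v'

# Fetch one document to confirm data actually landed
curl -s 'localhost:9200/test-*/_search?size=1&pretty'
```

If the first command returns nothing, the events never reached Elasticsearch and the problem is on the Logstash side; if the index exists but Kibana shows nothing, the issue is likely the Kibana index pattern.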

I'm working on a VM

I'll be very thankful for any help you can give me, since I'm on an internship and can't really start my work until I figure out how to make this work :grimacing:

Thanks in advance

Hi LL22,

By reading the Logstash output you can solve most of these issues yourself.

First, you are running Logstash directly from the installation directory, so it failed to find your logstash.yml.

—> Start Logstash as a service:

sudo systemctl start logstash

(or, on older init systems: sudo service logstash start)

Or specify the logstash.yml config location with --path.settings
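For example, when running from the command line (assuming the settings directory of a package install is /etc/logstash; adjust the path if yours differs):

```shell
# --path.settings points at the directory containing logstash.yml
sudo /usr/share/logstash/bin/logstash -f lucas.conf --path.settings /etc/logstash
```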

Second, check whether you have enabled a module in Logstash. The module concept in Logstash is in beta and has largely moved to Beats. When a module is enabled, your other pipeline configurations are ignored, therefore your .conf files won't be considered by Logstash.

—> If you enabled a module, the easy way to fix it is to uninstall Logstash and install it from scratch.

Could you share your logstash.yml config from /etc/logstash/logstash.yml?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.