You have a syntax error with the json filter. You can test your config using --config.test_and_exit
sudo /usr/share/logstash/bin/logstash -f /tmp/logstash.conf --config.test_and_exit
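For reference, a minimal sketch of valid `json` filter syntax — the filter body needs braces and a `source` option; the field names here are assumptions, so adjust `source`/`target` to your data:

```conf
filter {
  json {
    # source is required: the field containing the JSON string to parse
    source => "message"
    # target is optional: where to put the parsed result
    # (omit it to merge the parsed fields into the event root)
    target => "parsed"
  }
}
```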
Thread.exclusive is deprecated, use Thread::Mutex
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2019-07-10 14:16:51.546 [main] writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[INFO ] 2019-07-10 14:16:51.572 [main] writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[WARN ] 2019-07-10 14:16:52.173 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[FATAL] 2019-07-10 14:16:52.695 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of #, => at line 9, column 6 (byte 157) after input {
  file {
    path => "/home/Json_Data/4365d831-461e-4e34-a531-0388404dc06b.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
filter {
  json
[ERROR] 2019-07-10 14:16:52.709 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
Looks like you're combining message and target. If the source file is 100% JSON, use a codec on your file input. plugins-inputs-file-codec
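For example (a sketch, reusing the path from the config above):

```conf
input {
  file {
    path => "/home/Json_Data/4365d831-461e-4e34-a531-0388404dc06b.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # decode each line as JSON at read time; no json filter needed
    codec => "json"
  }
}
```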
Logstash seems to be launching, but I can't see my data loaded in Kibana (I'm one day old in ELK, so I don't get everything yet):
Thread.exclusive is deprecated, use Thread::Mutex
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-07-10 14:29:11.892 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-07-10 14:29:11.910 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.2.0"}
[INFO ] 2019-07-10 14:29:20.399 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[WARN ] 2019-07-10 14:29:20.631 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://localhost:9200/"}
[INFO ] 2019-07-10 14:29:20.837 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2019-07-10 14:29:20.841 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[INFO ] 2019-07-10 14:29:20.873 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[INFO ] 2019-07-10 14:29:20.946 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[WARN ] 2019-07-10 14:29:21.014 [[main]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[INFO ] 2019-07-10 14:29:21.018 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x3652dfb6 run>"}
[INFO ] 2019-07-10 14:29:21.085 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ] 2019-07-10 14:29:21.448 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2019-07-10 14:29:21.545 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2019-07-10 14:29:21.552 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2019-07-10 14:29:21.899 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
When posting configurations or logs please block quote them, which makes them much easier to read. In the edit pane, select the text of the log (or configuration) and click on </> in the toolbar above the edit pane. You should see the formatting change in the preview pane on the right.
Block quoted text appears like this
[INFO ] 2019-07-10 14:29:20.946 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[WARN ] 2019-07-10 14:29:21.014 [[main]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
Your configuration includes a stdout output, but your logs do not include any events. That suggests the file input is not finding any files. If you are sure that the path option to the file input is correct then run logstash with '--log.level trace' on the command line and see what filewatch has to say.
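Assuming the same file and paths as earlier in the thread, that would look like:

```
sudo /usr/share/logstash/bin/logstash -f /tmp/logstash.conf --log.level trace
```

Then watch the output for filewatch messages about discovering, opening, and reading the file.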
OK, so it thinks it is reading the file. Are you not seeing documents in elasticsearch? I cannot explain that. If you are using Kibana be sure to use the time picker to choose a time range that includes the logs you are ingesting.
BTW, with a json codec there is no field called message, so the json filter is usually a no-op and can be removed. It is only useful if your JSON contains a field called message that has JSON nested in it.
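To illustrate, the json filter only earns its keep on an event like this (a hypothetical example, not from your data):

```conf
# Event as produced by a plain (non-json) input:
#   message => "{\"user\":\"bob\",\"status\":200}"
filter {
  json {
    source => "message"   # parse the JSON string stored in message
  }
}
# The parsed fields (user, status) are merged into the event root.
```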