Logstash launching issue with input file

I'm trying to use Logstash on JSON files, but after running the following command:

sudo /usr/share/logstash/bin/logstash -f lucasFilter.conf

The command itself seems correct, but I get this error:

[ERROR] 2019-07-10 13:10:55.615 [Converge PipelineAction::Create] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 9, column 14 (byte 231) after input {\n file {\n path =>"/home/Json_Data/4365d831-461e-4e34-a531-0388404dc06b.json"\n start_position => "beginning"\n sincedb_path =>"/dev/null"\n }\n\nfilter {\n json ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2577:in map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:24:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in block in converge_state'"]}
[INFO ] 2019-07-10 13:10:56.037 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2019-07-10 13:11:01.011 [LogStash::Runner] runner - Logstash shut down.

My conf file looks like this (I only used one JSON file in order to test, with a simple filter at first):

input {
  file {
    path => "/home/Json_Data/4365d831-461e-4e34-a531-0388404dc06b.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }

filter {
  json {
    source => "message" target => "theJSON"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout {}
}

I've been looking around on forums and also on YouTube a lot, but I can't find what I did wrong :slight_smile:

Thanks in advance for any help you can give me :wink:

You are missing a } to close the input {} section.
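
For reference, the input section with its closing brace added would look something like this (a minimal sketch reusing the same path and options from the config above):

input {
  file {
    path => "/home/Json_Data/4365d831-461e-4e34-a531-0388404dc06b.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}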


I feel dumb, thank you.

You have a syntax error with the json filter. You can test your config using --config.test_and_exit

sudo /usr/share/logstash/bin/logstash -f /tmp/logstash.conf --config.test_and_exit
Thread.exclusive is deprecated, use Thread::Mutex
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2019-07-10 14:16:51.546 [main] writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[INFO ] 2019-07-10 14:16:51.572 [main] writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[WARN ] 2019-07-10 14:16:52.173 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[FATAL] 2019-07-10 14:16:52.695 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of #, => at line 9, column 6 (byte 157) after input {
file {
path =>"/home/Json_Data/4365d831-461e-4e34-a531-0388404dc06b.json"
start_position => "beginning"
sincedb_path =>"/dev/null"
}

filter {
json
[ERROR] 2019-07-10 14:16:52.709 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

Looks like you're combining message and target. If the source file is 100% JSON, use a codec in your file input: plugins-inputs-file-codec
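
For example, the file input could look something like this (a sketch, assuming each line of the file is a complete JSON object so the json codec can decode it):

input {
  file {
    codec => json
    path => "/home/Json_Data/4365d831-461e-4e34-a531-0388404dc06b.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}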


I now have the following conf file:

input {
  file {
    codec => json
    path => "/home/Json_Data/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout {}
}

Logstash seems to be launching, but I can't see my data loaded in Kibana (I'm one day old in ELK, so I don't understand everything yet):

Thread.exclusive is deprecated, use Thread::Mutex
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-07-10 14:29:11.892 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-07-10 14:29:11.910 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.2.0"}
[INFO ] 2019-07-10 14:29:20.399 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[WARN ] 2019-07-10 14:29:20.631 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://localhost:9200/"}
[INFO ] 2019-07-10 14:29:20.837 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2019-07-10 14:29:20.841 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[INFO ] 2019-07-10 14:29:20.873 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[INFO ] 2019-07-10 14:29:20.946 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[WARN ] 2019-07-10 14:29:21.014 [[main]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[INFO ] 2019-07-10 14:29:21.018 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x3652dfb6 run>"}
[INFO ] 2019-07-10 14:29:21.085 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ] 2019-07-10 14:29:21.448 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2019-07-10 14:29:21.545 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2019-07-10 14:29:21.552 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2019-07-10 14:29:21.899 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

When posting configurations or logs please block quote them, which makes them much easier to read. In the edit pane, select the text of the log (or configuration) and click on </> in the toolbar above the edit pane. You should see the formatting change in the preview pane on the right.

Block quoted text appears like this

[INFO ] 2019-07-10 14:29:20.946 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[WARN ] 2019-07-10 14:29:21.014 [[main]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.

Your configuration includes a stdout output, but your logs do not include any events. That suggests the file input is not finding any files. If you are sure that the path option to the file input is correct then run logstash with '--log.level trace' on the command line and see what filewatch has to say.
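
For example, something like this (assuming the same config file as before):

sudo /usr/share/logstash/bin/logstash -f lucasFilter.conf --log.level trace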


[TRACE] 2019-07-10 14:45:39.730 [[main]<file] processor - Active - no change {"watched_file"=>"<FileWatch::WatchedFile: @filename='78ffd428-b93a-476e-8f39-f2de714138ff.json', @state='active', @recent_states='[:watched, :watched]', @bytes_read='94597', @bytes_unread='0', current_size='94597', last_stat_size='94597', file_open?='true', @initial=false, @sincedb_key='1059234 0 64770'>"}

I got plenty of these by running with --log.level trace.
It runs through all the files and then starts again.

(thanks for the quote tip)

OK, so it thinks it is reading the file. Are you not seeing documents in elasticsearch? I cannot explain that. If you are using Kibana be sure to use the time picker to choose a time range that includes the logs you are ingesting.
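
One way to check whether any documents actually reached Elasticsearch, independently of Kibana, is to query it directly (a sketch assuming Elasticsearch on localhost:9200 and the test-* index name from the config above):

curl 'localhost:9200/_cat/indices?v'
curl 'localhost:9200/test-*/_count?pretty'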

BTW, with a json codec there is no field called message, so the json filter is usually a no-op and can be removed. It is only useful if your JSON contains a field called message that has JSON nested in it.
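
In other words, with the codec doing the parsing, the whole pipeline could be trimmed to something like this (a sketch based on the config above, with the filter block removed):

input {
  file {
    codec => json
    path => "/home/Json_Data/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout {}
}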


I don't know if I'm doing this correctly.
I'm pressing "Check for new data"...
I'm at the right IP address, so that isn't the problem, I guess.
