Unable to run Logstash

Hi everyone,
I am completely new to the ELK stack and have been following several tutorials to set up Logstash and create an index from my CSV file. Even after following every step and using the provided config files, I get different errors. This is the one that has been most persistent these last few days:

Javier$ bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf
Sending Logstash logs to /Users/Javier/ELK/logstash-7.9.2/logs which is now configured via log4j2.properties
[2020-10-05T13:54:01,806][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.2", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.121-b13 on 1.8.0_121-b13 +indy +jit [darwin-x86_64]"}
[2020-10-05T13:54:02,731][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-10-05T13:54:07,361][INFO ][org.reflections.Reflections] Reflections took 207 ms to scan 1 urls, producing 22 keys and 45 values 
[2020-10-05T13:54:09,265][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"cars", id=>"1f92dfd58e6841dbb87dc7f13c0da9925b7618ee9f475f54adf02e5e0df44b60", hosts=>[//localhost], document_type=>"sold_cars", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_b6547fdc-2c91-485e-a0b0-9ac846cc5895", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", ecs_compatibility=>:disabled, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-10-05T13:54:12,472][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-10-05T13:54:12,873][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-10-05T13:54:13,115][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-10-05T13:54:13,125][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-05T13:54:13,288][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2020-10-05T13:54:13,410][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-10-05T13:54:13,587][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/Users/Javier/ELK/Data/logstash-cars.conf"], :thread=>"#<Thread:0x7015086e run>"}
[2020-10-05T13:54:13,601][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-10-05T13:54:16,847][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>3.24}
[2020-10-05T13:54:18,337][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-10-05T13:54:18,904][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-10-05T13:54:23,723][INFO ][logstash.runner          ] Logstash shut down.
[2020-10-05T13:54:23,756][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit 

This is the config file downloaded from the course:

input {
  file {
    path => "cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ['maker', 'model', 'mileage', 'manufacture_year', 'engine_displacement', 'engine_power', 'body_type', 'color_slug', 'stk_year', 'transmission', 'door_count', 'seat_count', 'fuel_type', 'date_created', 'date_last_seen', 'price_eur']
  }

  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "integer"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout {}
}

I am running macOS 10.14.3 with ELK 7.9.2.

Let me know if there are any more details I can provide; I've checked pretty much every thread I could find and I still have no clue what could have gone wrong.
Thank you for your help.

Try this: for the output hosts, please use

hosts => ["http://localhost:9200"]
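For completeness, the corrected output section would then look like this (a sketch based on the config posted above; since document_type is deprecated on a 7.x cluster, it can be dropped at the same time):

```
output {
  elasticsearch {
    # full URL, including scheme and port
    hosts => ["http://localhost:9200"]
    index => "cars"
  }
  stdout {}
}
```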

Hi @Javier_Stauffenberg, welcome to the community.

With sincedb_path => "/dev/null" set, on macOS you may need to run the Logstash command with sudo:

sudo ./bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf

Hi Fadjar,
I changed the host and tried your command, but another error shows up:

bin/logstash -f /Users/Javier/ELK/Data
Sending Logstash logs to /Users/Javier/ELK/logstash-7.9.2/logs which is now configured via log4j2.properties
[2020-10-05T15:48:14,528][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.2", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.121-b13 on 1.8.0_121-b13 +indy +jit [darwin-x86_64]"}
[2020-10-05T15:48:14,938][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
java.lang.OutOfMemoryError: Java heap space
Dumping heap to java_pid3419.hprof ...
Heap dump file created [576404189 bytes in 2.987 secs]
warning: thread "Agent thread" terminated with exception (report_on_exception is true):
java.lang.OutOfMemoryError: Java heap space
	at org.jruby.util.ByteList.ensure(ByteList.java:345)
	at org.jruby.RubyString.modify(RubyString.java:997)
	at org.jruby.RubyString.modifyExpand(RubyString.java:1007)
	at org.jruby.util.io.EncodingUtils.setStrBuf(EncodingUtils.java:1120)
	at org.jruby.util.io.OpenFile.fread(OpenFile.java:1783)
	at org.jruby.util.io.OpenFile.readAll(OpenFile.java:1695)
	at org.jruby.RubyIO.read(RubyIO.java:3061)
	at org.jruby.RubyIO.read(RubyIO.java:3049)
	at org.jruby.RubyIO.read(RubyIO.java:3781)
	at org.jruby.RubyIO$INVOKER$s$0$3$read.call(RubyIO$INVOKER$s$0$3$read.gen)
	at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)
	at java.lang.invoke.LambdaForm$DMH/1715606187.invokeVirtual_L6_L(LambdaForm$DMH)
	at java.lang.invoke.LambdaForm$BMH/2130242983.reinvoke(LambdaForm$BMH)
	at java.lang.invoke.LambdaForm$MH/291284958.linkToCallSite(LambdaForm$MH)
	at Users.Javier.ELK.logstash_minus_7_dot_9_dot_2.logstash_minus_core.lib.logstash.config.source.local.RUBY$block$read$1(/Users/Javier/ELK/logstash-7.9.2/logstash-core/lib/logstash/config/source/local.rb:87)
	at java.lang.invoke.LambdaForm$DMH/1221555852.invokeStatic_L6_L(LambdaForm$DMH)
	at java.lang.invoke.LambdaForm$BMH/366008009.reinvoke(LambdaForm$BMH)
	at java.lang.invoke.LambdaForm$MH/1868366224.invokeExact_MT(LambdaForm$MH)
	at org.jruby.runtime.CompiledIRBlockBody.yieldDirect(CompiledIRBlockBody.java:148)
	at org.jruby.runtime.BlockBody.yield(BlockBody.java:106)
	at org.jruby.runtime.Block.yield(Block.java:184)
	at org.jruby.RubyArray.each(RubyArray.java:1809)
	at org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen)
	at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroBlock.call(JavaMethod.java:555)
	at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:197)
	at java.lang.invoke.LambdaForm$DMH/1715606187.invokeVirtual_L6_L(LambdaForm$DMH)
	at java.lang.invoke.LambdaForm$BMH/1093707336.reinvoke(LambdaForm$BMH)
	at java.lang.invoke.LambdaForm$MH/291284958.linkToCallSite(LambdaForm$MH)
	at Users.Javier.ELK.logstash_minus_7_dot_9_dot_2.logstash_minus_core.lib.logstash.config.source.local.RUBY$method$read$0(/Users/Javier/ELK/logstash-7.9.2/logstash-core/lib/logstash/config/source/local.rb:77)
	at Users.Javier.ELK.logstash_minus_7_dot_9_dot_2.logstash_minus_core.lib.logstash.config.source.local.RUBY$method$read$0$__VARARGS__(/Users/Javier/ELK/logstash-7.9.2/logstash-core/lib/logstash/config/source/local.rb)
	at java.lang.invoke.LambdaForm$DMH/2054881392.invokeStatic_L7_L(LambdaForm$DMH)
	at java.lang.invoke.LambdaForm$MH/2081368312.invokeExact_MT(LambdaForm$MH)
[2020-10-05T15:48:24,124][ERROR][org.logstash.Logstash    ] java.lang.OutOfMemoryError: Java heap space

Hi Stephen,
Thank you. Running with sudo gives me the same error:

logstash-7.9.2 Javier$ sudo bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf
Password:
Sending Logstash logs to /Users/Javier/ELK/logstash-7.9.2/logs which is now configured via log4j2.properties
[2020-10-05T15:49:11,062][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.2", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.121-b13 on 1.8.0_121-b13 +indy +jit [darwin-x86_64]"}
[2020-10-05T15:49:11,573][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-10-05T15:49:14,989][INFO ][org.reflections.Reflections] Reflections took 77 ms to scan 1 urls, producing 22 keys and 45 values 
[2020-10-05T15:49:16,289][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"cars", id=>"1ccb6d17d167a3bcebcd66ed1f1d93befe7adc586fc150e4e6534f4e71e996cb", hosts=>[http://localhost:9200], document_type=>"sold_cars", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_c8331dc0-c66b-45b7-a2db-084db33bd2ae", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", ecs_compatibility=>:disabled, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-10-05T15:49:20,043][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-10-05T15:49:20,519][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-10-05T15:49:20,651][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-10-05T15:49:20,663][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-05T15:49:20,791][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-10-05T15:49:20,875][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-10-05T15:49:21,032][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-10-05T15:49:21,112][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/Users/Javier/ELK/Data/logstash-cars.conf"], :thread=>"#<Thread:0x75a0503c run>"}
[2020-10-05T15:49:22,903][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.78}
[2020-10-05T15:49:23,826][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-10-05T15:49:24,325][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-10-05T15:49:29,143][INFO ][logstash.runner          ] Logstash shut down.
[2020-10-05T15:49:29,172][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

Possibly unrelated, but it might be helpful: when I try running Kibana from the command line, I also get an error.

bin/kibana
  log   [13:53:10.543] [warning][plugins-discovery] Expect plugin "id" in camelCase, but found: beats_management
  log   [13:53:10.564] [warning][plugins-discovery] Expect plugin "id" in camelCase, but found: triggers_actions_ui
  log   [13:53:44.897] [info][plugins-service] Plugin "visTypeXy" is disabled.
  log   [13:53:44.898] [info][plugins-service] Plugin "auditTrail" is disabled.
  log   [13:53:45.741] [warning][config][deprecation] Config key [monitoring.cluster_alerts.email_notifications.email_address] will be required for email notifications to work in 8.0."
  log   [13:53:45.777] [fatal][root] Error: Port 5601 is already in use. Another instance of Kibana may be running!
    at Root.shutdown (/Users/Javier/ELK/kibana-7.9.2-darwin-x86_64/src/core/server/root/index.js:67:18)
    at Root.setup (/Users/Javier/ELK/kibana-7.9.2-darwin-x86_64/src/core/server/root/index.js:46:18)
    at process._tickCallback (internal/process/next_tick.js:68:7)

 FATAL  Error: Port 5601 is already in use. Another instance of Kibana may be running!

Kibana is actually already up and running on :5601 simply from launching Elasticsearch. I can't come up with an explanation, as I literally followed every step of the documentation to run each of them.
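To see what is actually holding the port, a quick check (port number taken from the error above; lsof ships with macOS) would be:

```shell
# Check whether anything is already listening on Kibana's default port 5601;
# prints the listener if found, otherwise reports the port as free
listener=$(lsof -nP -iTCP:5601 -sTCP:LISTEN 2>/dev/null || true)
echo "${listener:-port 5601 is free}"
```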

Try this and post the logs.

EDIT: corrected command

sudo bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf --log.level debug

Also, I'm not sure why you have two Kibanas running; we can't see your instructions :slight_smile:

It immediately throws an error on the keyword debug :confused:

sudo bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf -log.level debug
ERROR: Unknown command 'debug'

Trying without debug, I get something similar to the original log:

sudo bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf -log.level
Sending Logstash logs to og.level which is now configured via log4j2.properties
[2020-10-05T16:47:25,576][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.2", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.121-b13 on 1.8.0_121-b13 +indy +jit [darwin-x86_64]"}
[2020-10-05T16:47:26,333][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-10-05T16:47:30,345][INFO ][org.reflections.Reflections] Reflections took 205 ms to scan 1 urls, producing 22 keys and 45 values 
[2020-10-05T16:47:31,727][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"cars", id=>"1ccb6d17d167a3bcebcd66ed1f1d93befe7adc586fc150e4e6534f4e71e996cb", hosts=>[http://localhost:9200], document_type=>"sold_cars", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_10c72fec-1ace-4c45-9848-38ff0bbaceda", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", ecs_compatibility=>:disabled, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-10-05T16:47:35,269][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-10-05T16:47:35,703][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-10-05T16:47:35,840][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-10-05T16:47:35,848][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-05T16:47:35,944][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-10-05T16:47:36,057][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-10-05T16:47:36,200][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-10-05T16:47:36,205][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/Users/Javier/ELK/Data/logstash-cars.conf"], :thread=>"#<Thread:0x21ed4057 run>"}
[2020-10-05T16:47:39,234][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>3.02}
[2020-10-05T16:47:40,013][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-10-05T16:47:40,526][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-10-05T16:47:45,358][INFO ][logstash.runner          ] Logstash shut down.
[2020-10-05T16:47:45,398][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

That should be --log.level debug with two dashes.


Corrected, thanks @Badger.

Thank you both. The logs are above the character limit, so I'll split them across multiple posts.

Edit: It would take me more than 5 posts to show the entire log, so I uploaded it to a GitHub repo. I hope this is OK.
Logs

The important message is

Pipeline terminated by worker error {:pipeline_id=>"main", :exception=>#<ArgumentError: File paths must be absolute, relative path specified: cars.csv>

I was rather surprised to see that each pipeline worker thread compiles the configuration. Since all workers on a given pipeline run an identical configuration, that seems inefficient!


Shoot, I should have seen that right at the beginning. As @Badger says, you need a full path to the cars.csv file.
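For anyone landing here later, the fixed input block would look something like this (assuming the CSV lives in the same Data directory as the config file shown in the command line above):

```
input {
  file {
    # the file input requires an absolute path
    path => "/Users/Javier/ELK/Data/cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```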


Thank you for your help Stephen and Badger, it does work now!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.