Hi everyone,
I am completely new to the ELK stack and have been following several tutorials to set up Logstash and create an index from my CSV file. Even after following every step and using the config files provided with the course, I keep getting errors. This is the one that has been most persistent these last few days:
Javier$ bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf
Sending Logstash logs to /Users/Javier/ELK/logstash-7.9.2/logs which is now configured via log4j2.properties
[2020-10-05T13:54:01,806][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.9.2", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.121-b13 on 1.8.0_121-b13 +indy +jit [darwin-x86_64]"}
[2020-10-05T13:54:02,731][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-10-05T13:54:07,361][INFO ][org.reflections.Reflections] Reflections took 207 ms to scan 1 urls, producing 22 keys and 45 values
[2020-10-05T13:54:09,265][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"cars", id=>"1f92dfd58e6841dbb87dc7f13c0da9925b7618ee9f475f54adf02e5e0df44b60", hosts=>[//localhost], document_type=>"sold_cars", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_b6547fdc-2c91-485e-a0b0-9ac846cc5895", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", ecs_compatibility=>:disabled, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-10-05T13:54:12,472][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-10-05T13:54:12,873][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-10-05T13:54:13,115][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-10-05T13:54:13,125][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-05T13:54:13,288][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2020-10-05T13:54:13,410][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2020-10-05T13:54:13,587][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/Users/Javier/ELK/Data/logstash-cars.conf"], :thread=>"#<Thread:0x7015086e run>"}
[2020-10-05T13:54:13,601][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-10-05T13:54:16,847][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>3.24}
[2020-10-05T13:54:18,337][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-10-05T13:54:18,904][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-10-05T13:54:23,723][INFO ][logstash.runner ] Logstash shut down.
[2020-10-05T13:54:23,756][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
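The only ERROR lines are the failed PipelineAction::Create<main> and the final SystemExit, which do not tell me much on their own. If more detail would help, I can rerun with the config test or with debug logging, e.g.:

Javier$ bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf --config.test_and_exit
Javier$ bin/logstash -f /Users/Javier/ELK/Data/logstash-cars.conf --log.level=debug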
This is the config file downloaded from the course:
input {
  file {
    path => "cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ['maker', 'model', 'mileage', 'manufacture_year', 'engine_displacement', 'engine_power', 'body_type', 'color_slug', 'stk_year', 'transmission', 'door_count', 'seat_count', 'fuel_type', 'date_created', 'date_last_seen', 'price_eur']
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "integer"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout {}
}
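Two things in this config I was unsure about: the path is relative ("cars.csv"), even though the file input documentation says paths must be absolute, and the log above warns that document_type is deprecated. Would a variant like this be worth trying? (I am only guessing that the csv sits next to the config file; that location is an assumption on my part.)

input {
  file {
    # absolute path -- assuming cars.csv lives next to the .conf file
    path => "/Users/Javier/ELK/Data/cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    # document_type removed, since mapping types are gone in Elasticsearch 7
  }
  stdout {}
}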
I am running macOS 10.14.3 with ELK 7.9.2.
Let me know if there are any more details I can provide; I have checked pretty much every thread I could find and still have no clue what could have gone wrong.
Thank you for your help.