Hello,
sorry to bother you, but I am completely new to the ES stack. I just successfully started ES and Kibana, but I am having huge problems with LS. I am trying to load a CSV file with a simple structure into Kibana through LS. Now that I've managed to start LS, it doesn't load the files, and I can't find them in Kibana. I guess they're not being indexed, but I don't know why. Here is what LS writes in cmd — I would expect it to list the fields below (as in the many videos I watched).
Sending Logstash logs to C:/Users/BRIGADA/Downloads/logstash-6.5.1/logstash-6.5.1/logs which is now configured via log4j2.properties
[2018-12-10T13:41:43,647][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-12-10T13:41:43,667][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.1"}
[2018-12-10T13:41:46,509][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::Elasticsearch index=>"nahravky", id=>"fcb8a8061398d0977ffcc1e75bace87f47c70fb282e4365338b5c3a78fc581e0", hosts=>[http://localhost:9200], document_type=>"uzite_nahravky", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_2cfff973-13d1-4e80-80e4-9b1e6dddf517", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-12-10T13:41:48,019][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-10T13:41:48,459][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-12-10T13:41:48,469][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-12-10T13:41:48,629][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-12-10T13:41:48,679][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-12-10T13:41:48,689][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-12-10T13:41:48,709][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["http://localhost:9200"]}
[2018-12-10T13:41:48,729][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-12-10T13:41:48,749][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-12-10T13:41:49,199][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x61dbab96 run>"}
[2018-12-10T13:41:49,289][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-12-10T13:41:49,299][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-10T13:41:49,569][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
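For reference, my pipeline follows the usual CSV tutorial pattern, roughly like the sketch below — the file path and column names here are placeholders, not my real ones; only the index and document_type match the log above:

```conf
input {
  file {
    path => "C:/path/to/my.csv"      # placeholder path, not my real file
    start_position => "beginning"
    sincedb_path => "NUL"            # what tutorials suggest on Windows so the file is re-read
  }
}

filter {
  csv {
    separator => ","
    columns => ["col1", "col2"]      # placeholder column names
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "nahravky"
    document_type => "uzite_nahravky"
  }
}
```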
I hope that someone will be able to tell me what I have done wrong.
Thank you
J.