Kibana couldn't find any Elasticsearch data?

Dataset: cars.csv, from the Kaggle dataset all_anonymized_2015_11_2017_03.csv

Logstash terminal output:
]$ bin/logstash -f /home/elk/data/logstash.config
[2018-03-22T17:48:18,821][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-03-22T17:48:18,832][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-03-22T17:48:19,049][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-03-22T17:48:19,118][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-03-22T17:48:19,124][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-03-22T17:48:19,156][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-03-22T17:48:19,176][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-03-22T17:48:19,224][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-03-22T17:48:19,837][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2689c8f6 run>"}
[2018-03-22T17:48:19,979][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

It gets stuck here and no documents are indexed.

Logstash config:
input {
  file {
    path => "/home/elk/data/cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["maker","model","mileage","manufacture_year","engine_displacement","engine_power","body_type","color_slug","stk_year","transmission","door_count","seat_count","fuel_type","date_created","date_last_seen","price_eur"]
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout {}
}
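Since the file input just tails whatever path it is given, a typo in `path` fails silently: the pipeline starts cleanly and then waits forever for data that never arrives. A quick sanity check to run before launching Logstash (a sketch; CSV_PATH is the `path =>` value from the config above, adjust to yours):

```shell
#!/bin/sh
# Sketch: verify the CSV referenced by the Logstash file input actually exists
# and is readable, BEFORE starting the pipeline.
# CSV_PATH is assumed from the config above; change it to match your setup.
CSV_PATH="/home/elk/data/cars.csv"

if [ -r "$CSV_PATH" ]; then
  echo "OK: $CSV_PATH exists and is readable"
else
  echo "MISSING: $CSV_PATH (check for typos like car.csv vs cars.csv, and permissions)"
fi
```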

Checking http://localhost:9200/_cat/indices?v shows only the header row, so the cars index was never created:
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size

For testing, I manually POSTed a document to test/doc, and that works fine.

Can anyone help with this?

I found the error: the actual file on disk was named car.csv, but I had written cars.csv in the config.

If anyone else hits this, check the Logstash config file thoroughly, the file path in particular.
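One way to audit a config for this class of mistake is to pull every `path =>` value out of it and check that each one exists on disk. A rough sketch; here a demo config is written to a temp file so the snippet is self-contained, but you would point CONFIG at your real /home/elk/data/logstash.config instead:

```shell
#!/bin/sh
# Sketch: flag file paths in a Logstash config that don't exist on disk.
# A demo config is created here for illustration; set CONFIG to your real file.
CONFIG=$(mktemp)
cat > "$CONFIG" <<'EOF'
input {
  file {
    path => "/home/elk/data/cars.csv"
  }
}
EOF

# Extract each quoted path => "..." value and test it with -e.
grep -o 'path => "[^"]*"' "$CONFIG" |
sed 's/.*"\(.*\)"/\1/' |
while read -r p; do
  if [ -e "$p" ]; then
    echo "OK      $p"
  else
    echo "MISSING $p"   # a typo like car.csv vs cars.csv shows up here
  fi
done
rm -f "$CONFIG"
```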

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.