Unable to upload data from CSV file to Elasticsearch cluster

Hi,

I am trying to upload data from a CSV file to Elasticsearch, and here is my Logstash config file:

logstash.conf

input {
  file {
    path => ["/home/ubuntu/logstash/logstash-6.4.3/auto.csv"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => [
      "dateCrawled", "name", "seller", "offerType", "price", "abtest",
      "vehicleType", "yearOfRegistration", "gearbox", "powerPS", "model",
      "kilometer", "monthOfRegistration", "fuelType", "brand",
      "notRepairedDamage", "dateCreated", "nrOfPictures", "postalCode",
      "lastSeen"
    ]
  }
}

output {
  elasticsearch {
    hosts => ["http://130.58.5.28:31228"]
    index => "cars"
  }

  stdout { codec => rubydebug }
}
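If it helps narrow things down, here is a minimal test I can run to separate the input stage from the rest of the pipeline (just a sketch; the filter and output blocks stay exactly as above, and test.conf is a throwaway name I am using). Piping the CSV over stdin bypasses the file input entirely, so if documents show up in Elasticsearch this way, the problem is in the file input settings rather than in the csv filter or the elasticsearch output. I have also read that the file input in tail mode only emits lines that end with a newline, so a missing trailing newline on the last row of auto.csv may be worth checking.

# test.conf -- identical to logstash.conf except for the input block
input {
  stdin { }
}
# filter { ... } and output { ... } copied unchanged from above

# run it with the CSV piped in:
# bin/logstash -f test.conf < /home/ubuntu/logstash/logstash-6.4.3/auto.csv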

When I start Logstash, the output looks like this, but no records are uploaded to Elasticsearch. Can you please help me figure out the actual issue in my configuration?
I am very new to the ELK stack.

ubuntu@ip-172-31-10-102:~/logstash/logstash-6.4.3$ bin/logstash -f /home/ubuntu/logstash/logstash-6.4.3/logstash.conf
Sending Logstash logs to /home/ubuntu/logstash/logstash-6.4.3/logs which is now configured via log4j2.properties
[2018-11-12T21:32:14,074][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-11-12T21:32:14,665][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-12T21:32:18,002][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-11-12T21:32:18,456][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://130.58.5.28:31228/]}}
[2018-11-12T21:32:18,465][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://130.58.5.28:31228/, :path=>"/"}
[2018-11-12T21:32:18,647][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://130.58.5.28:31228/"}
[2018-11-12T21:32:18,714][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-11-12T21:32:18,717][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-11-12T21:32:18,759][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://130.58.5.28:31228"]}
[2018-11-12T21:32:18,783][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-11-12T21:32:18,811][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-12T21:32:19,166][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3129e460 run>"}
[2018-11-12T21:32:19,233][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-12T21:32:19,287][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-11-12T21:32:19,661][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
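For reference, this is how I am verifying that nothing reached the index (using the standard _count API against the same host and the cars index from the config above):

curl 'http://130.58.5.28:31228/cars/_count?pretty'
# a successful run should report a count equal to the number of CSV rows ingested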
