Can't upload CSV file to Elasticsearch via Logstash

I have gone through all the questions related to this topic but haven't found a solution yet.

logstash-car.conf:
input {
  file {
    path => "C:\Users\vedansh.p\Desktop\ELK\CSVLogsDemo\cars.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => "plain"
  }
}

filter {
  csv {
    separator => ","
    columns => ["maker","model","mileage","manufacture_year","engine_displacement","engine_power","body_type","color_slug","stk_year","transmission","door_count","seat_count","fuel_type","date_created","date_last_seen","price_eur"]
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "cars"
  }
  stdout {}
}
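For reference, I start the pipeline on Windows from the Logstash home directory roughly like this (exact config file location assumed):

bin\logstash.bat -f logstash-car.conf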

----------------------------------------------------- Output -------------------------------------------------------------

Sending Logstash logs to C:/Users/vedansh.p/Desktop/ELK/logstash-7.3.1/logstash-7.3.1/logs which is now configured via log4j2.properties
[2019-09-13T11:27:33,147][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-09-13T11:27:33,414][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.1"}
[2019-09-13T11:27:39,960][INFO ][org.reflections.Reflections] Reflections took 240 ms to scan 1 urls, producing 19 keys and 39 values
[2019-09-13T11:27:51,446][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-09-13T11:34:55,983][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-09-13T11:34:57,350][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-09-13T11:34:57,570][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-09-13T11:34:57,934][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-09-13T11:34:58,158][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-09-13T11:34:58,694][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-09-13T11:34:58,890][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x1a2571ed run>"}
[2019-09-13T11:34:59,298][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-09-13T11:35:15,048][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-09-13T11:35:15,251][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-09-13T11:35:15,486][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-09-13T11:35:16,923][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

----------------------------------- Output (in debug mode, it keeps repeating this loop) ----------------------------------------------------------

[2019-09-13T11:11:12,628][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-09-13T11:11:13,080][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-09-13T11:11:13,080][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-09-13T11:11:14,722][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2019-09-13T11:11:17,632][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-09-13T11:11:18,093][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-09-13T11:11:18,093][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-09-13T11:11:19,722][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2019-09-13T11:11:22,737][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

Do not use backslashes in the path option of a file input; use forward slashes.
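For example, with forward slashes your input block would look like this (a sketch of the same config, only the path changed):

input {
  file {
    path => "C:/Users/vedansh.p/Desktop/ELK/CSVLogsDemo/cars.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => "plain"
  }
}

Once Logstash picks up the file, you can confirm documents are arriving with:

curl http://localhost:9200/cars/_count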

Thanks
