Logstash error (no data) while ingesting CSV data with ELK version 8.11

I am facing an issue while ingesting a CSV file called housing_price_data.csv using Logstash. I am running ELK 8.11 with Docker. I did not want to enable any security, so there is no SSL, no passwords, and no certificates involved in my setup. All my containers (elasticsearch, kibana, and logstash) are active and healthy (green). I have attached my logstash.conf below:

input {
  file {
    path => "/logstash_dir/housing_price_data.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["SquareFeet","Bedrooms","Bathrooms","Neighborhood","YearBuilt","Price"]
    target => "housing_data"
  }
}

output {
  elasticsearch {
    index => "housing_data"
    hosts => ["http://es01:9200"]
  }

  #stdout { codec => rubydebug }
}

I keep getting the following output, and no data is read from the CSV file:

[2023-12-16T20:24:15,786][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-12-16T20:24:15,796][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://es01:9200"]}
[2023-12-16T20:24:15,941][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://es01:9200/]}}
[2023-12-16T20:24:16,023][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://es01:9200/"}
[2023-12-16T20:24:16,024][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.11.0) {:es_version=>8}
[2023-12-16T20:24:16,024][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2023-12-16T20:24:16,036][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"housing_data"}
[2023-12-16T20:24:16,037][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2023-12-16T20:24:16,061][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2023-12-16T20:24:16,063][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x67a1121c /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-12-16T20:24:16,115][INFO ][logstash.outputs.elasticsearch][main] Installing Elasticsearch template {:name=>"ecs-logstash"}
[2023-12-16T20:24:16,719][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.65}
[2023-12-16T20:24:16,728][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-12-16T20:24:16,736][INFO ][filewatch.observingtail  ][main][fe145da8c585a85d9a3afbb400fb78330d56c895f59cfe5997f432c23160bd05] START, creating Discoverer, Watch with file and sincedb collections
[2023-12-16T20:24:16,738][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

I cannot pinpoint the mistake.

There is no error in the logs you shared; you need to check your source file.

First, does your Logstash container have access to the /logstash_dir directory you shared?
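If it does not, the usual cause is a missing volume mount. A hedged sketch of the relevant docker-compose fragment (the service name and host-side paths are assumptions, not taken from your setup):

```yaml
# Hypothetical docker-compose fragment: mount the host directory holding the
# CSV into the container at the path the pipeline's file input expects.
services:
  logstash:
    volumes:
      - ./logstash_dir:/logstash_dir:ro            # exposes /logstash_dir/housing_price_data.csv
      - ./pipeline:/usr/share/logstash/pipeline:ro # the directory containing logstash.conf
```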

Second, what does the housing_price_data.csv file look like? Does it have more than one line?
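One detail worth checking here: the file input only emits a line once it sees the trailing newline delimiter, so a last row without a final newline is never read. A quick sanity-check sketch, run inside the container (the path is the one from your config; adjust as needed):

```shell
# Sketch: summarize a CSV before feeding it to Logstash.
csv_summary() {
    if [ ! -s "$1" ]; then
        echo "empty or missing: $1"
        return 1
    fi
    # tail -c 1 yields an empty string (after command substitution) only
    # when the last byte is a newline.
    if [ -n "$(tail -c 1 "$1")" ]; then
        echo "WARNING: no trailing newline; the last row may never be ingested"
    fi
    # wc -l counts newline-terminated lines only.
    echo "$(wc -l < "$1") complete lines"
    echo "header: $(head -n 1 "$1")"
}

csv_summary /logstash_dir/housing_price_data.csv
```

A file that reports zero complete lines, or only a header, would produce exactly the quiet pipeline you are seeing.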

Third, check whether the user that runs Logstash has permission to read housing_price_data.csv.
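A small sketch for that check, meant to be run inside the container (e.g. via docker exec; the container name "logstash" would be an assumption, substitute your own):

```shell
# Sketch: report whether the current user can read a file.
# Run inside the Logstash container, e.g.:
#   docker exec -it logstash bash
check_readable() {
    if [ -r "$1" ]; then
        echo "readable: $1"
    else
        echo "NOT readable: $1 -- check the volume mount and file permissions"
    fi
}

# Path taken from the question's config:
check_readable /logstash_dir/housing_price_data.csv
```

If it is not readable, `ls -l` on the file and `id` inside the container will show which ownership or mode bits need adjusting.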
