Logstash gets stuck in pipelines

I have a simple CSV file that I would like to upload to Elasticsearch. My sample CSV file contains 2 records. It gets stuck at "Pipelines running". Please see below. I am running this on Windows 11.

Thank you for your help.

upload.conf

input {
    file {
        path => "c:/elastic/data2/sample1.txt"
        start_position => "beginning"
        sincedb_path => "NUL"
        codec => plain { charset => "UTF-8" }
    }
}    
filter {
  csv {
    skip_header => true
    skip_empty_rows => true
    separator => ","
    autodetect_column_names => true
    skip_header => true
   
  }
}
output {
    elasticsearch{
        
        hosts => "http://192.168.1.165:9200"
        index => "sample-students"
        user => "xxxxx"
        password => "xxxxxx"
    }
    stdout{ codec => rubydebug }
}


___________________________________
sample1.txt

name,age
sk,63
mk,34



_______________________________________
C:\elastic\logstash-8.8.2\bin>logstash -f c:\elastic\data2\upload.conf 

[2023-06-29T20:00:01,017][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

It gets stuck here.

The plain codec "is for plain text with no delimiting between events". Text files have newlines between events. Try the line codec.
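For reference, a minimal sketch of the suggested change, with the path, start position, and charset carried over unchanged from the config above:

```conf
input {
    file {
        path => "c:/elastic/data2/sample1.txt"
        start_position => "beginning"
        sincedb_path => "NUL"
        # line codec instead of plain, as suggested
        codec => line { charset => "UTF-8" }
    }
}
```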

Took the codec line out. No change.

Can no one help with reading a simple CSV file?

Hi,

You have skip_header => true twice in the filter section. Please remove the duplicate and try again. How do you start the pipeline? Is the path to the pipeline configuration correct?
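For comparison, a sketch of the filter section with the duplicate setting removed (all other options kept as in the original config):

```conf
filter {
  csv {
    separator => ","
    skip_header => true
    skip_empty_rows => true
    autodetect_column_names => true
  }
}
```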

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.