Logstash seems to be working but no indices in Kibana

Hi everyone, I'm new to the ELK stack.
Here is my Logstash conf file:
input {
  file {
    path => "C:\Users\umutc\Desktop\kibanaproject\organisations.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Index","Organization_Id","Name","Website","Country","Description","Founded","Industry","Number_of_employees"]
  }
  date {
    match => ["Founded", "YYYY"]
    target => "Founded"
  }
  mutate { convert => ["Index", "integer"] }
  mutate { convert => ["Number_of_employees", "integer"] }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    ssl_certificate_verification => false
    user => "elastic"
    password => "password"
    index => "organisations"
  }
  stdout { codec => json_lines }
}

Here is what my CSV file looks like:

Problem solved: the issue was the path "C:\Users\umutc\Desktop\kibanaproject\organisations.csv"
I should have made all these slashes lean the opposite way, i.e. used forward slashes instead of backslashes...
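For anyone hitting the same issue, here is a sketch of the corrected file input (same path as above, only the slash direction changed; forward slashes work fine in Windows paths for Logstash):

  input {
    file {
      path => "C:/Users/umutc/Desktop/kibanaproject/organisations.csv"
      start_position => "beginning"
      # "NUL" is the Windows equivalent of /dev/null, so no sincedb state is persisted
      sincedb_path => "NUL"
    }
  }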


Maybe this is useful...
If your CSV file has a header row, you don't need to set the column names manually; the csv filter can detect them. Also set pipeline.workers: 1 in logstash.yml so the header line is processed before the data lines.

  csv {
    separator => ","
    autodetect_column_names => true
    skip_header => false
  }

Also, the rubydebug codec is more readable for displaying the output on the screen.
stdout { codec => rubydebug }

Thanks so much. It will help a lot.