Logstash CSV output to Kibana

I'm trying to get my .csv file into Kibana for visualisation. It feels like I'm close to getting it to work, but I can't figure out how to get my output right.

In Kibana I see my .csv file as:
message: News,test@email.com,10.10.10.10
It looks like my CSV output ends up in a single field called message. I would like to get three separate fields: Name, Email and IP. I have tried a lot of CSV files and different configurations, but no success yet.

CSV FILE:
Name,Email,IP
Auto,auto@newsuk,10.0.0.196
News,test@email.com,10.10.10.10
ieuwsbrieven,nieuwsbrieven@nl,10.10.10.10

CONF file:
input {
  file {
    path => "C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv"
    start_position => "beginning"
    sincedb_path => "NUL"  # on Windows use NUL; /dev/null only exists on Unix-like systems
  }
}

filter {
  csv {
    separator => ","
    columns => ["Name","Email","IP"]  # must match the columns in the CSV file
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv_index"
  }
  stdout {}
}

filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv

# Filebeat only allows one output to be enabled at a time; since Logstash
# does the CSV parsing, the events should go there, not straight to Elasticsearch.
#output.elasticsearch:
#  hosts: ["localhost:9200"]
#  template.name: "testttt"
#  template.overwrite: true

output.logstash:
  hosts: ["localhost:5044"]

Thanks for your time.

Your configuration looks correct. Are you sure you're looking at up-to-date data in Elasticsearch? I suggest you comment out your elasticsearch output and use only a stdout { codec => rubydebug } output until you know the data is processed correctly.
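For reference, the debug-only output section I mean would look like this (a temporary replacement for your current output block, until the parsed fields show up correctly on the console):

```
output {
  # rubydebug prints each event with all its fields, so you can
  # verify that Name, Email and IP are parsed out of the message
  stdout { codec => rubydebug }
}
```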

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.