Logstash output not recognising columns in Kibana


(bobwilly) #1

I'm trying to get my .csv file into Kibana for visualisation. It feels like I'm close to getting it to work, but I can't figure out how to get my output right.

In Kibana I see my .csv file as:
message: News,test@email.com,10.10.10.10
It looks like my CSV output ends up in a single field called message. I would like three separate fields instead: Name, Email, IP. I have tried a lot of CSV files and different configurations, but no success yet.

CSV FILE:
Name,Email,IP
Auto,auto@newsuk,10.0.0.196
News,test@email.com,10.10.10.10
nieuwsbrieven,nieuwsbrieven@nl,10.10.10.10
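For reference, this is the kind of split I'm after — each line broken into the three fields named in the header row. A minimal sketch outside Logstash (plain Python, with the sample rows inlined):

```python
import csv
import io

# Sample rows from the CSV file above; field names come from its header line.
data = """Name,Email,IP
Auto,auto@newsuk,10.0.0.196
News,test@email.com,10.10.10.10
nieuwsbrieven,nieuwsbrieven@nl,10.10.10.10
"""

# DictReader maps each row to the header columns, which is what the
# Logstash csv filter should do with matching `columns` settings.
rows = list(csv.DictReader(io.StringIO(data)))
print(rows[1]["Name"], rows[1]["Email"], rows[1]["IP"])
```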

CONF file:
input {
  file {
    path => "C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv"
    start_position => "beginning"
    sincedb_path => "NUL"  # "/dev/null" does not exist on Windows; NUL is the equivalent
  }
}

filter {
  csv {
    separator => ","
    columns => ["Name","Email","IP"]  # must match the CSV header fields
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv_index"
  }
  stdout {}
}
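Since the filebeat.yml below points output.logstash at port 5044, I assume the pipeline would instead need a beats input listening on that port for events to pass through the csv filter (a sketch using the standard beats input plugin; the port is taken from the Filebeat config):

```
input {
  beats {
    port => 5044
  }
}
```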
filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - C:\Users\JOEY2\Desktop\Deelproblemen\Applicatie\Output\test.csv

# Filebeat allows only one output at a time; to run events through the
# Logstash csv filter, ship to Logstash and let it write to Elasticsearch.
#output.elasticsearch:
#  hosts: ["localhost:9200"]
#  template.name: "testttt"
#  template.overwrite: true

output.logstash:
  hosts: ["localhost:5044"]
Thanks for your time.


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.