How to send a CSV file to Logstash using Filebeat

Hello,
I'm trying to send CSV data to Logstash using Filebeat, but I don't get the data in Kibana.

filebeat.yml

- input_type: log
  paths:
    - C:\ELK\filebeat-5.6.2\logs*.csv
    - C:\ELK\filebeat-5.6.2\logs\test.log
  document_type: log-reports

logstash.conf

input {
  beats {
    port => 5044
  }
}
filter {
  if "log-facturation" in [type] {
    grok {
      # square brackets must be escaped so they match literally
      match => { "message" => "%{TIMESTAMP_ISO8601:date} %{LOGLEVEL:level} \[%{DATA:fonction}\] %{GREEDYDATA:source} - %{GREEDYDATA:message}" }
    }
    date {
      match => [ "date", "ISO8601" ]
    }
  }
  if "log-reports" in [type] {
    csv {
    }
  }
}
output {
  elasticsearch {
    index => "toto"
    hosts => "localhost:9200"
  }
}
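As a side note, an empty csv filter only splits the message into generic column1, column2, ... fields. If the columns in the CSV are known, naming them explicitly makes the documents much easier to work with in Kibana. A minimal sketch (the column names and separator below are placeholders, not from the actual file):

    filter {
      if "log-reports" in [type] {
        csv {
          separator => ","
          # placeholder names; replace with the real CSV header fields
          columns => ["timestamp", "user", "amount"]
        }
      }
    }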

Thank you

Look in your logs (both Filebeat and Logstash). Does Filebeat appear to be having issues sending to Logstash? Does Logstash appear to be having issues sending to Elasticsearch?

Hello, thank you for your answer.
No, both of them are running fine; I only have a problem when sending a CSV file.

If Filebeat is actually sending anything to Logstash, there should be signs of it in the log, definitely if you increase the log level. To remove an error source, you can replace the elasticsearch output with e.g. a file output that dumps the messages to a file in /tmp. What does the rest of your Filebeat configuration look like? (Non-comment lines only, please, and format it as preformatted text so it doesn't get mangled.)
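For reference, such a debugging output could look like this (the path is just an example):

    output {
      # temporary debugging output; remove once the pipeline works
      file {
        path => "/tmp/logstash-debug.log"
      }
    }

Once events show up in that file, you know the Filebeat-to-Logstash leg works and can switch back to the elasticsearch output.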

Hello,
Sorry for the late reply :frowning: the problem was the file encoding. I created a new file on my desktop, filled it with the contents of the old file, put the new path in filebeat.yml, and it worked.
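For anyone hitting the same issue: instead of recreating the file, Filebeat can be told the file's encoding via the prospector's encoding option. A sketch, assuming the original file was UTF-16 little-endian (common for files saved by Windows tools; adjust to the actual encoding):

    - input_type: log
      paths:
        - C:\ELK\filebeat-5.6.2\logs*.csv
      document_type: log-reports
      # assumption: the file is UTF-16 LE; change to match the real encoding
      encoding: utf-16le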
Thank you so much M.BAECK

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.