Configuring Filebeat to transport CSV

I'm trying to learn Elastic.
For my first use case I'm trying to push CSV data to Elasticsearch so I can read it in Kibana.
Elasticsearch has been installed on the server side.
I'm trying to push the data with Filebeat from my PC, which is the client side.
I've downloaded Filebeat on my PC.

Here is my filebeat.yml file:

filebeat.inputs:
- input_type: log
  paths:
    - C:\Users\Charles\Desktop\DATA\BrentOilPrices.csv
  document_type: test_log_csv

output.logstash:
  hosts: ["10.64.2.246:5044"]

I've tested it with ./filebeat test config
and it returns "Config OK".
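Since the config parses, it is also worth checking that Filebeat can actually reach the Logstash endpoint. Filebeat has a built-in connectivity test for the configured output:

    ./filebeat test output

If this cannot establish a TCP connection to 10.64.2.246:5044, the problem is networking (firewall, closed port, Logstash not listening) rather than the pipeline configuration.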

Here is my logstash.conf file on the server side:
input {
  beats {
    port => 5044
  }
}

filter {
  if "test_log_csv" in [type] {
    csv {
      columns => ["Date","Price"]
      separator => ","
    }
    mutate {
      convert => ["Price","integer"]
    }
    date {
      match => ["Date","d/MMM/yy"]
    }
  }
}
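As a sanity check on the filter logic itself, a few lines of Python can show what the csv/mutate/date steps should produce for one row, independently of Logstash. The sample row below is hypothetical (I'm assuming dates like 20-May-87, as in the commonly used Brent dataset; if that is your format, the Logstash date pattern would need to be dd-MMM-yy rather than d/MMM/yy). Note also that converting Price to integer truncates the decimals, so float is probably the intended type:

```python
import csv
import io
from datetime import datetime

# Hypothetical row; the real BrentOilPrices.csv layout may differ.
sample = "20-May-87,18.63\n"

date_str, price_str = next(csv.reader(io.StringIO(sample)))

# Prices carry decimals, so convert to float (an integer convert would truncate).
price = float(price_str)

# Python's %d-%b-%y corresponds to the Joda pattern dd-MMM-yy in Logstash.
parsed = datetime.strptime(date_str, "%d-%b-%y")

print(parsed.date(), price)  # 1987-05-20 18.63
```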

output {
  if "test_log_csv" in [type] {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "test_log_csv%{+d/MM/yy}"
    }
  }
}
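Two things in this output section are worth double-checking. First, / is not a legal character in Elasticsearch index names, so the sprintf date in "test_log_csv%{+d/MM/yy}" will make every indexing request fail; a dotted date pattern is the usual convention. Second, Filebeat 6.x+ no longer sets the type field from document_type, and a value placed under fields arrives as [fields][document_type], so the conditional has to match that instead. A sketch of the adjusted output, under those assumptions:

    output {
      if [fields][document_type] == "test_log_csv" {
        elasticsearch {
          hosts => ["127.0.0.1:9200"]
          index => "test_log_csv-%{+YYYY.MM.dd}"
        }
      }
    }

The same [fields][document_type] condition would also need to replace the "in [type]" check in the filter block.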

I thought everything was done,
so I ran Start-Service filebeat on my client PC,
but nothing shows up in Kibana.
Did I miss anything?

I edited my filebeat.yml into:
filebeat.inputs:
- input_type: log
  paths:
    - 'C:\Users\Charles\Desktop\DATA\BrentOilPrices.csv'
  fields:
    document_type: test_log_csv

output.logstash:
  hosts: ["10.64.2.246:5044"]

Nothing happened.
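One more thing that can make "nothing happened" misleading while testing: Filebeat records the read offset of every harvested file in its registry, so once the CSV has been read, restarting the service will not resend it. Stopping the service, deleting the registry, and restarting forces a full re-read (the registry path below is an assumption; it lives in the data folder of your Filebeat install):

    Stop-Service filebeat
    Remove-Item -Recurse 'C:\Program Files\Filebeat\data\registry'   # path is an assumption
    Start-Service filebeat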

Hi there,

You have to create an index pattern in Kibana. Follow the link below.

https://www.elastic.co/guide/en/kibana/current/index-patterns.html

Hi, thank you so much for your reply. But I've set the index name to test_log_csv, and when I search for it under Create index pattern, no such name shows up.
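An index won't appear on the Create index pattern screen until at least one document has actually been indexed, so this usually means ingestion is failing upstream (Filebeat never sent anything, or Elasticsearch rejected the documents). You can check which indices exist directly against Elasticsearch, run on the server and assuming the default port:

    curl 'http://127.0.0.1:9200/_cat/indices?v'

If no test_log_csv* index is listed there, the problem is in the Filebeat/Logstash pipeline, not in Kibana.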