Loading a CSV file

Hi,
I'm trying to load a CSV file (stored on my Windows machine) into Elasticsearch via Logstash, and then visualize it in Kibana. I have everything installed on my Windows machine.

P.S. It's my first time using Logstash.

Can anybody help me with how to go about it (configuration)? Or maybe point me to a relevant document? :slight_smile:

Thank you,
Rahul

You will need a file input, a csv filter and an elasticsearch output.

See:
https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html (ignore filebeat for now)
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html
https://www.elastic.co/guide/en/logstash/current/data-deserialization.html (see csv section)

You should experiment with this:

input {
  generator {
    message => "stop,wait for it,go go go - man"
    count => 1
  }
}

filter {
  csv {
    separator => ","
    columns => ["red","amber","green"]
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

Paste a line from your file into the message quotes and put your real column names, in quotes, into the columns array.
Once you are happy that the filter is doing the right thing, replace the generator input with a file input section - see the doc links above.
Once you are happy with the output, replace the stdout section with an elasticsearch output.
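For reference, here is a rough sketch of what the final config might look like (the path, host, index name and columns below are placeholders - substitute your own):

input {
  file {
    # forward slashes work best for Windows paths
    path => "C:/path/to/your-file.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["red","amber","green"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Elasticsearch index names must be lowercase
    index => "my-csv-index"
  }
}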

Remember to delete the sincedb file before each run or the CSV file will not be re-read.
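During testing you can instead point sincedb at the Windows null device so nothing is persisted between runs (a common workaround - check the file input docs for your version):

input {
  file {
    path => "C:/path/to/your-file.csv"
    start_position => "beginning"
    # write the sincedb to the Windows null device so the file is re-read every run
    sincedb_path => "NUL"
  }
}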

Hi,
Thank you for the reply.

So I followed the steps. Logstash was not starting when the parent folder name had a space in it; now it starts. Then, in config/pipelines.yml, I wrote:

pipeline.id: events
path.config: "C:\logstash-6.1.1\config\Logstash.conf"

"Logstash.conf" being the conf file I created. And it contains :

input {
  file {
    path => "C:\Users\Desktop\Events-Timestamp.csv"
    start_position => beginning
  }
}

filter {
  csv {
    columns => ["siteid", "hostaddress", "hostname", "service", "state", "Timestamp"]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    host => "localhost"
    index => "Events-Timestamp"
  }
}

When I try to run Logstash, I get an error saying it could not read pipelines.yml.

I'm not sure where I'm going wrong. Please help.

Thank you,
Rahul

Did you follow the YAML syntax?

See: https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
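Each pipeline in pipelines.yml needs to be a YAML list entry (leading dash, two-space indent), and backslashes inside double quotes are treated as escape characters, so forward slashes are safer for Windows paths. Something along these lines (adjust the path to your install):

# pipelines.yml - one list entry per pipeline
- pipeline.id: events
  path.config: "C:/logstash-6.1.1/config/Logstash.conf"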

Yes, I got Logstash running. This is the output I received in the command prompt.

C:\logstash-6.1.1\bin>logstash -f logstash.conf

Sending Logstash's logs to C:/logstash-6.1.1/logs which is now configured via log4j2.properties
[2018-01-04T14:18:45,793][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/logstash-6.1.1/modules/fb_apache/configuration"}
[2018-01-04T14:18:45,824][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/logstash-6.1.1/modules/netflow/configuration"}
[2018-01-04T14:18:46,247][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-01-04T14:18:47,411][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.1.1"}
[2018-01-04T14:18:47,658][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"C:/logstash-6.1.1/bin/logstash.conf"}
[ERROR] 2018-01-04 14:18:47.689 [Ruby-0-Thread-1: C:\logstash-6.1.1\lib\bootstrap\environment.rb:6] sourceloader - No configuration found in the configured sources.
[2018-01-04T14:18:48,172][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Can you please tell me how I can visualize my data in Kibana?

Thank you,
Rahul

It is a complex topic. You should read up on that in the Kibana docs.
Use the Kibana forum for that.
Do a search for getting-started posts, then post your questions there.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.