Viewing a CSV file in Kibana

Hi all,

I have successfully loaded a CSV file into Logstash, but I'm having problems visualizing it in Kibana. I believe the problem is in the filter section. Here is my configuration:

> input {
>   file {
>     path => "/home/bm1391/reports/report-0c0ed529-ceda-4a2f-a9a6-6abad6d3ee05.csv"
>     start_position => "beginning"
>     # read the file from the beginning
>     sincedb_path => "/dev/null"
>   }
> }
> 
> filter {
>     csv {
>         separator => ","
>         columns => ["IP", "Hostname", "OS", "Scan Start", "Scan End", "CVSS", "Severity", "High", "Medium", "Low", "Log", "False Positive", "Total"]
>     }
>     mutate {
>         convert => {
>             "High"   => "integer"
>             "Medium" => "integer"
>             "Low"    => "integer"
>         }
>     }
> }
> 
> output {
>   elasticsearch {
>     action => "index"
>     hosts => ["10.100.1.16:9200"]
>     index => "testing"
>     workers => 1
>   }
>   stdout {}
> }
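To sanity-check the filter logic outside Logstash, here is a minimal Python sketch of what the csv + mutate steps should produce for one event. This is illustrative only, not Logstash itself; the column list and the integer fields match the config above, and the sample row is the one from the report.

```python
# Illustrative sketch of the csv + mutate filters from the config above:
# split one CSV line into named fields, then coerce the count columns to int.
import csv
import io

COLUMNS = ["IP", "Hostname", "OS", "Scan Start", "Scan End", "CVSS",
           "Severity", "High", "Medium", "Low", "Log", "False Positive", "Total"]
INTEGER_FIELDS = ["High", "Medium", "Low"]  # fields the mutate filter converts

def parse_row(line):
    """Parse one CSV line into a dict keyed by column name, with counts as ints."""
    values = next(csv.reader(io.StringIO(line)))
    event = dict(zip(COLUMNS, values))
    for field in INTEGER_FIELDS:
        event[field] = int(event[field])
    return event

row = ("192.168.1.1,Wireless_Broadband_Router.home,cpe:/o:linux:linux_kernel:2.6,"
       "2017-08-27T14:12:25Z,2017-08-27T14:31:46Z,6.8,Medium,0,23,1,42,0,66")
event = parse_row(row)
print(event["IP"], event["Medium"])
```

If the counts come out as Python ints here, the filter logic itself is sound; any remaining string-typed fields would then point at the Elasticsearch index mapping rather than the pipeline.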

Here is a sample of my csv file:

IP,Hostname,OS,Scan Start,Scan End,CVSS,Severity,High,Medium,Low,Log,False Positive,Total
192.168.1.1,Wireless_Broadband_Router.home,cpe:/o:linux:linux_kernel:2.6,2017-08-27T14:12:25Z,2017-08-27T14:31:46Z,6.8,Medium,0,23,1,42,0,66

I would basically like to chart the values by IP address, for example a simple bar chart (x-axis: IP, with separate bars for Low, Medium, High, and so on).
Any help pointing me in the right direction would be great.

What problems are you having?

I believe the problem occurs when I try to map the results: all the field types come out as strings. The convert-to-integer step in the mutate filter above doesn't seem to take effect.

What does the mapping look like?
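One way to answer that is to fetch GET testing/_mapping from Elasticsearch and look at each field's type. Below is a small sketch of inspecting such a response; the sample mapping is hypothetical (including the "logs" document type name), showing what it would look like if the convert never took effect and everything was dynamically mapped as text.

```python
import json

# Hypothetical example of a "GET testing/_mapping" response in which every
# field was dynamically mapped as text (i.e. the convert never happened).
sample_mapping = json.loads("""
{
  "testing": {
    "mappings": {
      "logs": {
        "properties": {
          "IP":   {"type": "text", "fields": {"keyword": {"type": "keyword"}}},
          "High": {"type": "text", "fields": {"keyword": {"type": "keyword"}}}
        }
      }
    }
  }
}
""")

def field_types(mapping, index, doc_type):
    """Return {field: type} so string fields that should be numeric stand out."""
    props = mapping[index]["mappings"][doc_type]["properties"]
    return {name: spec["type"] for name, spec in props.items()}

print(field_types(sample_mapping, "testing", "logs"))
```

Worth noting: a field's type is fixed when the index first maps it, so if High shows up as text, changing the Logstash filter alone won't retype it; the testing index would need to be deleted (or the data reindexed) after the convert is working.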

Well, actually the pipeline now starts fine and there are no errors in the logs, but looking at my fields, they aren't appearing. Look here:

Here is a better picture of the CSV file:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.