Unable to parse CSV input from Filebeat -> Logstash

Hi,

I am trying to read a CSV file with Filebeat and send the data to Logstash.
In Logstash, I need to parse the CSV data and load it into Elasticsearch.
I am able to ship the data from Filebeat through Logstash to Elasticsearch using the csv filter plugin;
however, the events are not split into their respective columns: the index has no fields corresponding to the CSV column headers,
and the entire line appears to be dumped into the "message" field.
I tried multiple options, but no luck.
Here are my configurations for reference:

FileCsv.csv

ProcessorTime, Rate, Interval, Dontwantthis, WantThis, DontWantThisOneToo
Sample1, 2, 3, A, 4, B
Sample2, 2, 3, Y, 4, Z

Filebeat.conf:

filebeat.inputs:
- type: log
  paths:
    - /Users/amit_joshi/ElasticData/spool/FileReader*.log
  include_lines: ['^FINEST']
  fields:
    document_type: detectionLogs

- type: log
  paths:
    - /Users/amit_joshi/ElasticData/spool/FileCsv*.csv
  exclude_lines: ['^ProcessorTime']
  fields:
    document_type: perfMonLogs

output.logstash:
  hosts: ["localhost:5044"]

Logstash.conf:

input {
  beats {
    port => 5044
  }
}
filter {
  if [fields][log_type] == "detectionLogs" {
    # Some processing statements
  }
  else if [fields][log_type] == "perfMonLogs" {
    csv {
      columns   => ["ProcessorTime", "Rate", "Interval", "Dontwantthis", "WantThis", "DontWantThisOneToo"]
      separator => ","
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "mock_20march_2019"
    #user => "elastic"
    #password => "changeme"
  }
  stdout { codec => rubydebug }
}
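
Once the rows do get split, I also want to drop the columns I don't need. I was thinking of something like this (remove_field is a common option available on filters and is only applied if the csv parse succeeds):

    csv {
      columns      => ["ProcessorTime", "Rate", "Interval", "Dontwantthis", "WantThis", "DontWantThisOneToo"]
      separator    => ","
      remove_field => ["Dontwantthis", "DontWantThisOneToo"]
    }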

You add a field called document_type, but you test a field called log_type. Neither the if nor the else if will match.
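
In other words, either change the conditionals inside the filter block to test the field that Filebeat actually sets, roughly:

  if [fields][document_type] == "detectionLogs" {
    # Some processing statements
  }
  else if [fields][document_type] == "perfMonLogs" {
    csv {
      columns   => ["ProcessorTime", "Rate", "Interval", "Dontwantthis", "WantThis", "DontWantThisOneToo"]
      separator => ","
    }
  }

or rename the field in the Filebeat configuration so that [fields][log_type] exists on the events.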

@Badger, that's a bad miss on my part. Changing the field name in the Filebeat config to "log_type" resolved the issue.
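
For reference, the CSV input in the Filebeat config now looks roughly like this, with the key under fields matching what the Logstash conditional tests:

- type: log
  paths:
    - /Users/amit_joshi/ElasticData/spool/FileCsv*.csv
  exclude_lines: ['^ProcessorTime']
  fields:
    log_type: perfMonLogs

The conditionals in Logstash are left testing [fields][log_type].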
Thanks a ton!
