CSV file into ES through logstash

Newbie here. I am trying to ship a CSV file from a Windows server to Elasticsearch through Logstash. I am hoping to get the CSV columns as separate fields in Elasticsearch so we can run proper searches, but right now all of the CSV output ends up in the message field.

I have installed Filebeat with the following config:

filebeat.inputs:
- type: log
  paths:
    - C:\Scripts\Daily*.csv

output.logstash:
  hosts: ["192.168.1.101:5044"]

Logstash has the following config:

cat /etc/logstash/conf.d/02-beats-input.conf

input {
  beats {
    port => 5044
  }
}

filter {
  if [beat.name] == "server15" and [prospector.type] == "log" {
    csv {
      separator => ","
      columns => [ "Received","SenderAddress","RecipientAddress","Subject","Status","ToIP","FromIP","Size","MessageId","MessageTraceId" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://192.168.1.102:9200"]
  }
}

Can someone please give me some tips on how to get the CSV data into separate fields in Elasticsearch?

[beat.name] (which refers to a field literally called beat.name) should be [beat][name] (which refers to the field named name inside the [beat] object). The same applies to [prospector.type], which should be [prospector][type].
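In other words, the conditional would look something like this (just a sketch, reusing the server name and column list from your config above):

filter {
  if [beat][name] == "server15" and [prospector][type] == "log" {
    csv {
      separator => ","
      columns => [ "Received","SenderAddress","RecipientAddress","Subject","Status","ToIP","FromIP","Size","MessageId","MessageTraceId" ]
    }
  }
}

With the corrected field references the conditional matches, the csv filter runs, and each column should be indexed as its own field.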


Awesome, thanks Badger! It's fixed.
