CSV -> Filebeat -> Logstash not parsing?

I have configured a server with the ELK stack and Filebeat.
In a folder I have multiple CSV files. I have managed to get Filebeat to read them using:

  paths:
     - /var/capture/dpi-tcp/*.csv
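
For anyone reproducing this, that paths setting sits inside a log prospector in filebeat.yml; a rough sketch of the surrounding block (Filebeat 5.x syntax, only the paths line is from my actual config):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/capture/dpi-tcp/*.csv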

Now for Logstash, I have 4 configuration files in conf.d, as follows:

02-beats-input.conf

input {
  beats {
    port => 5044
  }
}

06-tcp-filter.conf (THIS IS SUPPOSED TO PARSE THE CSV)

filter {
  if [path] =~ "tcp" {
    csv {
      columns => [
        "logdate",
        "ppoeip",
        "userip",
        "protocol",
        "logserverip",
        "destinationip",
        "remove1",
        "remove2",
        "sourceport",
        "destinationport",
        "description-url"
      ]
      separator => ";"
      remove_field => ["remove1", "remove2"]
    }
  }
}
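
One thing I'm not sure about is the [path] condition: as far as I know, Logstash's own file input sets a path field, but events shipped by Filebeat carry the file path in the source field (Filebeat 5.x and earlier; newer versions use log.file.path), so the conditional may need to look roughly like this instead:

filter {
  if [source] =~ "tcp" {
    csv {
      # same columns / separator / remove_field settings as above
    }
  }
}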

10-syslog-filter.conf

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

30-elasticsearch-output.conf

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

The Problem
My issue is that the data is being sent and I can see it in Kibana. However,
the CSV content is not being parsed; instead, each record shows up as raw text inside the "message" field of the Filebeat event.

I'm not sure what I'm doing wrong.
Any advice would be helpful.
Thanks

NVM.

Stupidly enough, I had Filebeat outputting directly to Elasticsearch instead of to Logstash.
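
For anyone else hitting this: in filebeat.yml the elasticsearch output has to be disabled and the logstash output enabled, so the events actually pass through the Logstash filters. Roughly like this (Filebeat 5.x syntax, localhost assumed):

# comment out the direct Elasticsearch output
#output.elasticsearch:
#  hosts: ["localhost:9200"]

# ship to the Logstash beats input on port 5044 instead
output.logstash:
  hosts: ["localhost:5044"]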

Thanks
