Top-level configuration not working

Dear Logstash community,

I have challenged myself to capture my network data using TShark and make custom dashboards in Kibana.
I have the following setup:

  • 1 Ubuntu 18.04 VM with Elasticsearch, Logstash, and Kibana dockerized
  • 1 Ubuntu 18.04 VM with TShark and Filebeat running on the host

I run TShark with the -T ek flag and export the capture to a rolling CSV file.
Filebeat then ships the CSV file to Logstash, which forwards the events to Elasticsearch, where they can be queried from Kibana.

My problem is: Kibana shows the Logstash logs as its main input, and each event contains one whole CSV capture line in a single field. How do I make the CSV columns the top-level fields of each event?

This is my logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  csv {
    source => "message"
    columns => [ "col.Time", "col.Source", "col.Destination", "ip.src", "ip.dst", "tcp.srcport", "tcp.dstport", "col.Protocol", "ip.len", "col.Info" ]
  }

  mutate {
    convert => [ "ip.len", "integer" ]
  }

  date {
    match => [ "col.Time", "yyyy-MM-dd HH:mm:ss.SSSSSSSSS" ]
    target => "@timestamp"
  }
}
  
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
    user => "elastic"
    password => "changeme"
  }
}
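
For reference, the filter chain above does roughly the following to each event. Here is a Python sketch of the same steps (the sample capture line and its field values are invented for illustration):

```python
import csv
import io
from datetime import datetime

# The whole CSV capture line arrives from Filebeat in the "message" field.
event = {"message": "2020-01-28 10:15:30.123456789,10.0.0.1,10.0.0.2,"
                    "10.0.0.1,10.0.0.2,443,52100,TCP,60,ACK"}

# csv filter: split "message" into named top-level columns.
columns = ["col.Time", "col.Source", "col.Destination", "ip.src", "ip.dst",
           "tcp.srcport", "tcp.dstport", "col.Protocol", "ip.len", "col.Info"]
values = next(csv.reader(io.StringIO(event["message"])))
event.update(dict(zip(columns, values)))

# mutate filter: convert ip.len from a string to an integer.
event["ip.len"] = int(event["ip.len"])

# date filter: parse col.Time into the event timestamp.
# (Python's strptime only handles microseconds, so truncate to 6 fractional digits.)
event["@timestamp"] = datetime.strptime(event["col.Time"][:26],
                                        "%Y-%m-%d %H:%M:%S.%f")
```

After these steps the capture fields sit at the top level of the event instead of inside a single "message" string.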

This is my filebeat.yml:

filebeat.modules:
- module: system
  syslog:
    enabled: false
  auth:
    enabled: true
    var.paths: ["/home/user/Documents/tsharkcap/tshark.csv"]
output.logstash:
  hosts: ["192.168.234.134:5044"]

I run Elasticsearch with the default config:

sudo docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.5.2

I run Kibana with default config:

sudo docker run --link docker-cont:elasticsearch -p 5601:5601 docker.elastic.co/kibana/kibana:7.5.2

Thanks in advance,
ELK4Life

You need to enable a multiline configuration in your filebeat.yml.

Example:
multiline.pattern: ',\d+,[^",]+$'
multiline.negate: true
multiline.match: before

See this link for multiline configuration => https://www.elastic.co/guide/en/beats/filebeat/current/multiline-examples.html

You can also check your pattern here =>
https://play.golang.org/p/uAd5XHxscu
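
To see what that pattern does, here is a quick Python sketch (the sample lines are invented; a real capture line would have more fields):

```python
import re

# multiline.pattern from the example above: a record-ending line, i.e.
# "...,<number>,<info without quotes or commas>" at the end of the line.
pattern = re.compile(r',\d+,[^",]+$')

complete_line = "10.0.0.1,10.0.0.2,443,52100,TCP,60,ACK"
continuation = "10.0.0.1,10.0.0.2,443,52100,TCP,60,"

print(bool(pattern.search(complete_line)))  # True  – line ends a record
print(bool(pattern.search(continuation)))   # False – record continues
```

With negate: true and match: before, Filebeat appends every non-matching line to the following matching line, so each record reaches Logstash as one event.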

Solved!

I was using the wrong command for tshark and was therefore outputting JSON data instead of CSV data.
@inhinyera16 correctly identified this mistake.
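
For reference, a TShark invocation along these lines produces CSV output matching the columns in logstash.conf. This is a sketch only: the interface name and the exact `_ws.col.*` field names are assumptions and depend on your setup and TShark version.

```shell
tshark -i eth0 -t ad \
  -T fields -E separator=, -E quote=d \
  -e _ws.col.Time -e _ws.col.Source -e _ws.col.Destination \
  -e ip.src -e ip.dst -e tcp.srcport -e tcp.dstport \
  -e _ws.col.Protocol -e ip.len -e _ws.col.Info \
  > /home/user/Documents/tsharkcap/tshark.csv
```

The key point is -T fields (one CSV row per packet) rather than -T ek, which emits Elasticsearch-style JSON.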

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.