Hi all,
I'm having some issues using the influxdb output plugin for Logstash. Here is the config I am using:
input {
  file {
    path => "/var/log/flowlog/flowlog.log"
    type => "conntrack-flowlog"
    start_position => "beginning"
  }
}
filter {
  if [type] == "conntrack-flowlog" {
    grok {
      patterns_dir => ["./patterns"]
      match => {
        "message" => "%{TIMESTAMP_ISO8601:timestamp} %{DATA:logger_program} %{DATA:logger_prefix} %{DATA} %{DATA} SRC=%{IPV4:src_ip} DST=%{IPV4:dst_ip} PROTO=%{WORD:proto} SPT=%{INT:src_port} DPT=%{INT:dest_port} PKTS=%{INT:spkts} BYTES=%{INT:sent_bytes} %{MATCH_TO_SRC:cruft}%{MATCH_TO_BYTE:more_cruft}BYTES=%{INT:recv_bytes}"
      }
    }
  }
}
output {
  influxdb {
    host => "xxx"
    db => "flowlog"
    measurement => "tcp_flows"
    send_as_tags => ["src_ip", "dst_port"]
    data_points => {'src_ip' => [src_ip],'dst_ip' => [dst_ip],'ip_proto' => [proto],'src_port' => [src_port],'dst_port' => [dst_port],'sent_bytes' => [sent_bytes],'recv_bytes' => [recv_bytes] }
    data_points => {}
    flush_size => 10
  }
}
output {
  stdout { codec => rubydebug }
}
Everything works fine when I use only the stdout output. However, there seems to be some issue with the data_points setting; the config test tells me:
logstash -f . -v --configtest
Error: Expected one of #, } at line 26, column 48 (byte 815) after output {
influxdb {
host => "xxx"
db => "flowlog"
measurement => "tcp_flows"
send_as_tags => ["src_ip", "dst_port"]
data_points => {'src_ip' => [src_ip]
Any idea what is going wrong here? The documentation seems to indicate that I should pass a hash, but it is not clear exactly what hash to pass.
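My best guess from the docs is that the hash values have to be quoted sprintf-style strings rather than bare field references, something like the following (untested, field names from my grok pattern above):

data_points => {
  "src_ip"     => "%{src_ip}"
  "dst_ip"     => "%{dst_ip}"
  "ip_proto"   => "%{proto}"
  "src_port"   => "%{src_port}"
  "dst_port"   => "%{dst_port}"
  "sent_bytes" => "%{sent_bytes}"
  "recv_bytes" => "%{recv_bytes}"
}

Is that what the plugin expects, or am I off track?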