Logstash influxdb issues

Hi all
I'm having some issues with using the influxdb plugin for logstash. Here is the config that I am using:

input {
  file {
    path => "/var/log/flowlog/flowlog.log"
    type => "conntrack-flowlog"
    start_position => "beginning"
  }
}

filter {
  if [type] == "conntrack-flowlog" {
    grok {
      patterns_dir => ["./patterns"]
      match => {
        "message" => "%{TIMESTAMP_ISO8601:timestamp} %{DATA:logger_program} %{DATA:logger_prefix} %{DATA} %{DATA} SRC=%{IPV4:src_ip} DST=%{IPV4:dst_ip} PROTO=%{WORD:proto} SPT=%{INT:src_port} DPT=%{INT:dest_port} PKTS=%{INT:spkts} BYTES=%{INT:sent_bytes} %{MATCH_TO_SRC:cruft}%{MATCH_TO_BYTE:more_cruft}BYTES=%{INT:recv_bytes}"
      }
    }
  }
}

output {
  influxdb {
    host => "xxx"
    db => "flowlog"
    measurement => "tcp_flows"
    send_as_tags => ["src_ip", "dst_port"]
    data_points => {'src_ip' => [src_ip],'dst_ip' => [dst_ip],'ip_proto' => [proto],'src_port' => [src_port],'dst_port' => [dst_port],'sent_bytes' => [sent_bytes],'recv_bytes' => [recv_bytes] }

    data_points => {}

    flush_size => 10
  }
}

output {
  stdout { codec => rubydebug }
}

==
Everything works fine when I use the stdout output. However, there seems to be some issue with the data_points setting; the config test tells me:

logstash -f . -v --configtest
Error: Expected one of #, } at line 26, column 48 (byte 815) after output {
influxdb {
host => "xxx"
db => "flowlog"
measurement => "tcp_flows"
send_as_tags => ["src_ip", "dst_port"]
data_points => {'src_ip' => [src_ip]

Any idea as to what is going wrong here? The documentation seems to indicate that I should pass a hash but it is not clear what hash to pass exactly.

I've got the same problem. I solved it by using a config style like this:

      influxdb {
        db => "toast"
        host => "localhost"
        measurement => "myseries"
        allow_time_override => true
        use_event_fields_for_data_points => true
        exclude_fields => ["@version", "@timestamp", "sequence", "message", "type", "host"]
        send_as_tags => ["bar", "feild", "test1", "test"]
      }

Although I still have a problem: my fields from Redis all arrive as strings, not integers or floats!

Hi Andreas
I solved the original problem by using a style for data_points like:

==
{'src_ip' => "%{src_ip}" 'dst_ip' => "%{dst_ip}" 'ip_proto' => "%{proto}" 'src_port' => "%{src_port}" 'dest_port' => "%{dest_port}" 'sent_bytes' => "%{sent_bytes}" 'recv_bytes' => "%{recv_bytes}" }

Why the documentation is so completely wrong is beyond me. Furthermore, I had to set the retention policy to autogen (not default), since there is no policy called default in recent InfluxDB releases.
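For reference, if I read the plugin options right, the retention policy can be set directly on the output rather than changed server-side; a minimal sketch combining it with the quoted data_points style above (host/db are placeholders from my earlier post, and pairs are whitespace-separated, not comma-separated):

```
influxdb {
  host => "xxx"
  db => "flowlog"
  measurement => "tcp_flows"
  retention_policy => "autogen"
  data_points => {
    'src_ip' => "%{src_ip}"
    'dst_ip' => "%{dst_ip}"
  }
}
```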

What should have taken half an hour ended up taking far, far longer.

@Logstash folks: Want me to file a doc PR?

Hi,

Thanks for posting. I get an error when using this style, but I will try it again tomorrow!

I also had to set the retention policy to autogen ...

So ... maybe we could exchange some knowledge about these issues, because it seems not many people are using this type of Logstash connection?

As an input I use Redis, and what happens is completely confusing, because I have this Redis list:

e.g
foo=10207 bar=1 sensor2=1 sensor3=33.3 time=1489686662
foo=10207 bar=1 sensor2=1 sensor3=33.3 time=1489686662
foo=10207 bar=1 sensor2=1 sensor3=33.3 time=1489686662
foo=10207 bar=1 sensor2=1 sensor3=33.3 time=1489686662

but every field is created as a string inside InfluxDB, and there is no way to predefine a schema as in Elasticsearch or similar ...
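The reason the values arrive as strings is that the Redis line is just text, so every kv-parsed field stays a string until something converts it. A rough Python sketch of what the kv filter plus a mutate/convert step do (illustrative only, not Logstash code; function and field names are from the sample lines above):

```python
# Sketch: parse a "key=value key=value" record and coerce selected
# fields to numeric types, the way kv {} + mutate { convert => ... } would.
def parse_kv(line, conversions):
    """Split a 'k=v k=v' line into a dict, applying type conversions."""
    event = dict(pair.split("=", 1) for pair in line.split())
    for field, cast in conversions.items():
        if field in event:
            event[field] = cast(event[field])
    return event

record = "foo=10207 bar=1 sensor2=1 sensor3=33.3 time=1489686662"
event = parse_kv(record, {"foo": int, "sensor2": float, "sensor3": float})
print(event)
# {'foo': 10207, 'bar': '1', 'sensor2': 1.0, 'sensor3': 33.3, 'time': '1489686662'}
```

Note that `bar` and `time` stay strings because they were not listed in the conversion map, which is exactly what happens to any field you forget in the mutate block.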

Hey, when I try to use this form:

output {

  stdout { codec => rubydebug }

  influxdb {
    db => "toast"
    host => "localhost"
    measurement => "myseries"
    allow_time_override => true
    exclude_fields => ["@version", "@timestamp", "sequence", "message", "type", "host"]
    send_as_tags => ["bar", "feild", "test1", "test"]
    data_points => {'foo' => "%{foo}",'sensor2' => "%{sensor2_ip}",'sensor3' => "%{sensor3}"}
  }
}

I get an error. How did you solve this problem? Could you post your Logstash config file, please? That would really help a lot!

JAVA_OPTS was set to [ -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=1 -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath="C:\tools\logstash-5.2.2/heapdump.hprof"]. Logstash will trust these options, and not set any defaults that it might usually set
Could not find log4j2 configuration at path /tools/logstash-5.2.2/config/log4j2.properties. Using default config which logs to console
10:31:41.517 [LogStash::Runner] ERROR logstash.agent - Cannot load an invalid configuration {:reason=>"Expected one of #, {, } at line 37, column 37 (byte 731) after output {\n\n\tstdout { codec => rubydebug }\n\n influxdb {\n\t\t db => "toast"\n host => "localhost"\n measurement => "myseries"\n allow_time_override => true\n \n exclude_fields => ["@version", "@timestamp", "sequence", "message", "type", "host"]\n\t\t\tsend_as_tags => ["bar", "feild", "test1", "test"]\n\t\t\tdata_points => {'foo' => "%{foo}""}

Got it: I removed the "," between the data points!!

It is rather idiotic. The documentation says that you need commas between the fields, but of course it ain't so. And if you don't quote the data_points values as "%{foo}", it also barfs. Oh, joy!!
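To summarise the two gotchas in one place, here is the shape that failed for us versus the shape that parsed (field names are just the ones from this thread):

```
# Failed to parse: commas between the pairs
data_points => {'foo' => "%{foo}",'sensor2' => "%{sensor2}"}

# Parsed: pairs separated by whitespace only, values quoted as "%{field}"
data_points => {'foo' => "%{foo}" 'sensor2' => "%{sensor2}"}
```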

You are so right. I tried for hours to get the syntax right ...

Here is my config file for transferring Redis to InfluxDB via Logstash. It works, as far as I can tell; I get other errors now, but one step after the other! Thanks a lot, mate!

input {
  redis {
    host => "localhost"
    data_type => "list"
    key => "vortex"
    threads => 4
    type => "testrecord"
    codec => "plain"
  }
}

filter {
  kv {}
  mutate {
    convert => {
      "foo"     => "integer"
      "sensor2" => "float"
      "sensor3" => "float"
    }
  }
}

output {

  stdout { codec => rubydebug }

  influxdb {
    db => "gronk"
    host => "localhost"
    measurement => "wonk"
    allow_time_override => true
    use_event_fields_for_data_points => true
    exclude_fields => ["@version", "@timestamp", "sequence", "message", "type", "host"]
    send_as_tags => ["bar", "feild", "test1", "test"]
  }
}
