Jdbc_streaming filter - unable to map netflow fields?

Hi all, I'm having trouble mapping fields from netflow records onto an SQL statement executed by the jdbc_streaming filter against a MySQL database.

Scenario: logstash is configured to read netflow data from UDP port 2055. This part works - I see the data hit Elasticsearch, no worries. Now I want to enrich the data by looking up a field (edudomain) in a MySQL table, using the src_addr field from the netflow record. Running the statement directly, a successful fetch looks like this:

MariaDB [ipdata]> select edudomain from ipdata where inet_aton('10.159.192.22') between network and network + (4294967295 - mask) limit 1;
+-----------+
| edudomain |
+-----------+
| 10mb-test |
+-----------+
1 row in set (0.00 sec)
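
For completeness, the input side is just the stock UDP input with the netflow codec, roughly like this (port as above):

input {
	udp {
		port => 2055
		codec => netflow
	}
}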

Now, the logstash config:

filter {
	jdbc_streaming {
		jdbc_driver_library => "/usr/share/mysql-connector-java-5.1.44/mysql-connector-java-5.1.44-bin.jar"
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		jdbc_connection_string => "jdbc:mysql://localhost:3306/ipdata"
		jdbc_user => "logreader"
		jdbc_password => "abc123"
		statement => "select edudomain from ipdata where inet_aton(':src_addr') between network and network + (4294967295 - mask) limit 1"
		parameters => { "src_addr" => "src_addr" }
		target => "edudomain"
	}
}

An example netflow record, as written using the file output:

{"netflow":{"dst_as":0,"dst_locality":"private","first_switched":"2017-10-01T10:48:05.999Z","dst_mask_len":0,"src_port_name":"TCP/80 (http)","sampling_algorithm":0,"packets":3,"protocol":6,"ip_version":"IPv4","tcp_flags":26,"next_hop":"10.246.202.209","dst_addr":"10.143.108.130","dst_port_name":"TCP/55065","tos":32,"src_as":0,"direction":"ingress","output_snmp":8,"src_locality":"private","tcp_flags_label":"SYN-PSH-ACK","src_mask_len":0,"src_addr":"10.10.21.128","version":"Netflow v5","flow_seq_num":457764157,"flow_records":2,"src_port":80,"flow_locality":"private","engine_type":0,"engine_id":0,"input_snmp":9,"last_switched":"2017-10-01T10:48:05.999Z","bytes":764,"dst_port":55065,"tcp_flag_tags":["SYN","PSH","ACK"],"sampling_interval":0,"protocol_name":"TCP"},"@timestamp":"2017-10-01T10:48:20.000Z","@version":"1","host":"10.244.1.249","geoip_dst":{"autonomous_system":"PRIVATE"},"edudomain":[{}],"geoip_src":{"autonomous_system":"PRIVATE"},"type":"netflow","tags":["_jdbcstreamingdefaultsused"]}

However, if I run tcpdump to see what mysql-connector-java is actually sending to the database, here's what it looks like:

select edudomain from ipdata where inet_aton('NULL') between network and network + (4294967295 - mask) limit 1

No matter what I do, I can't seem to get fields from the netflow records mapped into the SQL statement; there is always NULL in place of the data I expect. The reference documentation for the parameters and statement options of the jdbc_streaming plugin is pretty sparse. I also tried "netflow.src_addr" in the parameters option, to no avail.

Any clues what I might be doing wrong?

Sorry, one more question. My use case seems very similar to what the cidr filter does. The IP data I'm looking up doesn't change often, so loading it once (at run-time) would be acceptable. I can see that plugin has a network_path configuration option, which seems to let it load its data from an external file, but it's not clear whether that can be used to load multiple network definitions - I have around 6,100 all up. Is that plugin a good fit, or should I persist with trying to get this working via the jdbc_streaming filter?
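
For reference, here's roughly what I had in mind with the cidr filter - the file path and tag below are just placeholders of mine, and as far as I can tell it's geared towards tagging matching events rather than attaching a per-network value like edudomain:

filter {
	cidr {
		address      => [ "%{[netflow][src_addr]}" ]
		network_path => "/etc/logstash/networks.txt"
		add_tag      => [ "matched_network" ]
	}
}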

An update. It turns out there were two problems. First, the parameters line was incorrect: src_addr is nested under netflow, so it needs the full field reference. In other words,

parameters => { "src_addr" => "src_addr" }

should actually read

parameters => { "src_addr" => "[netflow][src_addr]" }

Second, the jdbc_streaming plugin quotes string parameters itself when substituting them into the statement, so the quotes around :src_addr in my config had to be removed.

After doing that, the statement being sent to MySQL is now valid.
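
For anyone who lands here later, the working filter block with both changes applied looks like this (connection details unchanged):

filter {
	jdbc_streaming {
		jdbc_driver_library => "/usr/share/mysql-connector-java-5.1.44/mysql-connector-java-5.1.44-bin.jar"
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		jdbc_connection_string => "jdbc:mysql://localhost:3306/ipdata"
		jdbc_user => "logreader"
		jdbc_password => "abc123"
		# no quotes around :src_addr - the plugin quotes string parameters itself
		statement => "select edudomain from ipdata where inet_aton(:src_addr) between network and network + (4294967295 - mask) limit 1"
		# full field reference, since src_addr is nested under the netflow object
		parameters => { "src_addr" => "[netflow][src_addr]" }
		target => "edudomain"
	}
}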

Documentation updated & pull request proposed on GitHub.


PR is here https://github.com/logstash-plugins/logstash-filter-jdbc_streaming/pull/8 :slight_smile:
