SNMP/SNMPTRAP input with filter and output to CSV


I want to be able to grab the output of any SNMP/SNMPTRAP and structure it so it can be exported to a CSV file. I will then take the file and import it to another box running ElasticSearch and Kibana.

Can you please point me in the direction of a grok filter and output section that will help me understand how to do this?

From what I can see, not many people on this forum are using the ELK stack to parse and manipulate SNMP data.

The output lines are just for testing; I know I need to use Logstash's CSV output plugin.


My *.conf file:


input {
  snmptrap {
    host => ""
    community => "public"
    port => "162"
  }
}

input {
  snmp {
    get => [".", "."]
    hosts => [{host => "udp:" community => "public" version => "2c" retries => 2 timeout => 1000}]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mikrotik"
  }
  stdout { codec => rubydebug }
}


If I use that snmp input I get

  "" => 202330,
  "" => "error: no such instance currently exists at this OID",

Personally, I would adjust that using 'oid_root_skip => 6' to get the easier-to-read

  "system.sysUpTime.sysUpTimeInstance" => 198404,
  "interfaces.ifTable.ifEntry.ifDescr.0" => "error: no such instance currently exists at this OID",
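For reference, `oid_root_skip` goes in the snmp input block alongside the other options. A minimal sketch (the host address and the OID here are placeholders; substitute your own device and the OIDs you actually poll):

```
input {
  snmp {
    # OID is illustrative: sysUpTime from the standard MIB-2 tree
    get => [".1.3.6.1.2.1.1.3.0"]
    hosts => [{host => "udp:127.0.0.1/161" community => "public" version => "2c"}]
    # Drop the first 6 components of the OID so field names start at "system", "interfaces", etc.
    oid_root_skip => 6
  }
}
```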

Unfortunately this is a terrible match for a csv filter. I would suggest something like

output { file { path => "/some/path/snmp.txt" codec => json_lines } }

Not sure what you are looking to do with grok. Are there specific fields you need to parse?

That is a simple example config; I have not finished entering all the OID info for the other devices yet. I will have numerous devices and traps. What I really need is a filter to organize the data and export it to CSV. For example: use a grok filter to grab hostname, timestamp, interfaces, uptime, and memory, and produce an organized CSV file.
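A minimal sketch of what that could look like using the csv output plugin rather than grok. The `uptime` field name is an assumption here; the source field name comes from the `oid_root_skip => 6` output shown earlier, and you would list whichever fields your events actually contain:

```
filter {
  # Rename the verbose OID-derived field to something CSV-friendly
  # ("uptime" is an illustrative name, not from the original config)
  mutate {
    rename => { "system.sysUpTime.sysUpTimeInstance" => "uptime" }
  }
}

output {
  csv {
    path => "/some/path/snmp.csv"
    # One column per listed field, in this order
    fields => ["host", "@timestamp", "uptime"]
  }
}
```

Events missing a listed field will simply produce an empty column, so it is worth normalizing field names in the filter section before they reach the csv output.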

