ELK Stack for Suricata

Hello,

Looking for some help in configuring ELK to work with Suricata-IDS. My setup is detailed below:

elasticsearch-1.7.3.deb
kibana-4.1.2-linux-x64.tar.gz
logstash_1.5.4-1_all.deb
logstash-forwarder_0.4.0_amd64.deb

I have logstash-forwarder configured on a remote system that is running Suricata. Suricata is configured to log events to "/logs/suricata/eve.json". Based on what I'm seeing in "/var/log/logstash-forwarder/logstash-forwarder.err", the connection to my ELK server appears to be working fine:

2015/10/20 12:08:38.303232 Waiting for 1 prospectors to initialise
2015/10/20 12:08:38.303408 Launching harvester on new file: /storage/suricata/eve.json
2015/10/20 12:08:38.303535 harvest: "/storage/suricata/eve.json" (offset snapshot:0)
2015/10/20 12:08:38.304140 All prospectors initialised with 0 states to persist
2015/10/20 12:08:38.304511 Setting trusted CA from file: /etc/pki/tls/certs/logstash-forwarder.crt
2015/10/20 12:08:38.306909 Connecting to [10.1.1.2]:5043 (10.1.1.2)
2015/10/20 12:08:38.718065 Connected to 10.1.1.2
2015/10/20 12:08:48.937568 Registrar: processing 177 events

My logstash-forwarder.conf file is below:

{
  "network": {
    "servers": [ "10.1.1.2:5043" ],
    "ssl certificate": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [ "/logs/suricata/eve.json" ],
      "codec": { "type": "json" }
    }
  ]
}

My logstash.conf from my ELK server:

input {
  lumberjack {
    # The port to listen on
    port => 5043

    # The paths to your SSL cert and key
    ssl_certificate => "/etc/logstash/pki/certs/logstash-forwarder.crt"
    ssl_key => "/etc/logstash/pki/private/logstash-forwarder.key"

    # Parse each incoming event as JSON
    codec => json
  }
}

#geoip part
filter {
  if [src_ip] {
    geoip {
      source => "src_ip"
      target => "geoip"
      database => "/etc/logstash/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
  }
}

output {
  elasticsearch { embedded => true }
}

My issue is that I don't see any data when I hit the Kibana web interface. I would bet that I'm missing something simple, so I'm hoping for a push in the right direction! I guess a good place to start would be verifying that the Suricata "eve.json" file on the remote machine (the machine with logstash-forwarder installed) is making it to the ELK server.

Thanks,
sck

elasticsearch { embedded => true }

Don't run ES in embedded mode. It sounds like you're already running ES as a standalone service, so you need to configure the elasticsearch output accordingly.

Mangus,

Thanks -- just to be clear, you're suggesting I change that to false, correct?

Regards,
sck

Yes, or preferably just drop it, since its default is false and the option will be removed eventually. Then make whatever other configuration changes your ES setup requires. Options you'll want to consider include protocol (HTTP is recommended), host, and maybe cluster (unless you're going with HTTP).
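
For example, with a standalone ES instance running on the same host as Logstash, the output could look something like this (the host, port, and protocol values here are assumptions; adjust them to your setup):

output {
  elasticsearch {
    # Talk to ES over its HTTP REST API instead of joining the cluster
    protocol => "http"
    # Assuming ES runs on the same box as Logstash
    host => "localhost"
    port => 9200
  }
}

With protocol => "http" you don't need the cluster option at all; it only matters for the node and transport protocols.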

Hmm, OK. Guess I'm still a bit confused, as the Suricata-IDS "eve.json" log doesn't even look like it's being sent from my remote machine (running logstash-forwarder) to my ELK stack server.

Any other specific config changes to make on either:

  1. The ELK Stack server (not running suricata)
  2. The remote server running Logstash-forwarder (also running suricata, logging events to "eve.json")

Or any specific troubleshooting tips on either of the above?

Thanks,
sck

The logstash-forwarder log file should be clear about whether anything is being picked up and sent to Logstash. You can also inspect the state file (.logstash-forwarder) to see the current position in each file being monitored. If that offset is being updated, it means LSF is able to send data, which in turn means that Logstash is accepting it.
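
The state file sits in the working directory of the logstash-forwarder process and is plain JSON, so you can just cat it. Its contents look roughly like this (the offset, inode, and device numbers below are made up for illustration):

{
  "/logs/suricata/eve.json": {
    "source": "/logs/suricata/eve.json",
    "offset": 73914,
    "inode": 262147,
    "device": 2049
  }
}

Check it twice a few seconds apart while Suricata is writing events; if "offset" grows between checks, LSF is shipping data.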

On the Logstash side, comment out the elasticsearch output and use a simple stdout { codec => rubydebug } output to simplify things and eliminate error sources.
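
Something like this, replacing your current output section, will dump every event Logstash receives to its own stdout/log:

output {
  stdout { codec => rubydebug }
}

If events show up there, the forwarder-to-Logstash leg is fine and the problem is on the Elasticsearch side; if nothing shows up, focus on the LSF side first.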