Hi,
I want to ship data from Logstash A (client) to Logstash B (server).
Logstash A will need to decode netflow traffic, add and remove some fields, and then send the events to Logstash B for further processing before the data is finally sent to Elasticsearch.
I've got a lumberjack output and input talking to each other and documents are arriving in Elasticsearch, but it doesn't look like the lumberjack output is forwarding any of the decoded event data.
Received data:
{
  "_index": "test2017.10.13",
  "_type": "logs",
  "_id": "AV8T7SDgV08ceArt086i",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2017-10-13T04:10:28.433Z",
    "input_name": "input_location_a",
    "@version": "1",
    "message": "2017-10-13T04:08:25.000Z 10.0.0.246 %{message}"
  },
  "fields": {
    "@timestamp": [
      1507867828433
    ]
  },
  "sort": [
    1507867828433
  ]
}
This doesn't contain any of the netflow data; the only trace of the original event is the unresolved %{message} placeholder in the message field.
Client config
input {
  udp {
    port => 9995
    codec => netflow {
      versions => [9]
    }
  }
}

filter {
  mutate {
    add_field => { "output_location_name" => "output_location_a" }
  }
}

output {
  lumberjack {
    id => "fortinetflowtest"
    hosts => ["ipofserver"]
    port => 9998
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
  }
}
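To rule out the netflow decode itself, I'm planning to temporarily add a stdout output with the rubydebug codec on Logstash A; that should print the fully decoded events so I can confirm the netflow fields exist before they are shipped over lumberjack:

output {
  # temporary, for debugging only: runs alongside the existing lumberjack
  # output and prints each decoded netflow event to the Logstash log
  stdout {
    codec => rubydebug
  }
}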
Server config
input {
  lumberjack {
    id => "fortinetflowtest"
    port => 9998
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  mutate {
    add_field => { "input_name" => "input_location_a" }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "test%{+YYYY.MM.dd}"
  }
}
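My current guess is that the lumberjack output's default plain codec only encodes a timestamp/host/message line (which would explain the literal %{message}, since netflow events don't have a message field), so I'm thinking of switching both ends to the json codec. Untested sketch:

Client output:
output {
  lumberjack {
    id => "fortinetflowtest"
    hosts => ["ipofserver"]
    port => 9998
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    # guess: ship the whole event as JSON instead of the default plain line
    codec => json
  }
}

Server input:
input {
  lumberjack {
    id => "fortinetflowtest"
    port => 9998
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    # guess: decode the JSON payload back into individual event fields
    codec => json
  }
}

Does that sound right, or is there a better way to ship the complete event between two Logstash instances?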