I have set up a Fleet Server in Kibana, and my Meraki firewall is sending VPN logs to Logstash over syslog. I've configured a Logstash filter to process the logs and write the filtered data to a custom path.
Additionally, I have installed the Elastic Agent on the same server where Logstash is running. The Elastic Agent picks up the filtered data from the custom path and sends it to the Fleet Server, and the data appears in the Discover tab in Kibana.
logstash.conf
input {
  udp {
    port => 514
    type => "meraki_flow"
    codec => plain {
      charset => "UTF-8"
    }
  }
}

filter {
  # Extract the user and IPs from the Meraki client VPN connect event
  grok {
    match => {
      "message" => "client_vpn_connect_v2 user id '%{DATA:user_name}' local ip %{IP:destination_ip} connected from %{IP:source_ip}"
    }
  }

  # Move the extracted fields to their ECS field names
  mutate {
    rename => {
      "user_name"      => "[user][name]"
      "source_ip"      => "[source][ip]"
      "destination_ip" => "[destination][ip]"
    }
  }

  # Preserve original message
  if [event][original] {
    mutate {
      add_field => { "original_message" => "%{[event][original]}" }
    }
  }
}

output {
  file {
    path => "/var/log/logstash/flow.log"
  }
}
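To rule out the grok pattern itself, I sketched its equivalent as a plain regex in Python (the sample line, user, and IPs below are made up for testing; grok's %{DATA} and %{IP} are approximated):

```python
import re

# Regex approximation of the grok pattern from the filter above:
# %{DATA} -> non-greedy .*?, %{IP} -> simplified IPv4 pattern.
pattern = re.compile(
    r"client_vpn_connect_v2 user id '(?P<user_name>.*?)' "
    r"local ip (?P<destination_ip>\d{1,3}(?:\.\d{1,3}){3}) "
    r"connected from (?P<source_ip>\d{1,3}(?:\.\d{1,3}){3})"
)

# Hypothetical Meraki syslog line in the same shape as my real events.
sample = ("<134>1 1741262789.826878914 o3_Meraki_MX_95 events "
          "client_vpn_connect_v2 user id 'huzaifakhan@o3.com' "
          "local ip 10.0.0.50 connected from 198.51.100.7")

m = pattern.search(sample)
print(m.groupdict())
```

The pattern does extract all three fields from a line like this, so the parsing itself seems fine.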
After applying the filter, the document looks like this in Kibana:
client_vpn_connect_v2 user id 'huzaifakhan@o3.com' local ip 108.853.0.50 connected from 155.55.92.852","host":{"ip":"108.853.0.55"},"user":{"name":"huzaifakhan@o3.com"},"destination":{"ip":"108.853.0.50"},"message":"<134>1 1741262789.826878914 o3_Meraki_MX_95 events client_vpn_connect_v2 user id 'huzaifakhan@o3.com' local ip 108.853.0.50 connected from 155.55.92.852","event":{"original":"<134>1 1741262789.826878914 Folio3_Meraki_MX_95 events client_vpn_connect_v2 user id 'huzaifakhan@folio3.com' local ip 108.853.0.50 connected from 155.55.92.852"}}
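If I understand the symptom, the file output (whose default codec, as far as I know, is json_lines) writes the entire event as one JSON line, and the Elastic Agent then reads that line back verbatim as a new message. A minimal Python sketch of that round trip (all field values are hypothetical):

```python
import json

# Event roughly as Logstash would hold it after the filter stage (made-up values).
event = {
    "message": "client_vpn_connect_v2 user id 'user@example.com' "
               "local ip 10.0.0.50 connected from 198.51.100.7",
    "user": {"name": "user@example.com"},
    "destination": {"ip": "10.0.0.50"},
    "source": {"ip": "198.51.100.7"},
}

# The file output serializes the whole event to one JSON line on disk...
line_on_disk = json.dumps(event)

# ...and the Elastic Agent log input reads that line back as plain text,
# so the entire JSON (renamed fields included) becomes the new "message".
reingested = {"message": line_on_disk}
```

This would explain why user.name and destination.ip show up embedded inside message instead of as separate fields.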
I want the local IP to end up in the destination.ip field in Kibana, the source IP in source.ip, and user.name to be huzaifakhan@o3.com, all as proper top-level Kibana fields.
I have attached a screenshot of Kibana with the sensitive info hidden.
I have followed the steps above but I'm unable to get the message fields parsed correctly; instead, the user.name and destination.ip fields are being embedded inside the new message.
Can anyone guide me on how to resolve this issue?