Hi,
I have some firewall logs sent to syslog, and I have Filebeat read the syslog file and send it to Logstash, then on to Elasticsearch.
With this filter (1):

    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}" }
    }

the documents show up in Elasticsearch as they are.
But with this filter (2):

    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}" }
    }

the message part of the document is duplicated (or rather, appended twice into the message field). The stdout shows:

    message[0] => ..........
    message[1] => ........
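(From what I can tell, grok appends to a field that already exists rather than replacing it, so capturing into message a second time turns it into an array. A minimal sketch of what I think should avoid that, using grok's overwrite option:)

    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}" }
      # tell grok to replace the existing message field instead of
      # appending a second value to it
      overwrite => [ "message" ]
    }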
How do I extract or reference the IP addresses individually? %{IPV4} always matches the first one, "192.168.87.252". I'd like to separately get dst: "123.124.125.126", nat_rulenum: "22", s_port: "14346", and src: "212.213.214.215".
Well, that's exactly what the grok/dissect filter plugins are for: you'll have to create patterns that dig out the individual components of your log events and name each of them, ideally following the ECS schema for as many fields as possible.
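For example, assuming your events carry labelled pairs such as src: 212.213.214.215; dst: 123.124.125.126; s_port: 14346; nat_rulenum: 22 (adjust the pattern to your real log format), a rough grok sketch using ECS field names could look like:

    grok {
      match => {
        # capture each labelled value into its own ECS-style field
        "message" => "src: %{IPV4:[source][ip]}; dst: %{IPV4:[destination][ip]}; s_port: %{NUMBER:[source][port]}; nat_rulenum: %{NUMBER:nat_rulenum}"
      }
    }

Naming the captures source.ip, destination.ip and source.port keeps them aligned with ECS; nat_rulenum has no ECS equivalent that I know of, so it stays a custom field.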
Thank you @stefws and @Badger! I came across dissect/kv yesterday but got stuck on the kv delimiters I have to use. @Badger, thanks again, and I will try out the filter you noted.
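For reference, this is the kind of kv config I was experimenting with (the field_split/value_split delimiters are my guesses for this log format):

    kv {
      source => "message"
      field_split => ";"   # pairs separated by semicolons (assumption)
      value_split => ":"   # key and value separated by a colon (assumption)
      trim_key => " "      # strip stray spaces around keys
      trim_value => " "    # and around values
    }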