Hello. I've seen some Logstash pipelines that parse Palo Alto logs using the CSV plugin. I'm curious whether anyone has taken this a step further and aligned the fields with the Elastic Common Schema. I am working on doing this, and some of the field mappings seem nebulous.
I currently have some of the log types mapped, with some additional fields not defined in the ECS framework. I am hoping to find someone else on this path with whom I can compare notes.
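For reference, a minimal sketch of the kind of pipeline I mean (the column order and ECS renames are assumptions based on a few PAN-OS TRAFFIC log fields; the actual column list depends on your PAN-OS version):

```
filter {
  # Assumes the raw PAN-OS log body is already in [message];
  # the column list below is abbreviated and illustrative only.
  csv {
    source  => "message"
    columns => [ "future_use", "receive_time", "serial_number", "type",
                 "threat_content_type", "future_use2", "generated_time",
                 "src_ip", "dst_ip", "nat_src_ip", "nat_dst_ip" ]
  }
  if [type] == "TRAFFIC" {
    mutate {
      # Example alignment of a few fields with ECS names
      rename => {
        "src_ip"        => "[source][ip]"
        "dst_ip"        => "[destination][ip]"
        "serial_number" => "[observer][serial_number]"
      }
      add_field => { "[event][category]" => "network" }
    }
  }
}
```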
Andrew,
Do you think Discuss is a better place to have a discussion on the topic, or should I talk about it on the GitHub issue?
I'd also be interested in just talking about mappings from log field to ECS field, as I'm probably not going to send to a central syslog server with Filebeat installed. Instead, I'll send from the Palo Alto using its syslog forwarder to a load balancer and straight into Logstash, where I map the fields myself (see the input sketch below).
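Something like this on the input side, to give an idea of the setup (the port is an assumption for illustration):

```
input {
  # Receive PAN-OS syslog forwarded via the load balancer directly into Logstash.
  syslog {
    port => 5514
    type => "paloalto"
  }
}
```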
I'm good with using that issue to generally discuss the mappings. I haven't really taken a close look at the fields yet, but Mike has. I'll ask him to make those spreadsheets publicly readable so you can see the mappings he started.