I'm trying to ingest a log file using ECS (Elastic Common Schema). To do that, I have to rename the source fields to follow the ECS convention. I made a filter which is not working:
I don't know why it's not working. I'm using the rubydebug output and no data is shown in the console.
When I used that filter without the mutate part (only the json filter), it worked properly, but obviously the ingested fields were not named following ECS.
The source log file is JSON, like this one: {"sourceFile":"dns.csv.gz","EventType":"DNSLog","Timestamp":"2020-03-03 20:41:31","MostGranularIdentity":"TRYINGDNS","Identities":"TRYINGDNS","InternalIp":"10.10.10.10","ExternalIp":"10.10.10.10","Action":"Allowed","QueryType":"1 (C)","ResponseCode":"NOERROR","Domain":"www.google.com","Categories":"Software/Technology,Business Services"}
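For reference, a filter that parses this JSON and renames fields to ECS could look like the sketch below. This is my own guess at the intended pipeline, not the filter from the post, and the ECS target names I picked (e.g. [dns][question][name] for Domain, [source][ip] for InternalIp) are assumptions that should be checked against the ECS field reference:

```
filter {
  # Parse the JSON line in the "message" field into individual event fields
  json {
    source => "message"
  }
  # Rename source fields to their ECS counterparts.
  # Nested ECS fields must be written with bracket notation, [parent][child].
  mutate {
    rename => {
      "Domain"       => "[dns][question][name]"
      "ResponseCode" => "[dns][response_code]"
      "InternalIp"   => "[source][ip]"
      "ExternalIp"   => "[destination][ip]"
    }
    remove_field => [ "message" ]
  }
}
```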
Am I doing it properly? Why is it not working?
I also tried without using "remove_field". That does not work either.
And I just decided which fields correspond to ECS fields... I thought that renaming fields was the way to follow the ECS convention, I mean, mapping the source fields to ECS fields.
If you have a "dns" object that contains a "type" field, then in elasticsearch that would be called dns.type. In logstash it is referred to as [dns][type].
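To illustrate (my own minimal example, not from the thread), renaming a top-level field into that nested "dns" object uses the bracket notation; a dotted string would instead produce a single flat field whose literal name contains a dot:

```
filter {
  mutate {
    # Correct: creates the nested field, shown as dns.type in Elasticsearch
    rename => { "QueryType" => "[dns][type]" }

    # Wrong: would create one flat field literally named "dns.type"
    # rename => { "QueryType" => "dns.type" }
  }
}
```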
Oh! You are right. I tried adding the lines and it worked.
How can I rename the field Timestamp to ECS? What should the ECS field name be? Is Timestamp valid?