How to filter multiple content types in Logstash

(Jagadeesh Koka) #1

Hi,

I need to design a system that reads several different types of payloads plus a JSON log file.

I am able to read both of them and create indexes in ES (Filebeat --> Logstash --> ES --> Kibana) individually.

But now I have two problems:

  1. The payloads are in different formats: some are JSON, some are XML. They don't have file extensions, so we can identify them only by inspecting the content. Can any filter help me identify the content type, so that Kibana can give me a better view?
  2. I have another Filebeat that reads a JSON log file. Each log entry contains the filepath of the payload for that transaction on local disk, which I am currently reading with a different Filebeat into a separate ES index. Can I merge these two Filebeat streams and create a single index where I can see the payload along with the transaction log?

Can these two challenges be solved? If so, please direct me to some documentation.

Thanks
J

#2

For the first question, you could test what the message looks like before parsing it. For example if it starts with < then it could be XML...

if [message] =~ /^\s*</ {

For JSON that would be

if [message] =~ /^\s*{/ {
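Putting the two checks together, a filter block might look like this. This is just a sketch: it assumes the payload arrives in the default message field, and the content_type field name is something I made up for Kibana filtering.

```
filter {
  if [message] =~ /^\s*</ {
    # Looks like XML; parse it and record the detected type
    xml {
      source => "message"
      target => "parsed"
    }
    mutate { add_field => { "content_type" => "xml" } }
  } else if [message] =~ /^\s*{/ {
    # Looks like JSON
    json {
      source => "message"
      target => "parsed"
    }
    mutate { add_field => { "content_type" => "json" } }
  }
}
```

With a content_type field on every event you can build Kibana filters or separate visualizations per payload format.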

However, you could just go ahead and run every message through both a json filter and an xml filter. You will get a _jsonparsefailure or _xmlparsefailure tag on every message (since a message that parses in one format is certain to fail in the other), but those tags can be removed afterwards.
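That try-both approach could look roughly like this (again a sketch; the target field names are placeholders):

```
filter {
  # One of these will succeed, the other will add a failure tag
  json { source => "message" target => "parsed_json" }
  xml  { source => "message" target => "parsed_xml" }

  # Drop the expected failure tag from whichever filter did not match
  mutate { remove_tag => [ "_jsonparsefailure", "_xmlparsefailure" ] }
}
```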

For the second question ... you could use a ruby filter to read the file at that path and attach its contents to the event. This assumes the file is not very large.

ruby { code => 'event.set("payload", File.read(event.get("filepath")))' }
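If the file might be missing or unreadable when the event goes through the pipeline, a slightly more defensive version of that ruby filter would be something like this (the filepath and payload field names are assumptions from your description):

```
ruby {
  code => '
    path = event.get("filepath")
    if path && File.readable?(path)
      event.set("payload", File.read(path))
    else
      # Tag the event so unreadable payloads are easy to find in Kibana
      event.tag("_payload_read_failure")
    end
  '
}
```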

(system) closed #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.