I am receiving Google Apigee logs via syslog on port 5444. The logs are generated by multiple applications, and the formats differ: some are XML and some are JSON.
Can anyone help with how to configure logstash.conf to parse both types of data received over TCP?
If all of the data is in syslog format, you can use the syslog input plugin for Logstash to capture it.
Then, if needed, you can inspect the syslog payload to see whether it is XML or JSON.
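As a minimal sketch (using the port from the original post), the input section could look like this; the syslog input listens for both TCP and UDP by default:

```conf
input {
  syslog {
    port => 5444
  }
}
```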
Try to parse the data with the xml filter first.
If it fails, the xml filter tags the event with "_xmlparsefailure" by default; you can then try the json filter and do error checking on that as well.
However, ideally you would send all XML data to a single pipeline that knows how to handle just XML data and JSON data to a pipeline that can handle just JSON data. It will make troubleshooting much easier.
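As a rough sketch of that approach, separate pipelines are declared in pipelines.yml (the pipeline IDs and config paths below are made up, and each pipeline would need its own input, e.g. its own port):

```yaml
- pipeline.id: xml-logs
  path.config: "/etc/logstash/conf.d/xml.conf"
- pipeline.id: json-logs
  path.config: "/etc/logstash/conf.d/json.conf"
```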
Thanks Andreas,
I am a newbie to ELK. Can you please help with how to inspect the syslog data and filter the XML and JSON data separately?
I won't go writing your Logstash configs for you, if that's what you mean.
However.
You already have the input set up; now you need to add an xml filter in the filter section.
Then send in some XML data and see if it works with a stdout output.
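A stdout output with the rubydebug codec prints each event as it leaves the pipeline, which makes it easy to see whether the xml filter worked:

```conf
output {
  stdout {
    codec => rubydebug
  }
}
```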
Then send in some data that is NOT XML and see what error code you get.
Then create an if statement like this:
if "_xmlparsefailure" in [tags] {
  json {
    source => "message"
  }
}
Meaning: if the XML parse failed, try to parse with the json filter instead.
If that still fails then you need to figure out what you want to do with that data (either drop it or put it somewhere else).
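Putting the steps above together, a sketch of a full logstash.conf might look like the following. This is only an illustration: the target field name `parsed_xml` is made up, and the failure tags are the plugins' defaults:

```conf
input {
  syslog {
    port => 5444
  }
}

filter {
  # Try XML first; on failure the filter adds "_xmlparsefailure" to [tags]
  xml {
    source => "message"
    target => "parsed_xml"
  }

  # Fall back to JSON for events the xml filter could not parse
  if "_xmlparsefailure" in [tags] {
    json {
      source => "message"
    }
  }

  # Drop events that are neither valid XML nor valid JSON
  if "_xmlparsefailure" in [tags] and "_jsonparsefailure" in [tags] {
    drop { }
  }
}

output {
  stdout {
    codec => rubydebug
  }
}
```

Instead of `drop { }`, you could route unparseable events to a dead-letter index or file so you can inspect them later.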
This is a simple use case for Logstash and a really good way to learn how it all works. I highly recommend you try it for yourself.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.