Logstash Filter for Parsing Mule Log


(Dhanashree Zope) #1

Hi,
I wish to load data from a mule log file to elasticsearch via logstash.
The following is my log data pattern:

2018-04-26 14:35:51,016 [[cif].HTTP_Listener_Configuration.worker.01] INFO org.mule.api.processor.LoggerMessageProcessor - Process Initialised Corelation Id :ee7a310a-a45a-45cd-9159-caba7e6b97e1 , Flowname :XYZ,Process:PurchaseInfo,Source:S2,Target:T2,Status:SUCCESS End of Process

I wish to get only the values of Flowname, Process, Source, Target, and Status as separate fields after parsing the data. Which filter can be used in the Logstash config file to achieve this, and how can it be done?


(Guy Boertje) #2

It's hard to help — you only gave one line of data. There are some outdated links if you search the internet, but some of the info is still relevant.
For instance, there is a JSON logging module for Mule. If you can switch to that, the LS config is much easier and LS performance is much better.
If you can't use the JSON logging module, then we will need to see a few more samples of the Mule logs (with sensitive info scrubbed).
Depending on how regular the log format is, you can use the Dissect filter (for more regular formats) or the Grok filter (which works for less regular ones).
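Based on the single sample line above, a grok approach might look like the sketch below. The field names (`flowname`, `process`, etc.) are my own choices, and the pattern assumes the `Flowname :.../Process:.../...` trailer is always present in that order — more sample lines would be needed to confirm that:

```
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:thread}\] %{LOGLEVEL:level} %{JAVACLASS:logger} - .*Flowname :%{DATA:flowname},Process:%{DATA:process},Source:%{DATA:source},Target:%{DATA:target},Status:%{WORD:status}"
    }
  }
  # Optional: use the parsed timestamp as the event @timestamp.
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}
```

For the sample line this would yield `flowname => "XYZ"`, `process => "PurchaseInfo"`, `source => "S2"`, `target => "T2"`, `status => "SUCCESS"`. If every line follows exactly this layout, a `dissect` filter on the same delimiters would be cheaper than grok, since dissect splits on literal separators instead of running a regex.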


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.