A single grok filter can include multiple expressions that will be tried in order until there's a match. See the documentation of the filter's match options for an example.
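For illustration, a minimal sketch of that technique (the field names and patterns here are hypothetical, not from your config):

```conf
filter {
  grok {
    # The patterns in the array are tried in order; grok stops
    # at the first one that matches the message.
    match => {
      "message" => [
        "Filename: %{GREEDYDATA:filename}",
        "TCName: %{GREEDYDATA:TCName}",
        "ErrorString: %{GREEDYDATA:errorString}"
      ]
    }
  }
}
```

With this, one grok filter can handle several different line layouts instead of chaining several grok filters.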
In your second filter the grok expression doesn't extract a DataString1 field, so it doesn't make sense to copy that field to TCNAME.
Your add_field settings are pointless. If you want the data in the TCNAME field, extract it straight into that field instead of using DataString1 as a temporary field.
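Sketch of what I mean (the pattern itself is a placeholder; substitute your own):

```conf
filter {
  grok {
    # Name the capture TCNAME directly instead of capturing into
    # DataString1 and copying it over with add_field afterwards.
    match => { "message" => "TCName: %{WORD:TCNAME}" }
  }
}
```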
My problem is that I have to extract multiple patterns from the log files, and the log data is not consistent: one log line contains the filename, a second contains the TCName, a third contains an errorString, and a fourth contains TCName:Completed.
Now I have to extract all these strings and group them by TCName.
To extract the multiple patterns I am using multiple grok filters. I am able to extract all the information, but I am not able to group it. The best way I could think of to group the information is to display it in the same order it appears in the log file.
But I am not able to achieve that, because in some scenarios the log lines have the same timestamp, so I cannot maintain the sequence of the log data while parsing it.
How can I maintain the sequence of log data that shares the same timestamp? I am not using Filebeat.
Okay, so the information you need is spread out over multiple log entries? You can probably use an aggregate filter to save information from previous lines and use it later on.
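A rough sketch of the idea, with several assumptions spelled out: the grok pattern and the `%{host}` task key are placeholders, and since your lines share no natural key until TCName appears, this carries the most recent TCName forward onto following events. The aggregate filter only works reliably with a single pipeline worker (`pipeline.workers: 1`), which also preserves event order:

```conf
filter {
  grok {
    # Placeholder pattern; only some lines will match and set TCName.
    match => { "message" => "TCName: %{WORD:TCName}" }
    tag_on_failure => []
  }
  aggregate {
    # Constant-per-source task key (assumes a 'host' field exists),
    # because the related lines have no shared field of their own.
    task_id => "%{host}"
    code => "
      map['tcname'] = event.get('TCName') if event.get('TCName')
      event.set('TCName', map['tcname']) if map['tcname'] && !event.get('TCName')
    "
  }
}
```

Once every event carries a TCName field, you can group or sort on it downstream instead of relying on timestamps, which sidesteps the identical-timestamp ordering problem.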