How to find errors in logs using regex patterns?

The regex patterns are saved on the local drive in an abc.json file, and the input logs come from a Kafka server. Only the error log snippets that match should be sent to Elasticsearch; if no errors match a pattern in abc.json, the JSON should be skipped and not dumped into ES.

Here is abc.json:


[
    {
        "tag": "CPU_Bad_Mode",
        "module": "syssw",
        "regex": "Bad mode in Error handler detected",
        "regexFlag": 2,
        "file": "dmesg",
        "priority": 0
    },
    {
        "tag": "CPU_CLK_Counter_Read_Error",
        "module": "syssw",
        "regex": "Error in reading cpu clk counters before delay",
        "regexFlag": 2,
        "file": "dmesg",
        "priority": 0
    },
    {
        "tag": "CPU_CLK_Set_Failed",
        "module": "syssw",
        "regex": "failed to set pll parent clock",
        "regexFlag": 2,
        "file": "dmesg",
        "priority": 0
    },
    {
        "tag": "PCIe_Bus_Error",
        "module": "syssw",
        "regex": "PCIe Bus Error",
        "regexFlag": 2,
        "file": "dmesg",
        "priority": 0
    }
]

And here is the input JSON that comes from Kafka:


{
    "testSuite": "rts",
    "logSnippet": [
        "[Fri Apr 13 18:08:15 2018] pcieport 0000:00:01.0: PCIe Bus Error: severity=Uncorrected (Non-Fatal), type=Transaction Layer, id=0008(Requester ID)",
        "[Fri Apr 13 18:08:15 2018] pcieport 0000:00:01.0: PCIe Bus Error: severity=Uncorrected (Non-Fatal), type=Transaction Layer, id=0008(Requester ID)",
        "[Fri Apr 13 18:08:15 2018] pcieport 0000:00:01.0: PCIe Bus Error: severity=Uncorrected (Non-Fatal), type=Transaction Layer, id=0008(Requester ID)"
    ],
    "branch": "abc",
    "shim": null,
    "timestamp": "2018-04-13 18:40:31",
    "feedbackDesc": null,
    "hostName": "ramesh-nagargoje"
}
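The matching logic the question describes can be sketched in plain Python (this is an illustration, not a Logstash plugin: the pattern list mirrors abc.json, and `regexFlag: 2` is assumed to mean case-insensitive matching, since `re.IGNORECASE` happens to equal 2 in Python's `re` module):

```python
import re

# Patterns mirroring the abc.json entries (only one shown here).
patterns = [
    {"tag": "PCIe_Bus_Error", "regex": "PCIe Bus Error", "regexFlag": 2},
]

def match_errors(message, patterns):
    """Return the matching (tag, line) pairs for one Kafka message.
    An empty result means the message should be skipped instead of
    indexed into Elasticsearch."""
    hits = []
    for line in message.get("logSnippet", []):
        for p in patterns:
            if re.search(p["regex"], line, p.get("regexFlag", 0)):
                hits.append({"tag": p["tag"], "line": line})
    return hits

message = {
    "testSuite": "rts",
    "logSnippet": [
        "[Fri Apr 13 18:08:15 2018] pcieport 0000:00:01.0: PCIe Bus Error: severity=Uncorrected (Non-Fatal)"
    ],
}
print(len(match_errors(message, patterns)))  # prints 1 -> index this document
```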

There's no plugin that does exactly what you want. I suggest you generate a Logstash configuration with abc.json as input. Actually, a translate filter would probably work. Then you can generate e.g. a CSV file with the abc.json data and Logstash will automatically pick up any changes.
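The suggested conversion could be done with a small one-off script like this (a sketch: it writes a hypothetical two-column CSV, regex to tag, that the translate filter can load through its dictionary_path option; the entries are inlined to keep the example self-contained, whereas in practice you would json.load() the real abc.json):

```python
import csv

# The four pattern entries from abc.json, inlined for this example.
patterns = [
    {"tag": "CPU_Bad_Mode", "regex": "Bad mode in Error handler detected"},
    {"tag": "CPU_CLK_Counter_Read_Error", "regex": "Error in reading cpu clk counters before delay"},
    {"tag": "CPU_CLK_Set_Failed", "regex": "failed to set pll parent clock"},
    {"tag": "PCIe_Bus_Error", "regex": "PCIe Bus Error"},
]

# Write a regex -> tag dictionary file for the translate filter.
with open("abc_patterns.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for p in patterns:
        writer.writerow([p["regex"], p["tag"]])
```

Rerunning the script whenever abc.json changes regenerates the dictionary, and Logstash picks the file change up automatically.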

Can we get two inputs using the config file, one being abc.json and the other the logs that come from Kafka?
If yes, how can we match errors using the regexes and dump them into ES?

You can certainly have two inputs in the same Logstash configuration but you can't use it to accomplish what you want in this case.

Thanks, @magnusbaeck. Can you suggest what I should use to accomplish what I want?
Any help would be appreciated.

I already made a suggestion in my first reply.

Yes, I can take abc.json as input and translate it into .csv format. But I also want to take input from Kafka and use abc.json as a filter to parse errors and dump them into ES.

If I take both abc.json and the Kafka JSON as input, how can I differentiate between the messages from each?

I'm not saying you should read abc.json with a file input. I'm saying that you should use a translate filter to process your Kafka messages. That translate filter should read a file in a compatible format that you have generated from abc.json. It won't be able to read abc.json in its current format.
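Putting the pieces together, a pipeline along the lines of this suggestion might look like the sketch below. This is only an outline: the broker address, topic, paths, and index name are placeholders, and the exact translate-filter options (in particular `regex` and whether the log lines end up in a `message` field) should be checked against the plugin documentation for your Logstash version.

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker address
    topics => ["test-logs"]                 # placeholder topic name
    codec => "json"
  }
}

filter {
  # Look each event up in the dictionary file generated from abc.json;
  # with regex => true the dictionary keys are treated as regular
  # expressions rather than exact strings.
  translate {
    field => "message"                      # adjust to the field holding the log line
    destination => "error_tag"
    dictionary_path => "/etc/logstash/abc_patterns.csv"   # generated from abc.json
    regex => true
  }
  # Skip events that matched none of the abc.json patterns.
  if ![error_tag] {
    drop { }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "error-logs"                   # placeholder index name
  }
}
```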

Thanks, @magnusbaeck.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.