Trying to analyze log data, but it arrives in the wrong format - what are the options? It's big data

I have to analyze this kind of data, but the payload is not valid JSON. What is the problem, and how might I convert it?
Below is a dummy version of the event; the original data I have to analyze looks the same.
{"_index": "dummy_elk","_id": "dummy_id_111","_version": 12,"_score": 12,"_ignored": ["event.original.keyword","message.keyword"],"_source": {"event": {"original": "<14>Mon Jul 04 06:40:39 GMT 2022Info: {"timestamp": 2018-7-4T6:40:39,"callerApplication": Caller_Application_1,"apiProduct": API_Producgt_1,"apiProxy": API_Proxy_1,"messageid": message_id_1234,"environment": dev,"basePath": /base_dummy/v1,"pathSuffix": /socre_dummy_api, "proxyEndpoint": https://ip_address/base_dummy/v1/socre_dummy_api,"targetEndpoint": ,"method": POST,"httpStatusCode": 200,"responseTime": 915216707,"requestBody": {"MobileNumber": "5267890", "Name": "Name_Dummy_1 ", "Pan": "xxxpxxxxxx"},"responseBody": { "JSON1": "JSON_Dummy1", "CreatedOn": "2022-07-04 11:43:36.94", "Error_Msg": "abc", "OfferID_Dummy": "offer_1234", "Pan": "xxxpxxxxx", "Score": "700"}}\u000"}}}
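The core issue can be demonstrated directly: the payload embedded after the syslog header in `event.original` is not valid JSON, because string values such as `Caller_Application_1` and `dev` are unquoted. A minimal sketch (the sample line below is an abbreviated fragment of the dummy payload above):

```python
import json

# Abbreviated fragment of the embedded payload from the dummy event above.
# Note the unquoted string values - this is what makes it invalid JSON.
payload = '{"callerApplication": Caller_Application_1, "environment": dev}'

try:
    json.loads(payload)
    print("valid JSON")
except json.JSONDecodeError as err:
    print(f"not valid JSON: {err}")  # a standard parser rejects it
```

Any strict JSON parser (including the one in an Elasticsearch `json` processor) will reject this, which is why the ideal fix is to have the producing application quote its string values before emitting the log line.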

I think you'll need to add some more information before anybody can help you. Is that an NGINX proxy log or something? We have integrations that help ingest various types of logs.

You will need to implement additional parsing, either with Logstash or with an Elasticsearch ingest pipeline. Based on event.original, it looks like you are receiving the log in syslog format with a JSON payload embedded in the message. I would expect you to need the grok and json filters (Logstash) or processors (ingest pipeline).
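The two-stage parsing described above can be sketched in Python: a grok-style regular expression strips the syslog header (priority, timestamp, severity), and the remainder is handed to a JSON parser. The field names are taken from the dummy event; the sample line here is a hypothetical, well-formed variant with string values quoted, since the approach only succeeds once the payload is actually valid JSON (or you repair it first).

```python
import json
import re

# Grok-equivalent regex: syslog priority, timestamp ending in "GMT <year>",
# a severity word, then the embedded payload.
SYSLOG_RE = re.compile(
    r"^<(?P<pri>\d+)>(?P<timestamp>.+?GMT \d{4})(?P<severity>\w+): (?P<payload>\{.*\})$"
)

def parse_event(original: str) -> dict:
    """Split the syslog header from the embedded payload and parse it as JSON."""
    match = SYSLOG_RE.match(original)
    if match is None:
        raise ValueError("line does not match the expected syslog layout")
    return json.loads(match.group("payload"))

# Hypothetical well-formed variant of the dummy line (string values quoted):
line = ('<14>Mon Jul 04 06:40:39 GMT 2022Info: '
        '{"callerApplication": "Caller_Application_1", "environment": "dev", '
        '"httpStatusCode": 200}')
event = parse_event(line)
print(event["callerApplication"])  # -> Caller_Application_1
```

In Logstash the same split would be a grok filter capturing the payload into a field, followed by a json filter on that field; in an ingest pipeline, a grok processor followed by a json processor.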

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.