JSON file parsing not working in Logstash using Beats

Hi,
I have a log which contains JSON as input. When I was using an older version of Logstash with log-courier, I was able to process the JSON input. But now I'm using the latest version of Logstash with Beats, and in this scenario the JSON parsing is not working for me. Did I miss anything needed to parse the file, or do I have to define something in the Beats YAML? Please, can anyone help me resolve it?

input {
  beats {
    port => 10546
    type => "json"
  }
}
filter {
  ## Pattern to split the JSON portion out of the input.
  grok {
    match => [ "message", "^]?((?[[[a-zA-Z0-9,:] ]-]*)) ?%{GREEDYDATA:app_json}" ]
  }
  ## Parsing the extracted string with the json filter.
  json {
    source => "app_json"
  }
}

This is how I'm processing my input. Please correct me if I did anything wrong.

Use the beats input with Filebeat, not the courier input.
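For reference, a minimal Filebeat configuration pointing at the beats input on port 10546 might look like this (a sketch assuming the Filebeat 1.x syntax current at the time; the log path is taken from the output later in this thread, and the host is an assumption):

```
filebeat:
  prospectors:
    -
      paths:
        - /beep/brri-logs/*.log
      input_type: log
output:
  logstash:
    hosts: ["localhost:10546"]
```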

I'm using Beats only; I pasted the wrong thing here.

Okay, so what do you actually get? The result of a stdout { codec => rubydebug } output would be useful.
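A temporary debugging output along these lines can be added to the Logstash configuration (a sketch; keep or remove your existing outputs as appropriate):

```
output {
  stdout {
    codec => rubydebug
  }
}
```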

This is what I'm getting on my stdout screen:
"message" => "DiagnosticsLog brri-storms-t01 BRRIHUB_CEP STORM 111111111 2015-06-23T10:06:59.781Z WebServiceUtil: RESPONSE :{ "GetSpecialServiceCampaignResponse":{ "GetSpecialServiceCampaignBOD":{ "ApplicationArea":{ "CreationDateTime":"2015-06-23T10:06:58.418Z", "BODID":"", "VersionID":"2.0", "Sender":{ "UserID":"brri_hub", "CreatorNameCode":"BRRI_HUB", "SenderNameCode":"EIG" }, "Destination":{ "DestinationNameCode":"BRRI Hub" } }, "GetSpecialServiceCampaignDataArea":{ "BusinessContext":{ "TransactionType":"POST", "Action":"SpecialServiceCampaignRealTimeRequest", "MessageID":"88194000-F954-43B7-BE9C-23A270A0C409", "MessagePurpose":"Reporting", "FromRole":"Vehicle", "ToRole":"Dealer", "ActionType":"Insert" }, "SpecialServiceCampaignPayload":{ "VIN":"", "CampaignData":[ ] } } } } }",
"@version" => "1",
"@timestamp" => "2016-01-27T14:51:38.906Z",
"beat" => {
"hostname" => "vtmnalrhl216",
"name" => "vtmnalrhl216"
},
"count" => 1,
"fields" => nil,
"input_type" => "log",
"offset" => 0,
"source" => "/beep/brri-logs/DiagnosticsLog2015-06-23-10.log",
"type" => "elk_diagno_log",
"host" => "vtmnalrhl216",
"app_logType" => "DiagnosticsLog",
"app_instancId" => " brri-storms-t01",
"app_applicationId" => "BRRIHUB_CEP",
"app_methodName" => "STORM",
"app_correlationId" => "111111111",
"app_logRecorded_Timestamp" => "2015-06-23T10:06:59.781Z",
"app_logMessage" => "WebServiceUtil: RESPONSE :{ "GetSpecialServiceCampaignResponse":{ "GetSpecialServiceCampaignBOD":{ "ApplicationArea":{ "CreationDateTime":"2015-06-23T10:06:58.418Z", "BODID":"", "VersionID":"2.0", "Sender":{ "UserID":"brri_hub", "CreatorNameCode":"BRRI_HUB", "SenderNameCode":"EIG" }, "Destination":{ "DestinationNameCode":"BRRI Hub" } }, "GetSpecialServiceCampaignDataArea":{ "BusinessContext":{ "TransactionType":"POST", "Action":"SpecialServiceCampaignRealTimeRequest", "MessageID":"88194000-F954-43B7-BE9C-23A270A0C409", "MessagePurpose":"Reporting", "FromRole":"Vehicle", "ToRole":"Dealer", "ActionType":"Insert" }, "SpecialServiceCampaignPayload":{ "VIN":"", "CampaignData":[ ] } } } } }",
"tags" => [
[0] "_grokparsefailure"
],
"app_application" => "SpecialServiceCampaignRealTimeRequest",
"app_vhrId" => "2015-06-23T10",
"app_json" => "{ "GetSpecialServiceCampaignResponse":{ "GetSpecialServiceCampaignBOD":{ "ApplicationArea":{ "CreationDateTime":"2015-06-23T10:06:58.418Z", "BODID":"", "VersionID":"2.0", "Sender":{ "UserID":"brri_hub", "CreatorNameCode":"BRRI_HUB", "SenderNameCode":"EIG" }, "Destination":{ "DestinationNameCode":"BRRI Hub" } }, "GetSpecialServiceCampaignDataArea":{ "BusinessContext":{ "TransactionType":"POST", "Action":"SpecialServiceCampaignRealTimeRequest", "MessageID":"88194000-F954-43B7-BE9C-23A270A0C409", "MessagePurpose":"Reporting", "FromRole":"Vehicle", "ToRole":"Dealer", "ActionType":"Insert" }, "SpecialServiceCampaignPayload":{ "VIN":"", "CampaignData":[ ] } } } } }",
"fingerprint" => "345b358a8905d61f92bdc69e9aed5272bcb42002"
}

app_json contains my JSON value, and I'm using that as the source in the json filter.

But the json filter you posted above parses the notify field.

That was a copy-paste error. Kindly check now; I edited my input. This is how I created the Logstash file:

input {
  beats {
    port => 10546
    type => "json"
  }
}
filter {
  grok {
    match => { "message" => "^?((?<app_jsonexclude>[[[a-zA-Z0-9,_.:] ]-]*)) ?%{GREEDYDATA:app_json}" }
  }
  json {
    source => "app_json"
  }
}
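The key point in configurations like this is that the name captured by grok must match the name given to the json filter's `source` exactly. A minimal sketch (the `app_prefix` name and the optional `target` are assumptions for illustration, not part of the original configuration):

```
filter {
  grok {
    # Capture everything before the first '{' into app_prefix,
    # and the JSON payload itself into app_json.
    match => { "message" => "^(?<app_prefix>[^{]*)%{GREEDYDATA:app_json}" }
  }
  json {
    # Must match the capture name above exactly.
    source => "app_json"
    # Optionally nest the parsed keys under one field.
    target => "app_parsed"
  }
}
```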

Hi, I know this is not the correct place to ask this question. I am new to this community and also to ELK. After Splunk nearly bankrupted our company, can someone please point me to the right place to get advice about what could replace Splunk?
I have tried setting up ELK, got the data (JSON logs) into Elasticsearch, and was able to play with the data in Kibana. However, I am still not sure how to achieve the very critical thing we need: alerting! We are currently using Splunk to monitor e-commerce website traffic. Any help will be very much appreciated. I am tired of piecing together so many components in ELK alone, and adding another tool just for alerting only makes it more difficult, from what I have read so far. Is Graylog better than ELK for my use case? I know nothing may be the same as Splunk; I just want to replace part of our use case so we can bargain with Splunk for a better deal, at the earliest.

Thanks

Please start your own thread instead :slight_smile:

Can anyone help me out with the JSON issue?

The field you extract into is named 'appjson' but you are specifying that the JSON filter should use 'app_json'. Make sure these are aligned and try again.

No, the file actually contains app_json. It's the Disqus formatter that interpreted the matching underscores as emphasis and stripped them. I lost my patience after the second typo in the posted configurations.

Sorry for that typo, @Magnus. I'm stuck on the JSON part; due to this issue I'm unable to move on to the next step.

OK. Now that I look closer I can see that some of the text is in italics. I assume this could be avoided if the provided configuration samples were formatted as 'Preformatted text' when posted?

The issue could be the backslashes in the JSON string. Try adding a mutate filter as follows, before the json filter, to clean it up.

mutate {
  gsub => [ "app_json", "[\\]", "" ]
}
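Putting it together, the mutate goes between the grok and the json filter so the string is cleaned before parsing (a sketch based on the configuration posted earlier in this thread):

```
filter {
  grok {
    match => { "message" => "^?((?<app_jsonexclude>[[[a-zA-Z0-9,_.:] ]-]*)) ?%{GREEDYDATA:app_json}" }
  }
  # Strip stray backslashes before handing the string to the json filter.
  mutate {
    gsub => [ "app_json", "[\\]", "" ]
  }
  json {
    source => "app_json"
  }
}
```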

OK @Christian, I'll implement this idea.