Hi, we are using the ELK stack with Apigee to ship transaction logs. The Logstash configuration follows the community recommendation:
input {
  tcp {
    port => 8080
    type => syslog
  }
}
filter {
  mutate {
    gsub => ["message", "[\u0000]", ""]
  }
  grok {
    match => { "message" => "<%{NUMBER:priority_index}>%{DATESTAMP_OTHER:apigeeTimestamp}%{LOGLEVEL}: %{GREEDYDATA:apigeeMessage}" }
    remove_field => ["message"]
  }
  json {
    source => "apigeeMessage"
    remove_field => ["apigeeMessage"]
  }
}
output {
  # stdout added to inspect how the message is output/parsed by the Logstash
  # filters; you can remove the stdout { codec => rubydebug } block if necessary.
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["IP1", "IP2", "IP3"]
    index => "apigee-%{+YYYY.MM}"
    user => "elastic"
    password => "*"
  }
}
The JSON message we actually want contains a field called error.message:
  "error": {
    "isError": "true",
    "message": "Execution of JS-ValidateRequiredHeadersAndQueryParams failed with error: "Javascript runtime" error: ",
    "errorCode": "500",
    "errorPhrase": "Internal Server Error",
    "transportMessage": "com.apigee.messaging.adaptors.http.message.HttpResponseMessage@73f118a9",
    "errorState": "PROXY_REQ_FLOW",
    "isPolicyError": "1",
    "isTargetError": "0",
    "policyErrorPolicyName": "JS-ValidateRequiredHeadersAndQueryParams",
    "policyErrorFlowName": "contracts",
    "error": "com.apigee.flow.message.MessageImpl@6d128cc9",
    "content": ""
  }
}
The issue: when the json filter receives the message it fails with _jsonparsefailure, because error.message contains extra unescaped quotes (" ") and the filter cannot parse the message. Is there any way to clear the field error.message of special characters without damaging the JSON structure? I tried the below, but it did not work:
gsub => ["[apigeeMessage][error][message]", '"', ""]
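One possible reason the gsub has no effect is ordering: [apigeeMessage][error][message] only exists after the json filter has parsed the payload, and parsing is exactly what fails. A workaround is to strip the stray quotes while apigeeMessage is still a plain string, between the grok and json filters. The sketch below is untested against real payloads and assumes "errorCode" always directly follows the "message" value, as in the sample above:

```
filter {
  # Sketch only: neutralize unescaped quotes inside the message value
  # BEFORE the json filter runs, while apigeeMessage is still raw text.
  # Assumption: the value is always terminated by ","errorCode" as in
  # the sample payload; adjust the delimiter if your payloads differ.
  ruby {
    code => '
      raw = event.get("apigeeMessage")
      unless raw.nil?
        fixed = raw.gsub(/("message":")(.*?)(",\s*"errorCode")/m) do
          "#{$1}#{$2.gsub(/"/, "")}#{$3}"
        end
        event.set("apigeeMessage", fixed)
      end
    '
  }
}
```

Note this deletes the inner quotes rather than escaping them, which matches what the original gsub attempted; the surrounding JSON structure is left untouched.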
Second issue:
error.content will contain another JSON message; is there a way to handle it?
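For the nested payload, a second json filter can be applied after the outer document has parsed, pointing at the nested field. A minimal sketch (the target field name content_parsed is arbitrary):

```
filter {
  # Sketch: parse the JSON string carried in error.content into its own
  # subtree so it does not collide with the already-parsed outer fields.
  # skip_on_invalid_json avoids a failure tag when content is empty ("")
  # or not JSON.
  json {
    source => "[error][content]"
    target => "[error][content_parsed]"
    skip_on_invalid_json => true
  }
}
```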