Hi there,
I'm using ELK 7.3.1
and I'm getting an "Error parsing json" failure on the "message" field.
This is what the message looks like at the source (example):
message {"time_date": "2019-02-14T14:00:39+00:00","client": "10.xxx.xxx.xxx", "host": "xxx.com", "scheme": "https", "request_method": "GET", "request_uri": "/static/img/logo_new.png", "request_id": "xxxxxxxxxxxxxx", "status": 304, "upstream_addr": "xxx.xx.xx.xx:80", "upstream_status": 304, "request_time": 0.002, "upstream_response_time": 0.000, "upstream_connect_time": 0.000, "upstream_header_time": 0.000}
The error tags I get in Logstash are:
beats_input_codec_plain_applied, _jsonparsefailure
The Logstash logs show:
2019-09-09T17:00:55.642564748Z at [Source: (byte[])"{"time_date": "2019-09-09T17:00:54+00:00","client": "10.xxx.x.xxx", "host": "pro.pipl.com", "scheme": "https", "request_method": "POST", "request_uri": "/search/", "request_id": "5d54cc2f24c4420fb2dbc49500bcefa9", "status": 499, "upstream_addr": "xxx.xx.xxx.xxx:80", "upstream_status": -, "request_time": 1.679, "upstream_response_time": 1.680, "upstream_connect_time": 0.004, "upstream_header_time": -}"; line: 1, column: 289]>}
2019-09-09T17:02:16.804707218Z [2019-09-09T17:02:16,804][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"{\"time_date\": \"2019-09-09T17:02:15+00:00\",\"client\": \"xx.xxx.x.xxx\", \"host\": \"pipl.com\", \"scheme\": \"https\", \"request_method\": \"GET\", \"request_uri\": \"/locationautocomplete/\", \"request_id\": \"624beb1705dae1c6d6d8b99ac66b7ac3\", \"status\": 499, \"upstream_addr\": \"xxx.xx.xxx.xxx:80\", \"upstream_status\": -, \"request_time\": 0.285, \"upstream_response_time\": 0.284, \"upstream_connect_time\": 0.000, \"upstream_header_time\": -}", :exception=>#<LogStash::Json::ParserError: Unexpected character (',' (code 44)) in numeric value: expected digit (0-9) to follow minus sign, for valid numeric value
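For what it's worth, the parser error itself seems to point at the cause: on requests where nginx got no upstream response (e.g. the 499s above), variables like $upstream_status are logged as a bare -, which is not valid JSON (hence "expected digit (0-9) to follow minus sign"). One workaround I was considering is rewriting those bare dashes to null before the json filter runs — this is just a sketch, and the gsub pattern is my own guess, untested:

```
filter {
  if [kubernetes][container][name] == "nginx-ingress" {
    # Turn unquoted dash values (e.g.  "upstream_status": -,) into null
    # so the json filter can parse the line. Pattern is a best guess.
    mutate {
      gsub => [ "message", ": -([,}])", ': null\1' ]
    }
    json {
      source => "message"
      remove_field => "message"
    }
  }
}
```

The cleaner fix is probably on the nginx side: quoting those variables in log_format (e.g. "upstream_status": "$upstream_status"), so a missing value is logged as the string "-" instead of breaking the JSON.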
My Logstash conf:
apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-kube-config
data:
  logstash.conf: |-
    input {
      beats {
        port => 5044
      }
    }
    filter {
      if [kubernetes][container][name] == "nginx-ingress" {
        json {
          source => "message"
          remove_field => "message"
        }
      }
      else if [kubernetes][container][name] == "nginx" {
        grok {
          match => {
            "message" => "%{IP:remote_ip} - \[%{HTTPDATE:[response][time]}\] \"%{DATA:url}\" %{NUMBER:[response][code]} %{NUMBER:[response][bytes]} %{QS:user_agent}"
          }
          remove_field => "message"
        }
        geoip {
          source => "remote_ip"
          target => "[geoip]"
        }
      }
      else {
        drop {}
      }
      date {
        match => ["time", "ISO8601"]
        remove_field => ["time"]
      }
      mutate {
        remove_field => ["source", "host", "[beat][name]", "[beat][version]"]
      }
    }
    output {
      elasticsearch {
        hosts => ["http://...:9200"]
        index => "apps-prod-dal10-%{[kubernetes][namespace]}-deployment-%{[kubernetes][container][name]}-%{[kubernetes][replicaset][name]}%{+YYYY.MM.dd}"
      }
    }
What am I doing wrong here? (P.S. It worked before the upgrade from 6.4...)
Thanks!
Aleksei