Logstash JSON Parsing Error

Hi,

I have set up the following pipeline to consolidate my logs in an Elasticsearch cluster:
filebeat to gather nginx logs ==> Logstash to parse the logs and mutate them if needed ==> Elasticsearch cluster.
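
For context, the Logstash pipeline around the filter is the standard one. A minimal sketch (the port and the Elasticsearch host are placeholders, not my real values):

input {
        beats {
            port => 5044                          # default Beats port
        }
}
output {
        elasticsearch {
            hosts => ["http://localhost:9200"]    # placeholder host
        }
}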

I'm facing this error: #<LogStash::Json::ParserError: Unexpected character ('.' (code 46)): Expected space separating root-level values

What is strange is that some entries produce no error at all and are shipped to Elasticsearch, while others fail with this LogStash::Json::ParserError.

Below is a line that is failing:

192.168.1.33 - [2023-07-28T09:10:05+02:00] 1690528205.570 "GET /mycomponentapi/api/v1/mech/login?after=2023-07-28T06:38:05.457Z HTTP/1.1" 200 89 1576 0.075 "-" "Apache-HttpAsyncClient/4.1.4 (Java/1.17.0_173)" 192.168.1.148:8089 2984142b-25a2-c9ed-babb-d5782be330b4 192.168.1.33 "-" TLSv1.2/ECDHE-RSA-AES256-SHA384 - - . 77b9071d0a430c78fe8d0356caf16b48a4022014e73b7a92e0518fee6c054c98 NONE -

Below is a line that is not failing:

192.168.1.33 - [2023-07-28T09:12:11+02:00] 1690528331.118 "GET /otherapi/tech/user?executeOnRetrieve=true&_queryFilter=loginId+eq+%220644b702-3d54-4c56-a54e-c1e3e71857a7%22&_fields=has2FA,authorizationsV1,*,hasConfig,otpConfigExpirationTime HTTP/1.1" 200 1114 2943 0.048 "-" "PostmanRuntime/7.32.3" 192.167.185.188:8443 54a78440-ad86-cfda-1e4f-5eec3457462f 192.168.1.33 "-" TLSv1.2/ECDHE-RSA-AES128-GCM-SHA256 - - . 3d5eb4bc1f3974845ce45f67b706990b63f136f8f0b2c989658fd1da80efa899 NONE -

The Logstash filter is configured this way:

filter {
        json {
            source => "message"
        }
        grok {
            match => { "message" =>
                         [
                           "%{IP:clientDirectIp} (?:-|%{USER:auth}) \[%{TIMESTAMP_ISO8601:timestampSec}%{ISO8601_TIMEZONE:timezone}\] %{INT}\.%{INT:ms} \"(?:%{WORD:httpMethod} %{URIPATH:urlPath}(?:\?%{NOTSPACE:queryString}?)?(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:statusCode:int} %{NUMBER:bytesIn:int} %{NUMBER:bytesOut:int} %{NUMBER:time:float} \"(?<referer>[^\"]*)\" \"(?<agent>[^\"]*)\" (?:%{IP:upstreamIp}:%{NUMBER:upstreamPort:int}(?:, %{IP:upstreamIp}:%{NUMBER:upstreamPort:int})*|-|\S+) (?:-|(%{NOTSPACE:transactionId}) %{IP:realIp} \"(?:(?<forwardedFor>%{IP}(?:, %{IP})*)|-)\")?",
                           # in case of unmatch, try to match the most usual and useful fields
                           "%{IP:clientDirectIp} (?:-|%{USER:auth}) \[%{TIMESTAMP_ISO8601:timestampSec}%{ISO8601_TIMEZONE:timezone}\] %{INT}\.%{INT:ms} \"(?:%{WORD:httpMethod} %{URIPATH:urlPath}(?:\?%{NOTSPACE:queryString}?)?(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:statusCode:int} %{NUMBER:bytesIn:int} %{NUMBER:bytesOut:int} %{NUMBER:time:float} %{GREEDYDATA:unparsed_details}"
                         ]
                     }
            add_field => {
                "urlPathPattern" => "%{urlPath}"
                "timestamp" => "%{timestampSec}.%{ms}%{timezone}"
                "clientDirectHost" => "%{clientDirectIp}"
                "upstreamHost" => "%{upstreamIp}"
            }
            remove_field => [ "message", "timestampSec" ]
        }
}

I'm stuck at this point.

Regards,

I have fixed the issue by removing the json filter. There is no need to parse a field that is not a JSON object: the nginx access line is plain text, so the json filter fails as soon as the parser hits a character that cannot continue a root-level JSON value (after reading the leading "192.168" of the client IP as a number, the next "." triggers the "Unexpected character" error).
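
In case it helps someone else: an alternative to deleting the json filter entirely is to guard it with a conditional so it only runs on events that actually look like JSON. A minimal sketch (the regex is an assumption about what JSON events would look like; the grok block from my original config stays unchanged after it):

filter {
        # only attempt JSON parsing when the message starts with a JSON object
        if [message] =~ /^\s*\{/ {
            json {
                source => "message"
            }
        }
        # ... grok block from above, unchanged ...
}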
