Logstash Error

Hi Team,

I'm getting exception=>#<RuntimeError: Invalid FieldReference whenever I have a value like the one below in my event:
'some.value[1]'

My conf file looks like this:

input {
        sqs {
                access_key_id => "abc"
                secret_access_key => "def"
                queue => "queue_name"
                region => "us-east-1"
                codec => "line"
        }
}

How can I avoid that error?

Can you share the full error from the Logstash error log?

It seems you are using an incorrect field or parameter.

Please see the error:

[2021-11-04T18:19:55,109][ERROR][logstash.codecs.json]
[main][9e4478322092ac46867421da34e83caeaf3c3c469d1a01e83f6d915678e5a9a2] 
JSON parse error, original data now in message field 
{:message=>"Invalid FieldReference: `proc.aname[2]`", 
:exception=>LogStash::Json::ParserError, 
:data=>"{\"output\":\"18:19:55.042663479: some message (user=root user_loginuid=-1 command=httpd --loglevel info run ^syscall.ReadSensitiveFileUntrusted$ --sleep 6s parent=httpd file=/etc/shadow parent=httpd gparent=containerd-abc container_id=123 image=abc/event-generator) k8s.ns=abc-logstash k8s.pod=abc-def-ghi container=123 k8s.ns=abc-logstash k8s.pod=abc-def-ghi container=123\",\"priority\":\"Warning\",\"rule\":\"Read sensitive file trusted after startup\",\"time\":\"2021-11-04T18:19:55.042663479Z\",\"output_fields\":{\"clustername\":\"eks-logstash-test\",\"container.id\":\"123\",\"container.image.repository\":\"abc/event-generator\",\"cloud\":\"aws\",\"evt.time\":1636049995042663479,\"fd.name\":\"/etc/shadow\",\"k8s.ns.name\":\"abc-logstash\",\"k8s.pod.name\":\"abc-def-ghi\",\"proc.aname[2]\":\"containerd-shim\",\"proc.cmdline\":\"httpd --loglevel info run ^syscall.ReadSensitiveFileUntrusted$ --sleep 6s\",\"proc.pname\":\"httpd\",\"user.loginuid\":-1,\"user.name\":\"root\",\"version\":\"abc-def\"}}"}

Can you share your full pipeline? You only shared your input; the error seems to be coming from a json filter in your pipeline.

This is how I defined my conf file:

input {
        sqs {
                access_key_id => "abc"
                secret_access_key => "def"
                queue => "queue_name"
                region => "us-east-1"
                codec => "line"
        }
}
filter {
        json {
                source => "message"
        }
        mutate {
                remove_field => ["@timestamp", "host", "@version"]
        }
}

It is a known issue.

You could try something like

mutate { gsub => [ "message", "\[\d+\]", "" ] }

to adjust the field name before trying to parse it.

Do I need to keep the json filter, or can I remove it?
Will the pipeline below work?

filter {
        json {
                source => "message"
        }
        mutate {
                remove_field => ["@timestamp", "host", "@version"]
        }
        mutate { gsub => [ "message", "\[\d+\]", "" ] }
}

You still need the json filter, but the mutate must come first:

 mutate { gsub => [ "message", "\[\d+\]", "" ] }
 json {
     source => "message"
     remove_field => ["@timestamp", "host", "@version"]
 }

I tried that but I'm still getting the same error.

You cannot use a json codec if the JSON keys contain things like [2]. You will need to use a json filter instead.

ERROR][logstash.codecs.json]
Sorry, I didn't get that. Do you mean the codec which I'm using in the input?

The error message that you posted came from a codec. That means the error has already occurred in the input, before the message is sent to the pipeline where the mutate could fix it before the json filter parses it.

Making sense now.
What does \d mean here in gsub?

mutate { gsub => [ "message", "\[\d+\]", "" ] }

The regexp is

\[ -- a literal square bracket (not the start of a character group)
\d+ -- \d is a digit, + means one or more
\] -- a literal square bracket
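Outside of Logstash, that substitution can be sketched in plain Ruby (the sample key is borrowed from the error log above; this is just an illustration of the regexp, not pipeline code):

```ruby
# A sample message containing an array-style key, as in the error log.
message = '{"proc.aname[2]":"containerd-shim"}'

# \[ and \] match literal brackets, \d+ matches one or more digits,
# so the whole pattern removes suffixes like "[2]".
cleaned = message.gsub(/\[\d+\]/, "")

puts cleaned  # {"proc.aname":"containerd-shim"}
```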


Thanks @Badger for responding. That was very helpful.

mutate { gsub => [ "message", "\[\d+\]", "" ] }

How do I keep the data which is inside the brackets?

If you want to keep the number and lose the square brackets, you could try a capture group:

mutate { gsub => [ "message", "\[(\d+)\]", "\1" ] }

or a character group:

mutate { gsub => [ "message", "[\[\]]", "" ] }
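For comparison, here is what the two patterns do to a sample key in plain Ruby (again just an illustration, not pipeline code):

```ruby
key = "proc.aname[2]"

# Capture group: the brackets are dropped, the captured digits are kept.
with_capture = key.gsub(/\[(\d+)\]/, '\1')

# Character group: every "[" or "]" character is removed.
with_char_group = key.gsub(/[\[\]]/, "")

puts with_capture     # proc.aname2
puts with_char_group  # proc.aname2
```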

Can I keep both the square brackets and the number/characters inside them?

You will need to substitute the [ and ] before parsing the JSON, then substitute them back in afterwards. That said, I very much doubt that a json filter is the only place where field names that look like array references cause problems.

mutate {
    gsub => [
        "message", "\[", "LeftSquareBracket",
        "message", "\]", "RightSquareBracket"
    ]
}
json { ... }
ruby {
    code => '
        # Untested and has no error checking or recovery...
        event.to_hash.each { |k, v|
            if k.match(/LeftSquareBracket|RightSquareBracket/)
                newK = k.gsub(/LeftSquareBracket/, "[").gsub(/RightSquareBracket/, "]")
                event.set(newK, v)
            end
        }
    '
}
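That round trip can be sketched in plain Ruby outside of Logstash (the sample data comes from the error log above; JSON.parse stands in for the json filter, and like the ruby filter above this is untested against a real pipeline):

```ruby
require 'json'

raw = '{"proc.aname[2]":"containerd-shim"}'

# 1. Mask the brackets so the keys no longer look like field references.
masked = raw.gsub('[', 'LeftSquareBracket').gsub(']', 'RightSquareBracket')

# 2. Parse the JSON -- this is the step the json filter performs.
parsed = JSON.parse(masked)

# 3. Substitute the brackets back into each key afterwards.
restored = parsed.map { |k, v|
  [k.gsub('LeftSquareBracket', '[').gsub('RightSquareBracket', ']'), v]
}.to_h

puts restored.keys
```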
