Logstash Elasticsearch output: Append array while upserting

I am upserting documents: new fields should be added and existing fields should be replaced by the new document's field values. With one exception: the value of the text field 'shodan.protocols' should always be appended to instead of replaced. This field should end up containing multiple unique string values.

My logstash output looks like this:

elasticsearch {
  index         => "test"
  document_id   => "%{ip}"
  doc_as_upsert => true
  action        => "update"
  script        => "ctx._source.shodan.protocols += shodan.protocols"
}

As you can see, I tried it with 'script', but this results in the following error: "reason"=>"Variable [shodan] is not defined."}, "script_stack"=>["... urce.shodan.protocols += shodan.protocols"

I don't even know if using 'script' is the right way to do this; that's why I'm asking you guys. So the problem is that the value of 'shodan.protocols' is overwritten instead of appended to (as unique strings).
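For illustration: without a script, doc_as_upsert makes the output send a partial-document update per event, roughly like the bulk request below (the IP and protocol values are hypothetical). Elasticsearch merges objects in a partial doc but replaces arrays and scalar fields wholesale, which is why 'shodan.protocols' ends up being overwritten.

{ "update" : { "_index" : "test", "_id" : "10.0.0.1" } }
{ "doc" : { "ip" : "10.0.0.1", "shodan" : { "protocols" : ["https"] } }, "doc_as_upsert" : true }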


@dadoonet maybe you know the answer?

Your script approach might work, but the fields of the current event aren't available to the script execution engine (run on the Elasticsearch side). Replacing shodan.protocols with %{[shodan][protocols]} should at least help a bit.
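An alternative to interpolating the value into the script text is to let the plugin hand the event to the script as parameters. A minimal, untested sketch, assuming the elasticsearch output's script_var_name behaviour (it defaults to "event", so the event is exposed to the Painless script as params.event) and assuming [shodan][protocols] is a single string on incoming events but an array in the stored documents:

elasticsearch {
  index         => "test"
  document_id   => "%{ip}"
  action        => "update"
  # depending on the plugin/ES version you may need scripted_upsert/upsert
  # instead of doc_as_upsert to handle documents that don't exist yet
  doc_as_upsert => true
  script_lang   => "painless"
  script_type   => "inline"
  # script_var_name defaults to "event", so the event is available as params.event
  script => '
    if (ctx._source.shodan == null) {
      ctx._source.shodan = ["protocols": [params.event.shodan.protocols]];
    } else if (!ctx._source.shodan.protocols.contains(params.event.shodan.protocols)) {
      ctx._source.shodan.protocols.add(params.event.shodan.protocols);
    }
  '
}

Because the script text never changes, it also only needs to be compiled once, which becomes relevant for the compilation-limit error further down.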

@magnusbaeck Thanks for your reply, I tried your suggestion:

script => "ctx._source.shodan.protocols += %{[shodan][protocols]}"

Results in the following error:
"reason"=>"compile error", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"invalid sequence of tokens near ['%'].", "caused_by"=>{"type"=>"no_viable_alt_exception"

Same error with this:

 script => "%{[ctx][_source][shodan][protocols]} += %{[shodan][protocols]}"

It indicates some syntax error near the % symbol; I'm not sure what's wrong with it.

Also, when I start Logstash with a file input, I get the "Too many dynamic script compilations within one minute, max: [15/min]" error as soon as the file contains more than 15 new events. I tried this in elasticsearch.yml:

script.max_compilations_per_minute: 100000

I restarted Elasticsearch, but it seems to not read or to ignore that line, because I get the same error. I also tried 10000, 1000 and 100.

Any suggestions?
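One likely reason for hitting the limit, independent of the elasticsearch.yml setting: with %{...} interpolation the script text differs per event, so Elasticsearch has to compile a new script for every event, roughly like this (hypothetical protocol values):

// what Elasticsearch receives with interpolation: a different script per event
ctx._source.shodan.protocols += "http"    // event 1
ctx._source.shodan.protocols += "https"   // event 2
ctx._source.shodan.protocols += "ftp"     // event 3

// a constant script that reads the value from params (see the sketch above)
// is compiled once and then served from the script cache
ctx._source.shodan.protocols.add(params.event.shodan.protocols)

Passing the value via params keeps the script constant, so the compilation limit stops being an issue.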

Results in the following error:
"reason"=>"compile error", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"invalid sequence of tokens near ['%'].", "caused_by"=>{"type"=>"no_viable_alt_exception"

Are you 100% sure all events have a [shodan][protocols] field? You'll have to double-quote the variable expansion, i.e. do this:

script => 'ctx._source.shodan.protocols += "%{[shodan][protocols]}"'
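With the whole option in single quotes, the double quotes around %{[shodan][protocols]} survive into the script, so, assuming an event whose protocols value is "https", Elasticsearch would receive something like:

ctx._source.shodan.protocols += "https"

That at least parses as a Painless string literal instead of an undefined variable.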

Which version of Logstash are you using?

script => "%{[ctx][_source][shodan][protocols]} += %{[shodan][protocols]}"

That's just wrong; %{...} references are expanded by Logstash before the script is sent to Elasticsearch, so the left-hand side has to be a literal ctx._source expression in the script itself.

