Logstash having a problem with the elasticsearch filter


In my Logstash pipeline I am querying the same index to get some data from the previous day.

The code looks like this:

    elasticsearch {
        hosts => [""]
        index => "index"
        query => "country_region:%{[country_region]}"
        enable_sort => true
        sort => "@timestamp:desc"
        result_size => 1
        fields => [{"sold" => "sold_yesterday"},{"viewed" => "viewed_yesterday"}]
    }

However, in the logs I can see the following:

    [2020-03-19T05:12:44,531][WARN ][logstash.filters.elasticsearch][main] Failed to query elasticsearch for previous event {:index=>"index", :error=>"undefined method `start_with?' for {\"sold\"=>\"sold_yesterday\"}:Hash"}

What am I doing wrong?

I realize that `fields` is documented as taking an array, and the code itself requires that, but the normal usage is to pass a hash and rely on the automatic conversion of the hash to an array. This is actually pretty common in Logstash. Does it work better if you try

    fields => {
        "sold" => "sold_yesterday"
        "viewed" => "viewed_yesterday"
    }

Any idea how to set the field type? They are numbers, but I can see they are set as strings, and I need to do some math on them, which I cannot do unless they are integers.

If you are referring to the fields that the elasticsearch filter adds then try mutate+convert.
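A minimal sketch of that, assuming the copied fields are named `sold_yesterday` and `viewed_yesterday` as in the example above, would be a mutate filter placed after the elasticsearch filter:

    mutate {
        convert => {
            "sold_yesterday" => "integer"
            "viewed_yesterday" => "integer"
        }
    }

After conversion the fields can be used in numeric operations, e.g. in a ruby filter or in comparisons.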

In the meantime I remembered mutate + convert. I was hoping there was a way to map the field type directly from the elasticsearch filter. Thanks anyway.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.