Logstash Grok JSON error - mapper of different type

I have this log file:

`2020-08-05 09:11:19 INFO-flask.model-{"version": "1.2.1", "time": 0.651745080947876, "output": {...}}`

This is my logstash filter setting

    filter {
        grok {
            match => {
                "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log.level}-%{DATA:model}-%{GREEDYDATA:log.message}"
            }
        }
        date {
            timezone => "UTC"
            match => ["timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss"]
            target => "@timestamp"
            remove_field => ["timestamp"]
        }
        json {
            source => "log.message"
            target => "log.message"
        }
        mutate {
            add_field => {
                "execution.time" => "%{[log.message][time]}"
            }
        }
    }

I want to extract the "time" value from the message. But I receive this error:

`[2020-08-05T09:11:32,688][WARN ][logstash.outputs.elasticsearch][main][81ad4d5f6359b99ec4e52c93e518567c1fe91de303faf6fa1a4d905a73d3c334] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"index-2020.08.05", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0xbe6a80>], :response=>{"index"=>{"_index"=>"index-2020.08.05", "_type"=>"_doc", "_id"=>"ywPjvXMByEqBCvLy1871", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [log.message.input.values] of different type, current_type [long], merged_type [text]"}}}}`

That's not a grok problem. It's an Elasticsearch problem: the field `log.message.input.values` is mapped as a `long` in some documents and arrives as `text` in others. You'll have to create a new index with the correct mapping, or create an index template that assigns the correct data type to that field, instead of relying on Elasticsearch's dynamic mapping. If the new index is configured correctly and you reindex your data, this warning should no longer occur.
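For illustration, an index template along these lines would pin the type of the conflicting field before any document is indexed (the template name, index pattern, and the choice of `text` here are assumptions; adjust them to your data):

    PUT _index_template/flask-model-template
    {
      "index_patterns": ["index-*"],
      "template": {
        "mappings": {
          "properties": {
            "log.message": {
              "properties": {
                "input": {
                  "properties": {
                    "values": { "type": "text" }
                  }
                }
              }
            }
          }
        }
      }
    }

With an explicit mapping in place, a document whose `values` happens to look numeric is still indexed as `text`, so the `merged_type` conflict cannot arise.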

I don't really understand the problem. It is JSON, and of course some of the values are doubles and some are strings. How do I define all the values from the JSON in Elasticsearch?

@Jenni Sorry! You are great! I understand the problem now. There is some predefined field "input.values", and that's why ES was throwing this error: I had the same field in my JSON. I have now fixed the problem this way:

    mutate {
            add_field => {
                    "execution.time" => "%{[log.message2][time]}"
            }
            remove_field => ["log.message2"]
    }
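For the `%{[log.message2][time]}` reference to resolve, the earlier `json` filter presumably had to write into `log.message2` as well; a sketch of the full fixed section under that assumption (only the `mutate` block above is from the original post):

    json {
            source => "log.message"
            target => "log.message2"
    }
    mutate {
            add_field => {
                    "execution.time" => "%{[log.message2][time]}"
            }
            remove_field => ["log.message2"]
    }

Parsing into a differently named field and then removing it keeps the raw JSON out of the index entirely, so the conflicting `input.values` subfield is never mapped.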


Thanks
Nik

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.