Logstash to Elasticsearch: Could not index event to Elasticsearch, "reason"=>"mapper [] of different type, current_type [long], merged_type [text]

The error I see in the Logstash logs:

    Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"arrayx1", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x55602f10>], :response=>{"index"=>{"_index"=>"norcalx1", "_type"=>"doc", "_id"=>"cclL6GgBpI0vWuktYEZR", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [volumes.lun-mapping-list] of different type, current_type [long], merged_type [text]

I am trying to pull data from a REST API using http_poller, and so far I have had no success. The fields it is having trouble with are not really useful information, so I tried to remove them with remove_field and also tried mutate; neither seemed to have any effect (a rough sketch of the mutate attempt is below). I must not be doing something right; I'm fairly new to ELK and Logstash. I did some research and noticed Logstash and Elasticsearch don't like nested arrays, so I'm sure I'm missing something easy. Thank you for your help.
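For reference, this is roughly the mutate variant I tried. I no longer have the exact snippet, so treat this as a sketch; the field names are the same ones as in the config below:

    filter {
        mutate {
            # Tried to drop the problem fields here; this seemed to have no effect.
            remove_field => [ "volumes.lun-mapping-list", "volumes.vol-id" ]
        }
    }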

Here is my config file:

    input {
      http_poller {
        urls => {
          norcalxio => {
            # Supports all options supported by ruby's Manticore HTTP client
            method => get
            user => "user"
            password => "password"
            url => "https://X.X.X.X/api/json/v3/types/volumes?full=1"
            headers => {
              Accept => "application/json"
            }
          }
        }
        request_timeout => 60
        # Supports "cron", "every", "at" and "in" schedules by rufus scheduler
        schedule => { cron => "* * * * * UTC" }
        codec => "json"
        # A hash of request metadata info (timing, response headers, etc.) will be sent here
        metadata_target => "http_poller_metadata"
        cacert => "/etc/logstash/website.cer"
      }
    }

    filter {
      json {
        remove_field => [ "volumes.lun-mapping-list", "volumes.vol-id", "volumes.xms-id", "volumes.snapgrp-id", "volumes.sys-id" ]
        source => "message"
      }
    }

    output {
      elasticsearch {
        hosts => ["http://X.X.X.X:9200"]
        action => "index"
        index => "arrayx1"
      }
      stdout { codec => rubydebug }
    }

Here is a sample of the data Logstash is pulling in:

    {
        "params": {
            "id-property": "vol-id"
        },
        "volumes": [
            {
                "small-io-alerts": "disabled",
                "last-refreshed-from-obj-name": null,
                "obj-severity": "information",
                "rd-bw": "11",
                "iops": "43",
                "replication-wr-bw-kbps": 0,
                "qos-effective-bw": null,
                "lb-size": 512,
                "qos-exceeded-iops": "0",
                "unaligned-rd-iops": "0",
                "vaai-tp-alerts": "enabled",
                "unaligned-io-alerts": "disabled",
                "qos-exceeded-bw": "0",
                "unique-physical-space": "0",
                "tag-list": [],
                "unaligned-io-ratio": "54",
                "lun-mapping-list": [
                    [
                        [
                            "2c571a17371c48c5a96f1d3d8fe7c952",
                            "dell_r640_esxi_cluster",
                            2
                        ],
                        [
                            "3635e0f1a92e4868a4df1cac9f6630ae",
                            "Default",
                            1
                        ],
                        1
                    ]
                ],
                "wr-iops": "39"
            }
        ]
    }
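One thing I noticed in the sample: the inner lun-mapping-list arrays mix strings (e.g. "dell_r640_esxi_cluster") with integers (2, 1). Since Elasticsearch flattens arrays when it maps fields, the same field path ends up holding both long and text values, which I assume is exactly the conflict the error above is complaining about.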

I posted what was wrong with that filter in the other thread. Please do not open multiple threads for the same issue.

NP Badger, the problem was resolved thanks to you in the other thread! Here is a link for anyone in the future who might run into this issue, plus a rough sketch below of the kind of filter that fixes it.
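The sketch assumes the goal is to strip the offending fields out of every element of the volumes array before indexing. It is illustrative only, not necessarily the exact fix from the linked thread; remove_field cannot do this because it cannot address fields inside each element of an array:

    filter {
      ruby {
        # Walk the volumes array and delete the fields that trip the mapper.
        code => '
          vols = event.get("volumes")
          if vols.is_a?(Array)
            vols.each do |v|
              %w[lun-mapping-list vol-id xms-id snapgrp-id sys-id].each { |f| v.delete(f) }
            end
            event.set("volumes", vols)
          end
        '
      }
    }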
