Encountered mapper_parsing_exception when working through the Logstash tutorial

I recently downloaded Logstash 6.6.2, and I am currently working through the official Logstash tutorial.

I created a first-pipeline.conf file inside my Logstash directory:

input {
    beats {
        port => "5044"
    }
}
filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}"}
    }
    geoip {
        source => "clientip"
    }
}
output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}

When I run the command logstash -f first-pipeline.conf --config.reload.automatic, I see the following output:

Sending Logstash logs to C:/Users/M/Downloads/logstash-6.6.2/logs which is now configured via log4j2.properties
[2019-03-25T11:54:53,447][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-03-25T11:54:53,463][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.6.2"}
[2019-03-25T11:55:02,311][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-03-25T11:55:02,858][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-03-25T11:55:03,295][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-03-25T11:55:03,389][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-03-25T11:55:03,389][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-03-25T11:55:03,452][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-03-25T11:55:03,467][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-03-25T11:55:03,514][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-03-25T11:55:03,920][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"C:/Users/Miao/Downloads/logstash-6.6.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2019-03-25T11:55:04,155][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-03-25T11:55:04,202][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1fabe54 run>"}
[2019-03-25T11:55:04,302][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-03-25T11:55:04,302][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2019-03-25T11:55:04,677][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-03-25T11:55:42,745][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.03.25", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x136ad0c>], :response=>{"index"=>{"_index"=>"logstash-2019.03.25", "_type"=>"doc", "_id"=>"GiW0tGkBs4yBBLKzzHw1", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:835"}}}}}
[2019-03-25T11:55:42,753][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.03.25", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x12c9342>], :response=>{"index"=>{"_index"=>"logstash-2019.03.25", "_type"=>"doc", "_id"=>"GyW0tGkBs4yBBLKzzHw1", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:821"}}}}}
[2019-03-25T11:55:42,783][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.03.25", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x159d7c>], :response=>{"index"=>{"_index"=>"logstash-2019.03.25", "_type"=>"doc", "_id"=>"HCW0tGkBs4yBBLKzzHw1", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:819"}}}}}
[2019-03-25T11:55:42,785][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.03.25", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x450cf3>], :response=>{"index"=>{"_index"=>"logstash-2019.03.25", "_type"=>"doc", "_id"=>"HSW0tGkBs4yBBLKzzHw1", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:819"}}}}}
...

(For the purposes of the tutorial, ignoring the pipelines.yml file is acceptable, or so the tutorial itself claims.)

What should I do to fix the mapper_parsing_exception?

Elasticsearch is having a problem with the field [host]. If you have any beats sending data to your Elasticsearch instance, they will be creating a [host] object that contains fields like [host][name]. Once a field has been mapped as one type, trying to index a document where it has a different type produces this error. In your log the index has [host] mapped as text, and the incoming events carry [host] as an object, which is what "Can't get text on a START_OBJECT" means.
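A common workaround (a sketch of what the reply above implies, not an official fix from the tutorial) is to flatten the beat's [host] object into a plain string before the event reaches Elasticsearch, by extending the filter block of first-pipeline.conf. The target field name hostname below is purely illustrative; any name that does not collide with the existing mapping will do:

filter {
    mutate {
        # Beats 6.3+ send "host" as an object containing fields such as
        # [host][name]; copy the name out to a simple string field, then
        # drop the conflicting object. The mutate filter applies rename
        # before remove_field, so this ordering is safe.
        rename => { "[host][name]" => "hostname" }
        remove_field => [ "host" ]
    }
}

Since the template shown in the log above maps new string fields as text with a keyword sub-field, hostname is indexed without any mapping conflict. Alternatively, if the index contains nothing worth keeping, deleting the existing logstash-* indices lets Elasticsearch recreate the mapping from scratch.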

Hello @Badger, thanks for your reply. My sample data were downloaded directly from the tutorial page. Does this mean that the data provided on the tutorial page were not properly formatted?
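For what it is worth, one way to check what the events actually look like before they reach Elasticsearch (a debugging sketch, not a step from the tutorial) is to temporarily add a stdout output next to the elasticsearch output:

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
    # Temporary debugging output: prints each event to the console, which
    # shows whether [host] arrives as a plain string or as an object.
    stdout {
        codec => rubydebug
    }
}

If [host] prints as an object with a name key, the sample data themselves are fine; the conflict is between the host metadata the beat adds and the text mapping already stored in the logstash-2019.03.25 index.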
