I feel like this should be easy, and I'm not really sure why this isn't working.
I have the following data:
host: MYPC
endpoint_type: computer
To follow ECS, I'm attempting to use a mutate filter to rename/nest the fields to
host.name
host.type
However, that appears to fail with the following error:
[2020-04-13T16:50:52,406][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"dev-2020.04.13", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x36df53bc>], :response=>{"index"=>{"_index"=>"dev-2020.04.13", "_type"=>"_doc", "_id"=>"LRRPdXEBiQpvxsQYIBFT", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id 'xxxxxxxxxxxxx'. Preview of field's value: '{type=computer}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:844"}}}}}
I even tried dropping the host field first and then renaming (since there were prior contents in there), but that still didn't work.
The filter was the following. I've also tried it without the [host] fields enclosed in double quotes, since quoting them had caused an issue for me in an if statement.
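A sketch based on the field names above (the exact filter may have differed slightly, but it was a mutate rename of this shape):

mutate {
  # Move the flat fields under the ECS host object
  rename => {
    "host"          => "[host][name]"
    "endpoint_type" => "[host][type]"
  }
}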
Not sure if it's a solution as much as a workaround, but I got it working.
I was using the file input to bring in AV logs. The file input was setting "host" to the hostname of the Logstash host itself, which was then preventing any renames of the host field.
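For context, the input was just a plain file input, something like this (the path here is illustrative, not the real one):

input {
  file {
    # The file input adds a "path" field with the source file,
    # and Logstash adds a "host" field with the local hostname
    path => "/var/log/av/*.log"
  }
}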
In its own block, before any parsing takes place, I had to remove the host and path fields set by the Logstash file input plugin:
mutate {
  remove_field => [ "host", "path" ]
}
Then, in the blocks that followed, the rename worked.
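It was essentially the same rename as sketched above, which now succeeds because the string host set by the file input is gone by the time the AV log's own fields are renamed (assuming the parsing recreates host and endpoint_type; the exact parsing blocks are omitted here):

mutate {
  rename => {
    "host"          => "[host][name]"
    "endpoint_type" => "[host][type]"
  }
}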