Logstash and Elasticsearch mismatch

I'm trying to ship logs with Filebeat from one server to another server that hosts Logstash and Elasticsearch. Everything is the latest and greatest (7.8.0). The problem is, I'm getting an error from Logstash.

This is the error I get from Logstash:

[2020-07-17T20:17:43,845][WARN ][logstash.outputs.elasticsearch][main][8ce40c8c6d7b7e92195bf01fa9d2c86d4bb1a87e7565d54444d45d82ebbd311f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x2594ed33>], :response=>{"index"=>{"_index"=>"logstash-2020.07.14-000001", "_type"=>"_doc", "_id"=>"0yZsXnMBpYpmFLee7cmh", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id '0yZsXnMBpYpmFLee7cmh'. Preview of field's value: '{hostname=server150, os={kernel=3.10.0-1062.18.1.el7.x86_64, codename=Core, name=CentOS Linux, family=redhat, version=7 (Core), platform=centos}, containerized=false, ip=[*censoring public ip*], name=server150, id=3eec437c66d444a59ef5f075a429441d, mac=[*cencored*], architecture=x86_64}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:111"}}}}}

I've followed suggestions from other posts by people having this issue, and this is what my Logstash config file looks like (I tried to fix the problem with the mutate block):

input {
  file {
    path => "/var/log/commands.log"
  }
  beats {
    port => 5044
  }
}
filter {
  mutate {
    rename => ["host", "server"]
    convert => {"server" => "string"}
  }
  if [path] == "/var/log/commands.log" {
    grok {
      match => { "message" => "\[(%{TIMESTAMP_ISO8601:sys_timestamp})\]\s(?<field1>[0-9a-zA-Z_-]+)\s(?<field2>[0-9a-zA-Z_-]+)\:USER=(?<field3>[0-9a-zA-Z_-]+)\sPWD=(?<field4>[0-9a-zA-Z_/-]+)\sPID=\[(?<field5>[0-9]+)\]\sCMD=\"(?<field6>.*)\"\sExit=\[(?<field7>[0-9]+)\]\sCONNECTION=(?<field8>.*)" }
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "filteredindex"
  }
}

But I still get the same error. I think it's just a mismatch in data types, but I can't get it to work. Does anyone know what's missing? Huge thanks ahead!

I suggest you read this post and then this post.

The beats input adds a [host] object to the event, while the file input adds a [host] string to the event. A field cannot be an object on some documents and a string on others. If you decide you want [host] to be an object, you can do a mutate that is conditional upon [host] being a string. If you decide you want [host] to be a string, then make the mutate conditional upon it being an object. Once you start over with an empty index, you should be OK.
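For the string option, a minimal sketch of what that conditional mutate could look like (untested; it assumes, as the error above suggests, that the beats [host] object carries a [host][name] subfield):

```
filter {
  # Runs only when [host] is the object added by the beats input;
  # for file-input events [host][name] does not exist, so this is skipped
  if [host][name] {
    mutate {
      # The sprintf reference is evaluated first, so this collapses
      # the object to just the hostname string, e.g. "server150"
      replace => { "host" => "%{[host][name]}" }
    }
  }
}
```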

Thanks for the response!

I just want it to be a string. Should it be:

if ![host][name] {
  mutate {
    rename => { "[host]" => "[host][name]" }
    convert => {"[host][name]" => "string"}
  }
}

That should work.

I still get the same issue :(

[2020-07-19T07:57:16,280][WARN ][logstash.outputs.elasticsearch][main][ceb2c9b33104bb3ec31e20cf77c2d606d772a42046d77552fcc187a7d376c102] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filteredindex", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x518d4f63>], :response=>{"index"=>{"_index"=>"filteredindex", "_type"=>"_doc", "_id"=>"MTUTZnMBpYpmFLeevDJZ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id 'MTUTZnMBpYpmFLeevDJZ'. Preview of field's value: '{hostname=server150, os={kernel=3.10.0-1062.18.1.el7.x86_64, codename=Core, name=CentOS Linux, family=redhat, version=7 (Core), platform=centos}, containerized=false, ip=[*public-ip*], name=server150, id=3eec437c66d444a59ef5f075a429441d, mac=[*mac*], architecture=x86_64}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:218"}}}}}

This is the config file I used:

input {
  # file {
  #   path => "/var/log/commands.log"
  # }
  beats {
    port => 5044
  }
}
filter {
  if ![host][name] {
    mutate {
      rename => { "[host]" => "[host][name]" }
      convert => {"[host][name]" => "string"}
    }
  }
  if [path] == "/var/log/commands.log" {
    grok {
      match => { "message" => "\[(%{TIMESTAMP_ISO8601:sys_timestamp})\]\s(?<field1>[0-9a-zA-Z_-]+)\s(?<field2>[0-9a-zA-Z_-]+)\:USER=(?<fi$
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "filteredindex"
  }
}

The index expects [host] to be a string, but you are inserting documents where it is an object:

Preview of field's value: '{hostname=server150, os={kernel=3.10.0-1062.18.1.el7.x86_64, codename=Core, name=CentOS Linux, family=redhat, version=7 (Core), platform=centos}, containerized=false, ip=[public-ip], name=server150, id=3eec437c66d444a59ef5f075a429441d, mac=[mac], architecture=x86_64}'
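This ties back to the earlier advice to start over with an empty index: the mapping on filteredindex was fixed as text when the first string-valued [host] document arrived, and an existing field mapping cannot be changed in place. Once the filter emits the shape you want, deleting the index lets Elasticsearch recreate it with a matching mapping. A sketch using the Kibana Dev Tools console (note that deleting the index permanently discards its documents):

```
# Inspect the current mapping to confirm how [host] is mapped
GET /filteredindex/_mapping

# Delete the index so it is recreated on the next indexed event
DELETE /filteredindex
```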
