Problem: [logstash.outputs.elasticsearch] Could not index event to Elasticsearch-wazuh-alerts-3.x-2020.05.30

Hi Team,

We are running into a problem where we are not seeing any alerts in Kibana. We are using this setup for the first time.

We have two servers:

  1. Wazuh manager, Wazuh API, and Filebeat
  2. Elasticsearch, Kibana, and Logstash

The versions of each component:

Product         Version
Elasticsearch   7.6.2
Kibana          7.6.2
Filebeat        7.6.2
Logstash        7.7
Wazuh           3.12.3
Wazuh API       3.12.3
Wazuh app       3.12.3


We are not seeing any alerts in Kibana. In the Discover view we see the filebeat, wazuh-alerts, and wazuh-monitoring indices; we see data from wazuh-monitoring, but not from any of the others.

When we run the command sudo cat /var/log/logstash/logstash-plain.log | grep --color=auto -i -E "error|warn", we get these errors:

[2020-05-30T10:06:46,927][WARN ][logstash.outputs.elasticsearch][main][837b9fdd489459d1c08d9ff9e6132b48a6da16e189d6f60e198f90e2413edc11] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"wazuh-alerts-3.x-2020.05.30", :routing=>nil, :_type=>"wazuh"}, #LogStash::Event:0x6955dbc7], :response=>{"index"=>{"_index"=>"wazuh-alerts-3.x-2020.05.30", "_type"=>"wazuh", "_id"=>"wbPoZXIBPjTouCeTDVEP", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [host] of different type, current_type [keyword], merged_type [ObjectMapper]"}}}}

Please help us to solve this problem as we have gone through many articles in the forums and none have solved our issue.

We are attaching the below logs for your review:

  • Filebeat.YML
  • Logstash.YML
  • Logstash config
  • Wazuh config
  • Logstash output error
  • Index output from Wazuh Server
  • Index output from Elastic server

We verified that the following are working fine:

Service         Verification command                                          Status
Filebeat        lsof /var/ossec/logs/alerts/alerts.json                       ossec-analysisd and filebeat are both reading the alerts file
Elasticsearch   curl <ELASTICSEARCH_IP>:9200/_cat/indices/wazuh-alerts-3.x-*  Elasticsearch is receiving the alert indices
Wazuh API       systemctl status wazuh-api                                    Up and running
Filebeat        systemctl status filebeat                                     Up and running
Logstash        systemctl status logstash                                     Up and running
[Screenshots attached: ossec.log error output, Filebeat test output from the Elasticsearch server, Elastic stack index files]

Also, "host" is not in the mutate remove_field list; the "host" field has already been removed.


Please avoid using screenshots when pasting configs; use the </> button instead.

You tried to index [host] as an object, while your index mapping has it as a keyword (a string). Either change your template or change the data you're trying to index so the two agree.
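If you go the template route, the relevant fragment of the index template would declare [host] as an object. This is only a sketch: the real wazuh-alerts template defines many more fields, and the change only takes effect on a new (e.g. next day's) index or after a reindex.

```json
{
  "mappings": {
    "properties": {
      "host": {
        "properties": {
          "name": { "type": "keyword" }
        }
      }
    }
  }
}
```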

Thank you very much for the prompt reply, and apologies for attaching the screenshots.

I am new to Elasticsearch and Logstash:
Can you help me with where I need to make this change? I am assuming it is in the Logstash output.
Can you suggest settings that will work? I am fine with either changing the index type or changing the template.
I will go with your suggestion. Please let me know if you need any logs from my side to verify. Thank you.

I was trying to refer to the document below, but I am getting confused:

https://www.elastic.co/guide/en/elasticsearch/reference/current/keyword.html#keyword-params

Thank you very much in advance.

If you don't need the host field, you can just remove it with a mutate filter. Logstash automatically adds a host field whose value defaults to the hostname of the system where Logstash runs.

If you want to keep it, you can rename the field to conform to the object data type. Something like

mutate {
  rename => { "host" => "[host][name]" }
}

should work.

Hi Tamba,
I have tried the steps you mentioned and it is still not working. I added the "mutate rename" you suggested, and after searching a few forums I also added `if "MSEC" in [message]`. The config files are attached.

Please help us out with the error message below and let us know the next steps.

Also, could you provide a working filebeat.yml, a Logstash config file, and an elasticsearch.yml (the latter in case we roll back to shipping directly to Elasticsearch and drop Logstash)? Please also let us know of any other config files that exist by default. If nothing works, we will start afresh and use those files in our environment.

Here is the error message and also the CONFIG files:

Output of sudo cat /var/log/logstash/logstash-plain.log | grep --color=auto -i -E "error|warn":


[2020-05-31T15:18:56,607][WARN ][logstash.outputs.elasticsearch][main][f265b93a0021d1fdadd11ff9db9a91e83eaae24e6481942bc3950917832feda1] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"wazuh-alerts-3.x-2020.05.31", :routing=>nil, :_type=>"wazuh"}, #LogStash::Event:0x220143c6], :response=>{"index"=>{"_index"=>"wazuh-alerts-3.x-2020.05.31", "_type"=>"wazuh", "_id"=>"3PYsbHIBtQIrr09cN_BR", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [host] of different type, current_type [keyword], merged_type [ObjectMapper]"}}}}

(The same warning was logged repeatedly, once per event, with only the timestamps and document IDs differing; further occurrences are omitted.)

Logstash conf file:

input {
  beats {
    port => 5000
    ssl => true
    ssl_certificate => ""
    ssl_key => ""
  }
}

filter {
  json {
    source => "message"
  }
}

filter {
  if [data][srcip] {
    mutate {
      add_field => [ "@src_ip", "%{[data][srcip]}" ]
    }
  }
  if [data][aws][sourceIPAddress] {
    mutate {
      add_field => [ "@src_ip", "%{[data][aws][sourceIPAddress]}" ]
    }
  }
  if "MSEC" in [message] {
    mutate {
      gsub => [ "message", ',"MSEC":.{3},', "," ]
    }
  }
}

filter {
  geoip {
    source => "@src_ip"
    target => "GeoLocation"
    fields => ["city_name", "country_name", "region_name", "location"]
  }
  date {
    match => ["timestamp", "ISO8601"]
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "timestamp", "beat", "input_type", "tags", "count", "@version", "log", "offset", "type", "@src_ip" ]
  }
  mutate {
    rename => { "host" => "[host][name]" }
  }
}

output {
  elasticsearch {
    hosts => ["10.7.110.195:9200"]
    index => "wazuh-alerts-3.x-%{+YYYY.MM.dd}"
    document_type => "wazuh"
  }
}

Filebeat yml:

# Wazuh - Filebeat configuration file
filebeat:
  inputs:
    - type: log
      paths:
        - "/var/ossec/logs/alerts/alerts.json"

#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
  # The Logstash hosts
  hosts: ["10.7.110.195:5000"]
  ssl:
    certificate_authorities: [""]

Filebeat adds a [host] object to events, and that object contains a field called [host][name] which holds the name of the host. Some other inputs add a [host] field that is simply the name of a host.

In elasticsearch a field cannot be an object on some documents and a string on others. You have to pick one or the other. If you pick string and try to add a document where [host] is an object then you get the exact error message that you are seeing.

{"type"=>"illegal_argument_exception", "reason"=>"mapper [host] of different type, current_type [keyword], merged_type [ObjectMapper]"}

In the index the field type is "keyword" (i.e. string) but you are trying to insert a document where it would be an object.

You need to decide whether you want to have [host] be an object or a string. If you want it to be an object and your input produces a string (e.g. a syslog input) then

if ! [host][name] { mutate { rename => { "[host]" => "[host][name]" } } }

may be a solution. If you want it to be a string and your input produces an object (e.g. a beats input) then

mutate { replace => { "[host]" => "%{[host][name]}" } }

might be a solution. Note that this is replace, not rename: the right-hand side is a sprintf reference that reads [host][name], and its value overwrites [host].

If your input produces an object and you want it to be an object then just rolling over to a new index might be a solution.
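Putting the string-to-object case into a complete filter block, here is a hedged sketch, assuming you standardize on the object form that beats inputs produce:

```
filter {
  # When [host] arrives as a plain string it has no [name] sub-field,
  # so wrap it; events that already carry an object pass through untouched.
  if [host] and ![host][name] {
    mutate { rename => { "[host]" => "[host][name]" } }
  }
}
```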

Hello Wali,

Thank you very much for the information; you have saved my day :). I removed Logstash and installed only Filebeat with Elasticsearch, using the config provided on GitHub, and the alerts have resumed. We are seeing the vulnerability data, system inventory data, and events summary. I have seen discussions where many push back against using Logstash with the latest versions, since Elasticsearch can do the work. Thank you very much.
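For reference, a minimal sketch of the Logstash-free setup described above (the Elasticsearch host/port are placeholders taken from this thread; the Wazuh documentation ships a complete, maintained filebeat.yml with the matching index template):

```yaml
# Sketch: Filebeat reading Wazuh alerts and shipping them straight
# to Elasticsearch, with no Logstash in between.
filebeat:
  inputs:
    - type: log
      paths:
        - "/var/ossec/logs/alerts/alerts.json"

# Single active output: Elasticsearch directly.
output.elasticsearch:
  hosts: ["10.7.110.195:9200"]
```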

Can you please assist with one last thing on this thread?

I have been trying hard to get information on the Wazuh SIEM option. We don't see any information coming through under any of the tabs; I have attached the screenshots below. Please let us know if we need to enable anything else to get some data flowing into this. Awaiting your reply. Thanks again.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.