Logstash, syslog, ECS and Kibana Logs / SIEM

Hi, I have Logstash working as a central syslog server using the syslog_pri plugin, sending the events to Elasticsearch. I would like to ingest those syslog events and adhere to ECS so it is possible to use Kibana Logs or SIEM more effectively. The problem is that the field "host" is used for the syslog events, while "host.name" is used for elastic-agent events.

It seems like this problem might already be solved and I'm missing the obvious solution.
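One avenue that may be the "obvious solution": recent Logstash releases expose an `ecs_compatibility` setting on many plugins, including the syslog input, which makes the plugin emit ECS-shaped field names (under `[host]` and `[log][syslog]`) instead of the legacy flat ones. A minimal sketch, with the port being an assumption:

```
input {
  syslog {
    port => 5514                 # assumed listening port
    ecs_compatibility => "v1"    # emit ECS field names instead of legacy flat fields
  }
}
```

There is also a pipeline-wide `pipeline.ecs_compatibility` setting in logstash.yml that flips the default for all plugins at once, which may be less error-prone than setting it per plugin.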

I have tried mutate { rename => { "host" => "host.name" } } but that throws this error:

"Could not dynamically add mapping for field [host.name]. Existing mapping for [host] must be of type object but found [text]"



Next I tried the nested field-reference syntax:

if [host] and ! [host][name] {
    mutate { rename => { "host" => "[host][name]" } }
}

Now the rename works, but indexing fails with a 400 error from Elasticsearch:

"failed to parse field [host] of type [text] in document with id '...'. Preview of field's value: '{name=example.com}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"="Can't get text on a START_OBJECT at 1:45"}}}}}

example.com is the host sending the log to logstash.
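That 400 suggests the syslog index's existing mapping already has host as a concrete text field (from documents indexed before the rename), so an object value like {name=example.com} can no longer fit. One way to confirm is to inspect the field mapping from Kibana Dev Tools; the syslog-* index pattern here is an assumption:

```
GET syslog-*/_mapping/field/host
```

If the response shows "type": "text" for host rather than an object with properties, documents carrying [host][name] will keep being rejected until the index is recreated with an object mapping.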

It seems like the syslog- index created is totally different from the .ds-logs-system.syslog-default- index created by elastic-agent.

It looks like your index has [host] mapped as text, and not an object with a name field in it. Can you start over with a new index?
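One way to "start over" cleanly is to stop writing to (or delete) the old index and put an index template that maps host as an object before the first new document arrives. A minimal sketch; the template name and the syslog-* pattern are assumptions, and using the full ECS mappings as a component template would be the more complete option:

```
PUT _index_template/syslog-ecs
{
  "index_patterns": ["syslog-*"],
  "template": {
    "mappings": {
      "properties": {
        "host": {
          "properties": {
            "name": { "type": "keyword" }
          }
        }
      }
    }
  }
}
```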

I moved rsyslog and elastic-agent to the same host, with the rsyslog server listening for incoming syslog. Logstash as a syslog server was removed from the equation entirely, and all is good.
