I have a TCP syslog input in Filebeat, and the incoming data is JSON. I want to decode the JSON into top-level keys in Elasticsearch. This has been quite a challenge for me, and I hope someone is able to help.
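For context, a simplified example of an incoming line (the field names here are made up for illustration, not my real data):

{"app": "myapp", "level": "info", "msg": "user logged in"}

The goal is to have app, level, and msg become top-level fields in Elasticsearch instead of sitting inside the message field.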
Since the syslog input does not have an option to decode JSON, unlike the log input, I figured I either had to use Logstash or an ingest node in Elasticsearch. I created a new ingest pipeline in Elasticsearch and used the simulate API to test it, putting the content of my syslog into the "message" field (also the field targeted for decoding). It all works, and the response shows the document correctly.
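For reference, this is roughly the simulate call I used (the document content is a made-up sample):

POST _ingest/pipeline/pipeline_name/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "{\"app\": \"myapp\", \"level\": \"info\", \"msg\": \"user logged in\"}"
      }
    }
  ]
}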
But as soon as I set a pipeline id in my filebeat.yml, either on the input or on the output, I get this error from Filebeat:

ERROR [syslog] syslog/input.go:131 can't parse event as syslog rfc3164
If I remove the pipeline id it works, but all the JSON just ends up in the message field.
This is my filebeat.yml file:
filebeat.inputs:
- type: syslog
  protocol.tcp:
    host: ":9000"
  pipeline: "pipeline_name"

cloud.id: ${ELASTIC_CLOUD_ID}
cloud.auth: ${ELASTICSEARCH_USERNAME}:${ELASTICSEARCH_PASSWORD}
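I also tried setting the pipeline on the Elasticsearch output instead of the input, with the same result (a sketch; the cloud settings stay as above):

output.elasticsearch:
  pipeline: "pipeline_name"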
Here is the configuration for the pipeline:
"pipeline_name": {
"description": "json decode",
"processors": [
{
"json": {
"field": "message",
"add_to_root": true
}
}
]
}
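For completeness, I created the pipeline with a PUT request along these lines (a sketch, using the same processor definition as above):

PUT _ingest/pipeline/pipeline_name
{
  "description": "json decode",
  "processors": [
    {
      "json": {
        "field": "message",
        "add_to_root": true
      }
    }
  ]
}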