> systemctl status -l logstash
> Jun 27 15:29:10 sd-131865 logstash[23755]: [2019-06-27T15:29:10,956][INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 403 ({"type"=>"security_exception", "reason"=>"action [indices:admin/create] is unauthorized for user [logstash_internal]"})
> Jun 27 15:29:10 sd-131865 logstash[23755]: [2019-06-27T15:29:10,958][INFO ][logstash.outputs.elasticsearch] Retrying individual bulk actions that failed or were rejected by the previous bulk request. {:count=>125}
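The 403 in that log means the logstash_internal user's role is not allowed to create indices, so the bulk requests are rejected as soon as they target an index that doesn't exist yet. A minimal sketch of how the role could be extended (the role name, index pattern, and the 7.x `_security` endpoint are assumptions; older versions use `_xpack/security`, and the request replaces the whole role, so keep any privileges you already had):

```
POST /_security/role/logstash_writer
{
  "cluster": ["monitor"],
  "indices": [
    {
      "names": ["logstash-*"],
      "privileges": ["create_index", "write"]
    }
  ]
}
```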
You should also try adding a stdout output to confirm that Logstash is actually producing events.
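For example, a stdout output with the rubydebug codec can sit next to the existing elasticsearch output (a debugging sketch; remove it again once things work):

```
output {
  # Print every event that reaches the output stage, for debugging only
  stdout { codec => rubydebug }

  # keep your existing elasticsearch output here
}
```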
Some thoughts: why use Logstash at all in this case? The Elasticsearch ingest node feature should be enough to do the filtering you want.
So connect Filebeat to Elasticsearch directly and define the ingest pipeline to use, as sketched below.
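A rough sketch of that setup, assuming a hypothetical pipeline called my-filter-pipeline and a simple grok processor standing in for whatever your Logstash filter currently does:

```
PUT _ingest/pipeline/my-filter-pipeline
{
  "description": "Replaces the Logstash filter stage",
  "processors": [
    { "grok": { "field": "message", "patterns": ["%{COMMONAPACHELOG}"] } }
  ]
}
```

Then in filebeat.yml, point the Elasticsearch output at that pipeline:

```yaml
output.elasticsearch:
  hosts: ["http://localhost:9200"]
  # credentials of a user allowed to write to the target indices (names are placeholders)
  username: "filebeat_internal"
  password: "${ES_PWD}"
  # run every event through the ingest pipeline defined above
  pipeline: "my-filter-pipeline"
```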