I have Filebeat installed on a Syncplify server. It sends a couple of log events to Elasticsearch and then nothing else, even though the service reports it is still running. If I restart the service I get a couple more events, then nothing again until the next restart. Here is what the debug log shows each time it stalls:
2021-09-10T09:12:18.516-0400 DEBUG [elasticsearch] elasticsearch/client.go:411 Bulk item insert failed (i=47, status=500): {"type":"find_match","reason":"Unable to find match for dissect pattern: %{}.%{file_extension} against source: 2"}
2021-09-10T09:12:18.516-0400 DEBUG [elasticsearch] elasticsearch/client.go:411 Bulk item insert failed (i=48, status=500): {"type":"find_match","reason":"Unable to find match for dissect pattern: %{}.%{file_extension} against source: -"}
2021-09-10T09:12:18.729-0400 ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: temporary bulk send failure
2021-09-10T09:12:18.729-0400 INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(elasticsearch(https://XXX.XXX.XXX.XXX:9200))
2021-09-10T09:12:18.729-0400 DEBUG [esclientleg] eslegclient/connection.go:249 ES Ping(url=https://XXX.XXX.XXX.XXX:9200)
2021-09-10T09:12:18.729-0400 INFO [publisher] pipeline/retry.go:213 retryer: send wait signal to consumer
2021-09-10T09:12:18.729-0400 INFO [publisher] pipeline/retry.go:217 done
2021-09-10T09:12:18.730-0400 DEBUG [esclientleg] eslegclient/connection.go:272 Ping status code: 200
2021-09-10T09:12:18.730-0400 INFO [esclientleg] eslegclient/connection.go:273 Attempting to connect to Elasticsearch version 7.13.2
2021-09-10T09:12:18.730-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_license?human=false
2021-09-10T09:12:18.754-0400 INFO [index-management] idxmgmt/std.go:261 Auto ILM enable success.
2021-09-10T09:12:18.754-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_ilm/policy/filebeat
2021-09-10T09:12:18.756-0400 INFO [index-management.ilm] ilm/std.go:160 ILM policy filebeat exists already.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:401 Set setup.template.name to '{filebeat-7.14.0 {now/d}-000001}' as ILM is enabled.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:406 Set setup.template.pattern to 'filebeat-7.14.0-*' as ILM is enabled.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:440 Set settings.index.lifecycle.rollover_alias in template to {filebeat-7.14.0 {now/d}-000001} as ILM is enabled.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:444 Set settings.index.lifecycle.name in template to {filebeat {"policy":{"phases":{"hot":{"actions":{"rollover":{"max_age":"30d","max_size":"50gb"}}}}}}} as ILM is enabled.
2021-09-10T09:12:18.756-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_cat/templates/filebeat-7.14.0
2021-09-10T09:12:18.883-0400 INFO template/load.go:111 Template "filebeat-7.14.0" already exists and will not be overwritten.
2021-09-10T09:12:18.883-0400 INFO [index-management] idxmgmt/std.go:297 Loaded index template.
2021-09-10T09:12:18.883-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_alias/filebeat-7.14.0
2021-09-10T09:12:18.891-0400 INFO [index-management.ilm] ilm/std.go:121 Index Alias filebeat-7.14.0 exists already.
2021-09-10T09:12:18.891-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/
2021-09-10T09:12:18.891-0400 INFO [publisher_pipeline_output] pipeline/output.go:151 Connection to backoff(elasticsearch(https://XXX.XXX.XXX.XXX:9200)) established
2021-09-10T09:12:18.891-0400 INFO [publisher] pipeline/retry.go:213 retryer: send wait signal to consumer
2021-09-10T09:12:18.891-0400 INFO [publisher] pipeline/retry.go:217 done
2021-09-10T09:12:18.903-0400 DEBUG [elasticsearch] elasticsearch/client.go:227 PublishEvents: 50 events have been published to elasticsearch in 12.0023ms.
2021-09-10T09:12:18.903-0400 DEBUG [elasticsearch] elasticsearch/client.go:411 Bulk item insert failed (i=0, status=500): {"type":"find_match","reason":"Unable to find match for
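The repeated failure in that output is the dissect error at the top: the pipeline's dissect pattern %{}.%{file_extension} expects a value containing a dot-separated file extension, but some events apparently carry values like 2 or -. I can reproduce the same error in isolation with a _simulate request like the one below (the field name and sample values are only my guesses, since I have not located the actual pipeline definition yet):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "dissect": {
          "field": "message",
          "pattern": "%{}.%{file_extension}"
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "upload.zip" } },
    { "_source": { "message": "2" } }
  ]
}

The first doc parses fine and the second fails with the same "Unable to find match for dissect pattern" reason as in the Filebeat log. My read of the debug output is that those per-item 500s become the "temporary bulk send failure", so Filebeat keeps retrying the same batch, which would explain why a few events arrive and then everything stops until a restart.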
As far as I know, no modules are enabled. Logging from this server was working until it recently stopped. The person who configured it is no longer here, and I was tasked with figuring it out.
I believe I am supposed to be using an ingest pipeline. If I remove the pipeline line from the config, the logs start showing up in Elasticsearch but none of the fields are populated. If I add the line back, the fields I want are populated, but only for a moment before events stop arriving again.
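For reference, the line I am talking about is the pipeline setting under output.elasticsearch in filebeat.yml. It looks roughly like this (the pipeline name below is a placeholder, I don't have the exact value in front of me):

output.elasticsearch:
  hosts: ["https://XXX.XXX.XXX.XXX:9200"]
  # This is the line in question. The name is a placeholder, not our real pipeline name.
  # Removing it makes events show up unparsed; putting it back populates fields briefly, then events stop.
  pipeline: "syncplify-files"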