Failed to publish events: temporary bulk send failure

I have Filebeat installed on a Syncplify server. It sends a couple of log events to Elasticsearch and then nothing else, even though the service reports it is still running. If I restart the service, I get a couple more events and then nothing until I restart again.

In the debug logs I see this error repeatedly:

```
ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: temporary bulk send failure
```

I have looked through the forum and done Google searches but have been unable to figure out the issue.

Any suggestions on what else to look at?

Can you provide your config and more of your Filebeat log?

Formatting might be off a little; I had to redact some info and comments.

```
filebeat.inputs:

- type: log
  pipeline: "syncplify"
  enabled: true

  paths:
    - C:\ProgramData\Syncplify.me\ServerV5\Logs\default\logs/*.log
  exclude_lines: ['^#Remark',^sophosemailbackup]

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1
  
tags: ["syncplify", "sftp"]

setup.kibana:
  host: "local.com:5601:5601"

output.elasticsearch:
  hosts: ["XXX.XXX.XXX.XXX"]

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~


logging.level: debug
```


```
2021-09-10T09:12:18.516-0400 DEBUG [elasticsearch] elasticsearch/client.go:411 Bulk item insert failed (i=47, status=500): {"type":"find_match","reason":"Unable to find match for dissect pattern: %{}.%{file_extension} against source: 2"}
2021-09-10T09:12:18.516-0400 DEBUG [elasticsearch] elasticsearch/client.go:411 Bulk item insert failed (i=48, status=500): {"type":"find_match","reason":"Unable to find match for dissect pattern: %{}.%{file_extension} against source: -"}
2021-09-10T09:12:18.729-0400 ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: temporary bulk send failure
2021-09-10T09:12:18.729-0400 INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(elasticsearch(https://XXX.XXX.XXX.XXX:9200))
2021-09-10T09:12:18.729-0400 DEBUG [esclientleg] eslegclient/connection.go:249 ES Ping(url=https://XXX.XXX.XXX.XXX:9200)
2021-09-10T09:12:18.729-0400 INFO [publisher] pipeline/retry.go:213 retryer: send wait signal to consumer
2021-09-10T09:12:18.729-0400 INFO [publisher] pipeline/retry.go:217 done
2021-09-10T09:12:18.730-0400 DEBUG [esclientleg] eslegclient/connection.go:272 Ping status code: 200
2021-09-10T09:12:18.730-0400 INFO [esclientleg] eslegclient/connection.go:273 Attempting to connect to Elasticsearch version 7.13.2
2021-09-10T09:12:18.730-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_license?human=false
2021-09-10T09:12:18.754-0400 INFO [index-management] idxmgmt/std.go:261 Auto ILM enable success.
2021-09-10T09:12:18.754-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_ilm/policy/filebeat
2021-09-10T09:12:18.756-0400 INFO [index-management.ilm] ilm/std.go:160 ILM policy filebeat exists already.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:401 Set setup.template.name to '{filebeat-7.14.0 {now/d}-000001}' as ILM is enabled.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:406 Set setup.template.pattern to 'filebeat-7.14.0-*' as ILM is enabled.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:440 Set settings.index.lifecycle.rollover_alias in template to {filebeat-7.14.0 {now/d}-000001} as ILM is enabled.
2021-09-10T09:12:18.756-0400 INFO [index-management] idxmgmt/std.go:444 Set settings.index.lifecycle.name in template to {filebeat {"policy":{"phases":{"hot":{"actions":{"rollover":{"max_age":"30d","max_size":"50gb"}}}}}}} as ILM is enabled.
2021-09-10T09:12:18.756-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_cat/templates/filebeat-7.14.0
2021-09-10T09:12:18.883-0400 INFO template/load.go:111 Template "filebeat-7.14.0" already exists and will not be overwritten.
2021-09-10T09:12:18.883-0400 INFO [index-management] idxmgmt/std.go:297 Loaded index template.
2021-09-10T09:12:18.883-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/_alias/filebeat-7.14.0
2021-09-10T09:12:18.891-0400 INFO [index-management.ilm] ilm/std.go:121 Index Alias filebeat-7.14.0 exists already.
2021-09-10T09:12:18.891-0400 DEBUG [esclientleg] eslegclient/connection.go:328 GET https://XXX.XXX.XXX.XXX:9200/
2021-09-10T09:12:18.891-0400 INFO [publisher_pipeline_output] pipeline/output.go:151 Connection to backoff(elasticsearch(https://XXX.XXX.XXX.XXX:9200)) established
2021-09-10T09:12:18.891-0400 INFO [publisher] pipeline/retry.go:213 retryer: send wait signal to consumer
2021-09-10T09:12:18.891-0400 INFO [publisher] pipeline/retry.go:217 done
2021-09-10T09:12:18.903-0400 DEBUG [elasticsearch] elasticsearch/client.go:227 PublishEvents: 50 events have been published to elasticsearch in 12.0023ms.
2021-09-10T09:12:18.903-0400 DEBUG [elasticsearch] elasticsearch/client.go:411 Bulk item insert failed (i=0, status=500): {"type":"find_match","reason":"Unable to find match for
```

That dissect failure looks to be why. Each event that fails the pipeline's dissect pattern comes back from Elasticsearch as a retryable (status 500) bulk item, so Filebeat keeps retrying the same batch instead of moving on, which matches the stall you're seeing.

What modules do you have enabled?
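You can reproduce this outside Filebeat by simulating the pipeline in Kibana Dev Tools. A minimal sketch, assuming the dissect processor targets a field named `source` (adjust to whatever field your `syncplify` pipeline actually dissects):

```
POST _ingest/pipeline/syncplify/_simulate
{
  "docs": [
    { "_source": { "source": "upload.log" } },
    { "_source": { "source": "2" } }
  ]
}
```

The first doc should parse, and the second should fail with the same `find_match` error, because `%{}.%{file_extension}` requires a literal dot in the value.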

As far as I know, no modules are enabled. The logging was working but recently stopped. The person who had configured it is no longer here, so I was tasked with trying to figure it out.

I believe I am supposed to be using an ingest pipeline. If I remove the `pipeline: "syncplify"` line from the config, the logs start to show up in Elasticsearch but do not populate any fields. If I add the line back, it populates the fields I want, but only for a second before everything stops again.
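That behavior fits the errors above. For reference, here is a minimal sketch of the kind of pipeline that would produce them; the field name and everything except the dissect pattern are assumptions, since only the pattern appears in the log:

```
PUT _ingest/pipeline/syncplify
{
  "description": "Parse Syncplify log lines (hypothetical sketch)",
  "processors": [
    {
      "dissect": {
        "field": "source",
        "pattern": "%{}.%{file_extension}"
      }
    }
  ]
}
```

Any event whose dissected value contains no dot (like `2` or `-`) makes the processor fail, the bulk item returns an error, and Filebeat retries the whole batch. Setting `"ignore_failure": true` on the processor, or excluding those lines in Filebeat, would avoid the stall.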

I figured it out. My regex was incorrect for exclude_lines.
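For anyone who lands here later: one likely culprit, assuming the config above was pasted verbatim, is that the second pattern is unquoted. A well-formed version quotes each pattern (the exact patterns you need may differ):

```
exclude_lines: ['^#Remark', '^sophosemailbackup']
```

Once the problem lines are excluded on the Filebeat side, they never reach the dissect processor, so the failing bulk items and the retry loop disappear.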

Thanks for your help.

