Filebeat error when parsing .txt log files (Failed to publish events: temporary bulk send failure)


(Jose) #1

Hello,
I'm trying to ship some log files that are in .txt format to my Elasticsearch instance in AWS. I'm using Filebeat to send the output to my ES cluster. However, when I start Filebeat on the Windows host, I get the error below in the Filebeat logs.

I've included below how my filebeat.yml file is configured for this connection. Is there something I'm missing that is causing this error?

    2018-09-10T14:44:02.158-0500  INFO   elasticsearch/client.go:690  Connected to Elasticsearch version 6.2.3
    2018-09-10T14:44:02.205-0500  INFO   template/load.go:73          Template already exists and will not be overwritten.
    2018-09-10T14:44:03.347-0500  ERROR  pipeline/output.go:92        Failed to publish events: temporary bulk send failure

Below is how I have the prospector set up:

> 
> #=========================== Filebeat prospectors =============================
> 
> filebeat.prospectors:
> 
> # Each - is a prospector. Most options can be set at the prospector level, so
> # you can use different prospectors for various configurations.
> # Below are the prospector specific configurations.
> 
> - type: log
> 
>   # Change to true to enable this prospector configuration.
>   enabled: true
> 
>   # Paths that should be crawled and fetched. Glob based paths.
>   paths:
>     #- /var/log/*.log
>     - C:\Log\File\Path\LogFile.txt
>   exclude_lines: ['#']
>   #document_type: iis
> 
>   # Exclude lines. A list of regular expressions to match. It drops the lines that are
>   # matching any regular expression from the list.
>   #exclude_lines: ['^DBG']
> 
>   # Include lines. A list of regular expressions to match. It exports the lines that are
>   # matching any regular expression from the list.
>   #include_lines: ['^ERR', '^WARN']
> 
>   # Exclude files. A list of regular expressions to match. Filebeat drops the files that
>   # are matching any regular expression from the list. By default, no files are dropped.
>   #exclude_files: ['.gz$']
> 
>   # Optional additional fields. These fields can be freely picked
>   # to add additional information to the crawled log files for filtering
>   #fields:
>   #  level: debug
>   #  review: 1
> 
>   ### Multiline options
> 
>   # Multiline can be used for log messages spanning multiple lines. This is common
>   # for Java Stack Traces or C-Line Continuation
> 
>   # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
>   #multiline.pattern: ^\[
> 
>   # Defines if the pattern set under pattern should be negated or not. Default is false.
>   #multiline.negate: false
> 
>   # Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
>   # that was (not) matched before or after, or as long as a pattern is not matched based on negate.
>   # Note: After is the equivalent to previous and before is the equivalent to next in Logstash
>   #multiline.match: after
> 
> 
> #============================= Filebeat modules ===============================
> 
> 

Below are the template settings I have set up:

> 
> #==================== Elasticsearch template setting ==========================
> 
> setup.template.name: "cvp_activitylogs-%{+yyyy.MM.dd}"
> setup.template.pattern: "cvp_activitylogs"
> 
> setup.template.settings:
>   index.number_of_shards: 2
>   #index.codec: best_compression
>   #_source.enabled: false
> 
> 

Below is the Output section of my Filebeat.yml file.

> 
> #-------------------------- Elasticsearch output ------------------------------
> output.elasticsearch:
>   # Array of hosts to connect to.
>   hosts: ["ElasticSearch-AWS-Link:443"]
>   index: "cvp_activitylogs-%{+yyyy.MM.dd}"
>   pipeline: cvp_logs
>   template.enabled: false
>   template.name: "cvp_activitylogs"
> 
>   # Optional protocol and basic auth credentials.
>   protocol: "https"
>   #username: "elastic"
>   #password: "changeme"

(Pier-Hugues Pellerin) #2

The info below is normal if you start Filebeat multiple times.

    2018-09-10T14:44:02.205-0500  INFO   template/load.go:73          Template already exists and will not be overwritten.

This looks similar to this issue: https://github.com/elastic/beats/issues/6175 — do you see these errors continuously?

    2018-09-10T14:44:03.347-0500  ERROR  pipeline/output.go:92        Failed to publish events: temporary bulk send failure

Also, there is an error with setup.template.pattern; this needs to be a pattern:

setup.template.pattern: "cvp_activitylogs*"
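
For reference, a consistent pairing of name and pattern might look like the snippet below — the wildcard lets the template match the daily indices your output section creates (e.g. cvp_activitylogs-2018.09.10):

```yaml
setup.template.name: "cvp_activitylogs"
setup.template.pattern: "cvp_activitylogs*"
```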

I am not sure if the above is causing your issue.

Can you check the Elasticsearch log, we might have a better message there?


(Jose) #3

I actually got past the bulk send failure error. The issue was with the pipeline I was referencing in the filebeat.yml file: I had to modify it to add some additional fields for the date, and also an on_failure section defining where to send events if there was an error. Now I have to find out why it's failing to send the logs to the index properly.
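
For anyone hitting the same thing, a minimal sketch of what such an ingest pipeline might look like (the source field name, date format, and failure index here are illustrative assumptions, not my actual pipeline):

```json
PUT _ingest/pipeline/cvp_logs
{
  "description": "Parse CVP activity logs",
  "processors": [
    {
      "date": {
        "field": "timestamp",
        "target_field": "@timestamp",
        "formats": ["yyyy-MM-dd HH:mm:ss"]
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "field": "_index",
        "value": "failed-cvp_activitylogs"
      }
    }
  ]
}
```

Without an on_failure handler, any document that fails a processor is dropped with an error; the set processor above reroutes failed documents to a separate index instead so they can be inspected.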

I will make the change you caught on the template pattern and see if that fixes my new issue, where the logs aren't getting pushed to the index but rather to the failed cvp index instead.

Thanks!


(Pier-Hugues Pellerin) #4

@jobse.batres Fixing the pattern will make sure the logs you are sending pick up the right template :slight_smile:


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.