Trying to use pipelines with filebeat output.elasticsearch

(Siva) #1


I configured multiple .yml config files for different types of logs.
Now I need to add conditions to send the output 1. to Elasticsearch and 2. to Logstash,
based on the value of fields.logtype. After a lot of searching on Google and the Elasticsearch forums, I figured out that pipelines can be used to achieve this.

The issue is that when I add the "pipelines" setting to the configuration, Filebeat does not accept it on startup. Below is my configuration file.

```yaml
#========================= Filebeat global options ============================

filebeat.registry_file_permissions: 0600

# By default Ingest pipelines are not updated if a pipeline with the same ID
# already exists. If this option is enabled Filebeat overwrites pipelines
# every time a new Elasticsearch connection is established.
filebeat.overwrite_pipelines: true

# How long filebeat waits on shutdown for the publisher to finish.
# Default is 0, not waiting.
#filebeat.shutdown_timeout: 0

# Enable filebeat inputs config
filebeat.config.inputs:
  enabled: true
  path: config/*.yml

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  pipelines:
    - pipeline: "ErrorLogs"
      fields.log_type: "ErrorLog"
    - pipeline: "IPLogs"
      fields.log_type: "IPLog"

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
  pipelines:
    - pipeline: "DebugLogs"
      fields.log_type: "Debug"

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

#============================== Xpack Monitoring ===============================

# filebeat can export internal metrics to a central Elasticsearch monitoring
# cluster. This requires xpack monitoring to be enabled in Elasticsearch. The
# reporting is disabled by default.

# Set to true to enable the monitoring reporter.
#xpack.monitoring.enabled: false

# Uncomment to send the metrics to Elasticsearch. Most settings from the
# Elasticsearch output are accepted here as well. Any setting that is not set is
# automatically inherited from the Elasticsearch output configuration, so if you
# have the Elasticsearch output configured, you can simply uncomment the
# following line.
```

Please advise.


(Noémi Ványi) #2

Could you please format your configuration using </>?

(Siva) #3

Hi Noémi Ványi,

I managed to work around this by sending all the events to Logstash, using separate Filebeat .yml files.
I guess it is not possible to push events to multiple outputs from a single Filebeat instance.

Thank you.
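For anyone landing here later: the Elasticsearch output does support selecting an ingest pipeline per event, but each entry under `pipelines` must wrap its condition in a `when` clause rather than listing the field directly. A minimal sketch based on the Filebeat reference documentation, reusing the pipeline names and field values from the config above:

```yaml
#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  hosts: ["localhost:9200"]
  pipelines:
    # Route events to the "ErrorLogs" ingest pipeline when the custom
    # field log_type (set under `fields` in the input config) is "ErrorLog".
    - pipeline: "ErrorLogs"
      when.equals:
        fields.log_type: "ErrorLog"
    # Same pattern for IP logs.
    - pipeline: "IPLogs"
      when.equals:
        fields.log_type: "IPLog"
```

Note that `output.logstash` has no `pipelines` setting (ingest pipelines are an Elasticsearch feature), and Filebeat allows only one output to be enabled at a time, which matches the workaround described above; routing to both Elasticsearch and Logstash from one instance is not possible.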

(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.