I want to use multiple pipelines. When I have aa.conf and bb.conf, each uses a different filter {}.
So it seems the if-else has to branch before the pipeline is chosen. Is there a way to do this?
Yes, you can totally do this.
To do this you would create a special field to identify which log is being processed. For example, the filebeat.yml for log1 would look something like this:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/logs/log1.log
  fields:
    log1: true
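If you have other logs that should take the default route, a second input can simply omit the custom field. A minimal sketch, assuming a hypothetical /var/logs/regular.log:

- type: log
  enabled: true
  paths:
    - /var/logs/regular.log
  # no custom field here, so these events fall through to the else branch below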
The first input creates a field called log1 that we can search for in the pipelines.yml file on the Logstash server. So your pipelines.yml would look something like this:
- pipeline.id: filebeats
  config.string: |
    input {
      beats {
        port => 5044
      }
    }
    output {
      # Filebeat nests custom fields under [fields], so test [fields][log1]
      if [fields][log1] {
        pipeline {
          send_to => ["log1_run"]
        }
      } else {
        pipeline {
          send_to => ["regular_run"]
        }
      }
    }
- pipeline.id: log1_run
  path.config: "/etc/logstash/conf.d/log1.conf"
- pipeline.id: regular_run
  path.config: "/etc/logstash/conf.d/regular.conf"
Then all you have to do is set up the input in your Logstash conf files to look like this:
input {
  pipeline {
    address => "log1_run"
  }
}
So, to recap: you create a field on each of the filebeat inputs that you can search on.
Then you create conditional logic in your pipelines.yml to determine which config to use by creating a virtual address.
Then you set that virtual address in the input of the Logstash conf files. From there you can change the output to whatever index you want and send it to Elasticsearch.
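Putting the pieces together, a complete downstream log1.conf might look like the sketch below; the grok pattern and index name are illustrative assumptions, not part of the original setup:

input {
  pipeline {
    address => "log1_run"
  }
}

filter {
  # log1-specific parsing goes here; this grok pattern is only an example
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "log1-%{+YYYY.MM.dd}"
  }
}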
I have some questions.
If I create 10 pipelines, will each event be checked by the if-else for all 10 pipelines?
Is this a loss of performance?
Wouldn't it be better to use if-else in one pipeline?
You are not checking inside the pipelines themselves; you're checking in the pipelines.yml file to route each event to the correct pipeline.
But yes, you could do the if-else in one giant conf file to process everything. The catch is that if there is a problem with the parsing of a single log, ALL of your processing goes down. Using the if-else in the pipelines.yml file, only the affected pipeline goes down, rather than all of them.
As for performance, you are doing the same processing either way. I don't know for sure, but I suspect one giant pipeline conf file would cause lock contention during processing, whereas breaking out into multiple pipelines via pipelines.yml should let you process in parallel.
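If you do split things out, each pipeline gets its own workers and queue, which you can tune per pipeline in pipelines.yml. A minimal sketch; the worker counts below are arbitrary examples:

- pipeline.id: log1_run
  path.config: "/etc/logstash/conf.d/log1.conf"
  pipeline.workers: 2
- pipeline.id: regular_run
  path.config: "/etc/logstash/conf.d/regular.conf"
  pipeline.workers: 4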
In either case, it is up to you which direction you go.