I'm trying to create a pipeline in logstash which receives CEF events and other input from Filebeat. Since not all input is in CEF format, I created an IF-statement.
My logstash.conf looks like this:
input {
  beats {
    port => 5044
    if "forcepoint" in [tags] { codec => cef {} }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{[fields][flavor]}-%{+YYYY.MM.dd}"
  }
  # stdout { codec => rubydebug }
}
But when I check the configuration with --config.test_and_exit, I get the following error:
[FATAL] 2020-08-23 14:39:47.004 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of [ \t\r\n], "#", "=>" at line 6, column 14 (byte 129) after input {
beats {
port => 5044
if
[ERROR] 2020-08-23 14:39:47.006 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
When I comment out the if statement and run the config test again, it reports that the config is OK. So I guess there must be a mistake in the formatting of the if statement.
Can someone please tell me what I need to change? Or is it even possible to use the codec plugin and conditional statements in the input section?
I also tried moving the codec into a filter:
filter {
  if "forcepoint" in [tags] {
    codec => cef
  }
}
But it gives me an error again.
[FATAL] 2020-08-23 18:45:24.966 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of [ \t\r\n], "#", "{" at line 17, column 11 (byte 432) after filter {
if "forcepoint" in [tags] {
codec
Thank you for the help! I'm not sure I understand your solution, though. Do you basically send the events from one pipeline to another? My problem is that the message field is in CEF format.
I'm sorry if the question sounds silly, I'm new to ElasticStack.
BTW: Do you have experience with the decode_cef processor in Filebeat? Could that also be an option?
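From the Filebeat docs it looks roughly like this (untested sketch; the condition on the tag is my assumption and may need adjusting):

```yaml
# filebeat.yml (sketch): decode the CEF payload in Filebeat itself,
# so downstream consumers receive already-parsed fields.
processors:
  - decode_cef:
      field: message            # field holding the raw CEF string
      when:
        contains:
          tags: "forcepoint"    # hypothetical condition; adapt to your setup
```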
A codec can only be used on an input. The idea is that you accept the events using a beats input, then in the output section, if they have that "forcepoint" tag you can use a tcp output to send them to another pipeline that has a tcp input with a cef codec. That second pipeline can then send the decoded events to elasticsearch.
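A sketch of that layout (the loopback address, port 5045, and the file names are placeholders, not tested against your setup):

```
# pipeline A (e.g. beats-intake.conf): accept everything from Filebeat,
# forward the raw CEF lines to the second pipeline over TCP
input {
  beats { port => 5044 }
}
output {
  if "forcepoint" in [tags] {
    tcp {
      host  => "127.0.0.1"
      port  => 5045
      codec => line { format => "%{message}" }  # send just the raw CEF string
    }
  }
}

# pipeline B (e.g. cef-decode.conf): decode the CEF events and index them
input {
  tcp {
    port  => 5045
    codec => cef {}
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```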
Yes, that's not a good idea.
I think the solution with two pipelines is the only suitable one. I'm also considering dropping Filebeat and sending the logs from Forcepoint via TCP/UDP directly to Logstash, but I'm not sure that's an option, as I'm not responsible for the firewalls.