Logstash pipeline with input + filter to one output, and another without a filter to a different output

Hello ELK folks,
I wish you a happy new year!!
Hope all are doing well.

My scenario: I am collecting logs on port 9600 and applying a filter before forwarding the logs to QRadar.

I need to add one more output, to Azure Sentinel, without any filter.

When I created another conf file with the same input and no filter, it errored out.

Please suggest a solution. I know I need to change something in pipelines.yml,
but I am not sure what.
Thank you in advance!!!

If you have a single port which accepts all traffic, in your case 9600, and you want to forward to multiple locations, then use conditionals (if/else) in the output section. Based on the value of fields or tags, you route events to the right output.

output {
  if [field] == "value1" {
    elasticsearch {
      hosts => ["http://host1:9200"]
      index => "index1"
    }
  }
  else if [field] == "value2" {
    microsoft-logstash-output-azure-loganalytics {
      workspace_id => "id"
      workspace_key => "key"
      custom_log_table_name => "tableName"
    }
  }
}

Check the documentation.

If you have multiple input ports, for instance 9500 and 9600, then you use separate pipelines.
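A minimal pipelines.yml sketch for that case (the pipeline ids and file paths are only examples):

- pipeline.id: port9500
  path.config: "/etc/logstash/port9500.conf"

- pipeline.id: port9600
  path.config: "/etc/logstash/port9600.conf"

Each conf file then carries its own input, filter and output sections.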

Thank you for the quick response.

My conf:

Input section: input on TCP 9600, which accepts raw logs (from heterogeneous log sources)

Filter section: a set of grok and mutate filters to cleanse the data

Output section: send the cleansed data (from the filter section) to QRadar

New addition: send the raw data, without cleansing, to Sentinel
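Roughly, the current single conf has this shape (the grok/mutate steps shown here are simplified placeholders, not the real cleansing logic):

input {
  tcp { port => 9600 }
}
filter {
  # illustrative cleansing only
  grok   { match => { "message" => "%{SYSLOGLINE}" } }
  mutate { strip => ["message"] }
}
output {
  # existing QRadar output goes here
}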

Now it is clearer. In your case, the pseudo-code looks like this:

Add these lines to config/pipelines.yml:

- pipeline.id: main
  path.config: "/etc/logstash/main.conf"

- pipeline.id: qradar
  path.config: "/etc/logstash/qradar.conf"

Create the main.conf file:

input {
  tcp { port => 9600 }
}
filter {
  # for raw messages, do not add any processing code
}
output {
  microsoft-logstash-output-azure-loganalytics {
    workspace_id => "id"
    workspace_key => "key"
    custom_log_table_name => "tableName"
  }
  pipeline { send_to => ["qradar"] }
}

Create the qradar.conf file:

input {
  pipeline { address => "qradar" }
}
filter {
  # add your existing logic
}
output {
  qradar_connection_string { ....
  }
}
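Note that qradar_connection_string is only a placeholder, not a real plugin name. QRadar typically ingests events over syslog, so one option is the syslog output plugin, roughly like this (the host, port and other values are assumptions, adjust them for your event collector):

output {
  syslog {
    host     => "qradar.example.com"
    port     => 514
    protocol => "tcp"
    facility => "user-level"
    severity => "informational"
  }
}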

Check the documentation.

I appreciate it. I shall test it in my lab and come back here.

Thank you so much!!!!
Cheers!!


I hope this works on Logstash 7.x?

The test went well using file outputs.
But I see errors like this: "main ERROR Unable to locate appender"

Can this be fixed or ignored?
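For reference, the temporary test outputs were just the file output plugin, something like this (the paths are only examples):

# stand-in for the Sentinel output in main.conf
output {
  file { path => "/tmp/raw_to_sentinel.log" }
}

# stand-in for the QRadar output in qradar.conf
output {
  file { path => "/tmp/filtered_to_qradar.log" }
}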
