Logstash Input - beats and azureblob


I am using Filebeat to send log files from a remote host to Logstash, and I am also trying to grab NSG flow logs from Azure using the Azure Blob storage input plugin.

Here is my input.conf. How can I parse these two inputs separately, and how do I index them? I would like to index them separately based on the input. Can someone please give me an example?

input {
  beats {
    port => 5044
    client_inactivity_timeout => 599
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  azureblob {
    storage_account_name => "mystorageaccount"
    storage_access_key => "VGhpcyBpcyBhIGZha2Uga2V5Lg=="
    container => "insights-logs-networksecuritygroupflowevent"
    codec => "json"
    file_head_bytes => 12
    file_tail_bytes => 2
  }
}

Use the tags or add_field option on each input to mark every event coming from that input. Then you can use conditionals in your filter and output sections to select which events should be sent where.

Thanks for the quick reply, do you have an example syntax?
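A sketch of the tagged-input approach. Both tags and the conditional syntax are standard Logstash features; the tag values, Elasticsearch host, and index names here are illustrative, not from your setup:

input {
  beats {
    port => 5044
    tags => ["beats"]
  }
  azureblob {
    storage_account_name => "mystorageaccount"
    storage_access_key => "VGhpcyBpcyBhIGZha2Uga2V5Lg=="
    container => "insights-logs-networksecuritygroupflowevent"
    codec => "json"
    tags => ["nsg-flow"]
  }
}

output {
  if "beats" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "filebeat-%{+YYYY.MM.dd}"
    }
  } else if "nsg-flow" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "nsg-flow-%{+YYYY.MM.dd}"
    }
  }
}

The same "in [tags]" conditionals work in the filter section, so you can apply a grok or json filter to only one of the two streams.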


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.