Logstash

Hi,
I hope this message finds you well. How can I send filtered logs to Elasticsearch and, at the same time, send the raw logs to another server? (I need those logs for investigation and forensics work.)
Regards and thanks

Hi,

With Logstash you could take the input, output the logs to one server or index, then apply the filters and output again.

The first output captures the logs right after they are sent to Logstash.

Just one Logstash server:
Input
Output (raw logs)
Filter for custom parsing
Output (custom parsed logs)

So how can I do this? If you have any links that could help me accomplish this task, please send them to me.

You need to look at Logstash configurations, e.g. Logstash Configuration Examples | Logstash Reference [7.12] | Elastic

e.g. raw logs to server A, then filter, then output to server B for the processed logs:

input { stdin { } }

# first output: intended to capture the event as it arrives (server A)
output {
  elasticsearch { hosts => ["servera:9200"] }
  stdout { codec => rubydebug }
}

filter {
  # parse Apache combined log lines into separate fields
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # set @timestamp from the parsed timestamp field
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

# second output: the processed logs (server B)
output {
  elasticsearch { hosts => ["serverb:9200"] }
  stdout { codec => rubydebug }
}

Thank you very much

It does not work that way. Logstash groups the inputs, filters and outputs together and runs the pipeline in that order, so every field added by a filter will be present in the document sent to both outputs.
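To illustrate, a minimal sketch (the hosts are placeholders): no matter where an output block appears in the configuration file, it only receives the event after all filters have run, so both documents below contain the fields added by grok.

input { stdin { } }

output {
  # written before the filter block, but still receives the grok-parsed event
  elasticsearch { hosts => ["servera:9200"] }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  # receives exactly the same parsed event as the output above
  elasticsearch { hosts => ["serverb:9200"] }
}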

To send a raw message you need to use the plain codec in your output and specify only the message field.

output {
    elasticsearch {
        hosts => ["hostname"]
        codec => plain { format => "%{message}" } 
    }
}

This will send a document to Elasticsearch containing only the message field, if that is what you want.
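Putting it together, here is a minimal sketch of one way to answer the original question, assuming the forensic server can receive the raw lines over TCP (the hostnames, port and index name are placeholders):

input {
  stdin { }
}

filter {
  # parse the Apache combined log line; the original line is kept in "message"
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  # parsed documents, including all fields added by the filters
  elasticsearch {
    hosts => ["elastic-server:9200"]
    index => "parsed-logs"
  }

  # raw copy for forensics: only the original line, sent to another server
  tcp {
    host => "forensics-server"
    port => 5000
    codec => plain { format => "%{message}" }
  }
}

The raw line survives in the message field because grok adds new fields without overwriting it, so the plain codec's %{message} format writes out the original log line unchanged.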
