Logstash configuration advice needed

Dear all,

I need your advice on a Logstash configuration. Currently the configuration looks like this:

cat /etc/logstash/conf.d/logstash.conf

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate_authorities => ["/etc/logstash/certs/ca.crt"]
    ssl_certificate => "/etc/logstash/certs/${HOSTNAME}.crt"
    ssl_key => "/etc/logstash/certs/${HOSTNAME}.p8"
    ssl_verify_mode => "force_peer"
  }
}

filter {
  xml {
    source => "message"
    store_xml => false
    force_array => false
    xpath => [
      "/log//isomsg/field[@id='37']/@value", "ci.a",
      "/log//isomsg/field[@id='0']/@value", "ci.b",
      "/log//isomsg/field[@id='39']/@value", "ci.c",
      "/log//*[contains(name(),'exception')]/@name", "ci.ex_name",
      "/log//*[contains(name(),'exception')]/text()", "ci.ex",
      "/log//error/text()", "ci.err",
      "/log/@realm", "ci.r",
      "/log/@at", "ci.at_date",
      "/log//rout/face/text()", "ci.face"
    ]
  }
  if ("" in [ci.at_date]) {
    date {
      match => ["ci.at_date", "YYYY-MM-dd'T'HH:mm:ss", "YYYY-MM-dd'T'HH:mm:ss.SSS", "YYYY-MM-dd'T'HH:mm:ss.SSSZ"]
      timezone => "Europe/Warsaw"
      target => "@timestamp"
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://${HOSTNAME}:9200"]
    cacert => '/etc/logstash/certs/ca.crt'
    user => 'logstash_internal'
    password => '${ES_PWD}'
    ilm_enabled => false
    document_id => "%{[@metadata][_id]}"
    index => "important-log-%{+YYYY.MM.dd}"
  }
}

This is a very critical configuration for an application which produces an XML log file.

I've got a new request from the business to create an additional Logstash configuration for a new application, which generates Tomcat access logs in plain text similar to this:

192.168.1.101 - - [10/Dec/2020:13:30:49 +0100] "PUT /api/mms/terminals/virtual/edit/M0000352 HTTP/1.1" 400 69
192.168.1.101 - - [10/Dec/2020:13:31:26 +0100] "PUT /api/mms/terminals/virtual/edit/M0000352 HTTP/1.1" 400 69
192.168.1.101 - - [10/Dec/2020:13:31:32 +0100] "PUT /api/mms/terminals/virtual/edit/M0000352 HTTP/1.1" 400 69
192.168.1.101 - - [10/Dec/2020:13:43:51 +0100] "PUT /api/mms/clients/riskChange/200183 HTTP/1.1" 400 92
192.168.1.101 - - [10/Dec/2020:13:43:56 +0100] "PUT /api/mms/clients/riskChange/200183 HTTP/1.1" 400 92
192.168.1.101 - - [10/Dec/2020:13:44:01 +0100] "PUT /api/mms/clients/riskChange/200183 HTTP/1.1" 400 92

I did some investigation and I want to use a grok filter to parse each line correctly. The grok filter will look like this:

%{IPV4:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] \"%{DATA:request}\" %{NUMBER:response} (?:-|%{NUMBER:bytes:int})
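Wired into a filter block, this would look something like the sketch below (I've captured the client IP into a `clientip` field and lowercased the response field name; a date filter then parses the HTTPDATE timestamp into @timestamp):

```
filter {
  grok {
    match => { "message" => "%{IPV4:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] \"%{DATA:request}\" %{NUMBER:response} (?:-|%{NUMBER:bytes:int})" }
  }
  date {
    # HTTPDATE format, e.g. 10/Dec/2020:13:30:49 +0100
    match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
    target => "@timestamp"
  }
}
```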

The question is how to add grok filter configuration to current Logstash configuration for Tomcat logs without any influence on xml configuration? Should I create separate config file for Tomcat logs? What is your experience in such cases?

Best Regards,
Dan

Filebeat allows you to use multiple inputs.
You can add a specific tag to each Filebeat input like this:

- type: log
  paths:
    - /opt/path1/access-tomcat.log
  tags: ["log1"]

- type: log
  paths:
    - /opt/path2/log-xml.xml
  tags: ["log2"]

Then, in logstash filter, use conditionals based on the tags to determine which filters to apply.

if "log1" in [tags] {
  # Apply specific parsing to log1
}

if "log2" in [tags] {
  # Apply specific parsing to log2
}
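Note that all files in /etc/logstash/conf.d are concatenated into a single pipeline, so a separate Tomcat config file alone won't keep the two streams apart; it's the tag conditionals that isolate them. A combined filter section could look roughly like this sketch (the grok pattern is the one from your post, and the tag names assume the Filebeat inputs above):

```
filter {
  if "log2" in [tags] {
    xml {
      source => "message"
      store_xml => false
      force_array => false
      xpath => [ ... ]   # your existing xpath mappings, unchanged
    }
  }
  if "log1" in [tags] {
    grok {
      match => { "message" => "%{IPV4} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] \"%{DATA:request}\" %{NUMBER:response} (?:-|%{NUMBER:bytes:int})" }
    }
  }
}
```

You can use the same conditionals in the output section if the Tomcat events should go to a different index. An alternative is Logstash's multiple-pipelines feature (pipelines.yml), which runs each configuration as a fully separate pipeline with no conditionals needed.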

Thank you @ylasri. I'll test your proposal.

