How to send different log files to different Logstash ports

I am trying to send different log files to different Logstash ports, e.g. one on 5044 and the other on 5043.

Here is my Filebeat config. Can someone help?

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /etc/filebeat/download.2017-06-16-1125-1224.log
  # tail_files: true
  multiline.pattern: '^\[[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after
  document_type: download

- input_type: log
  paths:
    - /etc/filebeat/vus.2017-08-02.log
  multiline.pattern: '^\[[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after
  document_type: vus

#================================ Outputs =====================================
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["dfdevlogstash1.df.jabodo.com:5044"]
  loadbalance: true
  worker: 2
#  filebeat.publish_async: true

Beats are designed to deliver data to one output. That output can be multiple servers for the purposes of load balancing. But you cannot send specific logs to one port and other logs to another.
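For example, a single Logstash output can list several hosts, but they all receive the same event stream; there is no per-log routing (hostnames below are placeholders):

output.logstash:
  # one logical output; multiple hosts are only for load balancing
  hosts: ["logstash1.example.com:5044", "logstash2.example.com:5044"]
  loadbalance: true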

Why do you need to do this? What are you trying to accomplish?

Thanks for the info, Andrew. I am sending two log files (with different log structures) to the Logstash cluster on port 5044, but Logstash is not able to tell them apart and grok fails to parse them, so I was wondering if sending them to different ports might help. Can you please suggest how I can achieve this?

Also, the log files I am sending are custom log files, not generic ones like Apache, Tomcat, etc.

Nikhil

You can apply tags to the events or use the source field to identify which logs came from which files.

Example using tags:

filebeat.prospectors:
- paths:
  - /path/to/log/type/A/*.log
  tags: ["log_type_a"]
- paths:
  - /path/to/log/type/B/*.json
  tags: ["log_type_b", "json"]

Then over in your Logstash config you can use a condition to apply whatever processing you would like.

filter {
  if "log_type_b" in [tags] {
    # do something fun like grok
  } # else if ...
}
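If you would rather key off the source field (which, in Filebeat 5.x, carries the path of the file the event came from), a sketch along the same lines would be (paths are placeholders):

filter {
  if [source] =~ /\/type\/A\// {
    # grok pattern for type A logs
  } else if [source] =~ /\/type\/B\// {
    # grok pattern for type B logs
  }
}

Tags are usually cleaner since they don't break if the file paths change.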

Hi Andrew, I tried what you suggested, but I am only able to parse and send one of the log files to ES.

Below are my Filebeat and Logstash configs.

logstash.conf

input {
  beats {
    client_inactivity_timeout => 86400
    port => 5044
    type => "log"
  }
}
filter {
  if "vus" in [tags] {
    mutate {
      gsub => [
        "message", "\t", " ",
        "message", "\n", " "
      ]
    }
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp_match}\]%{SPACE}%{WORD:level}%{SPACE}%{JAVACLASS:coidkey}%{SPACE}%{USER:ident}%{SPACE}%{GREEDYDATA:Url}" }
    }
  }
  else if "download" in [tags] {
    mutate {
      gsub => [
        "message", "\t", " ",
        "message", "\n", " "
      ]
    }
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp_match}\]%{SPACE}\:\|\:%{SPACE}%{WORD:level}%{SPACE}\:\|\:%{SPACE}%{USERNAME:host_name}%{SPACE}\:\|\:%{SPACE}%{GREEDYDATA:coidkey}%{SPACE}\:\|\:%{SPACE}%{GREEDYDATA:clientinfo}%{SPACE}\:\|\:%{SPACE}(%{IP:clientip})?%{SPACE}\:\|\:%{SPACE}%{GREEDYDATA:Url}%{SPACE}\:\|\:%{SPACE}%{JAVACLASS:class}%{SPACE}\:\|\:%{SPACE}%{USER:ident}%{SPACE}%{GREEDYDATA:msg}" }
      remove_field => [ "ident", "offset", "name", "version", "host" ]
    }
  }
}
output {
  stdout { codec => rubydebug }

  if "_grokparsefailure" in [tags] {
    # write events that didn't match to a file
    file { "path" => "/tmp/grok_failures.txt" }
  }
  # if "vus" in [tags] {
  #   elasticsearch {
  #     hosts => "dfdevelasticp1.df.jabodo.com:9200"
  #     user => "elastic"
  #     password => "fd5dd89c"
  #     index => "download-%{+YYYY.MM.dd}"
  #     document_type => "log"
  #   }
  # }
  # if [type] == "download" {
  else {
    elasticsearch {
      hosts => "dfsyselastic.df.jabodo.com:9200"
      user => "UN"
      password => "PW"
      index => "vicinio-%{+YYYY.MM.dd}"
      document_type => "log"
    }
  }
}

filebeat.yml (1st host )

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /opt/log/tomcat/vus.log
  tail_files: true
  multiline.pattern: '^\[[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after
  tags: ["vus"]

#================================ Outputs =====================================
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["dfsyslogstash1.df.jabodo.com:5044"]
#  hosts: ["dfsyslogstash1.df.jabodo.com:5044","dfsyslogstash2.df.jabodo.com:5044",
#  "dfsyslogstash3.df.jabodo.com:5044","dfsyslogstash4.df.jabodo.com:5044"]
  loadbalance: true
  worker: 2
#  filebeat.publish_async: true

==============

filebeat.yml (2nd host )

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /archives/logs/tomcat7-8080/download.log
    - /archives/logs/tomcat7-8090/download.log
  tail_files: true
  multiline.pattern: '^\[[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after
  tags: ["download"]
#================================ Outputs =====================================
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["dfsyslogstash1.df.jabodo.com:5044"]
#  hosts: ["dfsyslogstash1.df.jabodo.com:5044","dfsyslogstash2.df.jabodo.com:5044",
#  "dfsyslogstash3.df.jabodo.com:5044","dfsyslogstash4.df.jabodo.com:5044"]
  loadbalance: true
  worker: 2
#  filebeat.publish_async: true

Do I need to send them to different indexes on ES?
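Something like this is what I had in mind for routing each tag to its own index, if that is the way to go (index names are just examples):

output {
  if "vus" in [tags] {
    elasticsearch {
      hosts => "dfsyselastic.df.jabodo.com:9200"
      index => "vus-%{+YYYY.MM.dd}"
    }
  } else if "download" in [tags] {
    elasticsearch {
      hosts => "dfsyselastic.df.jabodo.com:9200"
      index => "download-%{+YYYY.MM.dd}"
    }
  }
}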

Andrew, can you please help with this?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.