Problems with Logstash multiple pipelines

Dear folks,
I've recently been trying to enable multiple pipelines with Logstash version 6.2.3, but I'm getting the problems shown in the Logstash log below. I tested my pipeline with version 5.5.x, and it works well with a single pipeline on a single instance.

Does anyone know why we get this issue with Logstash version 6?

logs from logstash

[2018-04-04T15:34:17,734][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/filebeat.conf"}
[2018-04-04T15:34:17,734][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/etc/logstash/conf.d/filebeat.conf"]}
[2018-04-04T15:34:17,734][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/metricbeat.conf"}
[2018-04-04T15:34:17,734][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
[2018-04-04T15:34:17,735][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/etc/logstash/pipelines.yml"}
[2018-04-04T15:34:17,735][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/etc/logstash/conf.d/metricbeat.conf"]}
[2018-04-04T15:34:17,735][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/filebeat.conf"}
[2018-04-04T15:34:17,735][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/etc/logstash/conf.d/filebeat.conf"]}

logstash debug

180404154058 root@logstashtest06 bin # ./logstash -f /etc/logstash/logstash.yml --debug

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2018-04-04 15:41:24.595 [main] scaffold - Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[DEBUG] 2018-04-04 15:41:24.602 [main] registry - Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x4db72eb3 @module_name="netflow", @directory="/usr/share/logstash/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[INFO ] 2018-04-04 15:41:24.602 [main] scaffold - Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[DEBUG] 2018-04-04 15:41:24.602 [main] registry - Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x6be06c7e @module_name="fb_apache", @directory="/usr/share/logstash/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[INFO ] 2018-04-04 15:41:24.631 [main] writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[INFO ] 2018-04-04 15:41:24.635 [main] writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[WARN ] 2018-04-04 15:41:24.845 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2018-04-04 15:41:24.852 [LogStash::Runner] agent - No persistent UUID file found. Generating new UUID {:uuid=>"482fdad9-b88f-4bcb-8975-1eeb5a7972ab", :path=>"/usr/share/logstash/data/uuid"}
[DEBUG] 2018-04-04 15:41:24.868 [LogStash::Runner] agent - Setting up metric collection
[DEBUG] 2018-04-04 15:41:24.873 [LogStash::Runner] os - Starting {:polling_interval=>5, :polling_timeout=>120}
[DEBUG] 2018-04-04 15:41:24.920 [LogStash::Runner] jvm - Starting {:polling_interval=>5, :polling_timeout=>120}
[DEBUG] 2018-04-04 15:41:24.956 [LogStash::Runner] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2018-04-04 15:41:24.957 [LogStash::Runner] jvm - collector name {:name=>"ConcurrentMarkSweep"}
[DEBUG] 2018-04-04 15:41:24.962 [LogStash::Runner] persistentqueue - Starting {:polling_interval=>5, :polling_timeout=>120}
[DEBUG] 2018-04-04 15:41:24.964 [LogStash::Runner] deadletterqueue - Starting {:polling_interval=>5, :polling_timeout=>120}
[INFO ] 2018-04-04 15:41:24.969 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.2.3"}
[DEBUG] 2018-04-04 15:41:24.973 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] agent - Starting agent
[DEBUG] 2018-04-04 15:41:24.975 [Api Webserver] agent - Starting puma
[DEBUG] 2018-04-04 15:41:24.976 [Api Webserver] agent - Trying to start WebServer {:port=>9600}
[DEBUG] 2018-04-04 15:41:24.993 [Api Webserver] service - [api-service] start
[DEBUG] 2018-04-04 15:41:25.006 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] configpathloader - Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/etc/logstash/conf.d", "/etc/logstash/jvm.options", "/etc/logstash/log4j2.properties", "/etc/logstash/logstash.yml.rpmnew", "/etc/logstash/pipelines.yml", "/etc/logstash/startup.options"]}
[DEBUG] 2018-04-04 15:41:25.007 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] configpathloader - Reading config file {:config_file=>"/etc/logstash/logstash.yml"}
[DEBUG] 2018-04-04 15:41:25.039 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] agent - Converging pipelines state {:actions_count=>1}
[DEBUG] 2018-04-04 15:41:25.041 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] agent - Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[ERROR] 2018-04-04 15:41:25.080 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 5, column 1 (byte 107) after  \n# ------------------------------------ Node ------------------------------------\n \n# logstash node name\n", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", 
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[INFO ] 2018-04-04 15:41:25.088 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
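For reference, the warning above mentions --path.settings; as I understand it, the settings directory (containing logstash.yml and pipelines.yml) should be passed there rather than giving logstash.yml to -f, which makes Logstash try to compile the YAML as a pipeline config. A sketch, assuming the default RPM package layout:

```shell
# Hypothetical invocation: point Logstash at the settings *directory*
# via --path.settings so it picks up logstash.yml and pipelines.yml,
# instead of passing logstash.yml to -f as a pipeline config.
/usr/share/logstash/bin/logstash --path.settings /etc/logstash --debug
```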

logstash pipeline config

@/etc/logstash/pipelines.yml

# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

- pipeline.id: filebeat
  path.config: "/etc/logstash/conf.d/filebeat.conf"

- pipeline.id: metricbeat
  path.config: "/etc/logstash/conf.d/metricbeat.conf"
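Each pipeline file can also be syntax-checked on its own before Logstash loads pipelines.yml; a sketch, assuming the default package paths:

```shell
# Hypothetical sanity check: --config.test_and_exit parses the config
# and exits without starting the pipeline.
/usr/share/logstash/bin/logstash --path.settings /etc/logstash \
  -f /etc/logstash/conf.d/filebeat.conf --config.test_and_exit
/usr/share/logstash/bin/logstash --path.settings /etc/logstash \
  -f /etc/logstash/conf.d/metricbeat.conf --config.test_and_exit
```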

@/etc/logstash/conf.d/filebeat.conf

input {
  beats {
    port => 5050
  }
}

filter {

  if "NOTIFYMAPI" in [message] and "ms-exchange" in [tags] {
    drop { }
  }

  if "_jsonparsefailure" in [tags] {
    drop { }
  }

  if "ms-exchange" in [tags] {

    csv {
      source => "message"
      separator => ","
      columns => [ "date-time","client-ip","client-hostname","server-ip","server-hostname","source-context","connector-id","source","event-id","internal-message-id","message-id","network-message-id","recipient-address","recipient-status","total-bytes",
                 "recipient-count","related-recipient-address","reference","message-subject","sender-address","return-path","message-info","directionality","tenant-id","original-client-ip","original-server-ip","custom-data","transport-traffic-type",
                 "log-id","schema-version"]
    }

    date {
      match => ["date-time", "ISO8601"]
      target => "@timestamp"
    }

    mutate {
      convert => {
        "total-bytes" => "integer"
        "recipient-count" => "integer"
      }

      add_field => { "[@metadata][index_prefix]" => "%{agent}-%{env}" }
      remove_field =>  ["date-time","client-ip","server-ip","source-context","connector-id","source","internal-message-id","message-id","reference","directionality","custom-data","transport-traffic-type","schema-version","offset","@version","tenant-id",
                      "[beat][hostname]","[beat][version]","host","network-message-id","related-recipient-address","message-info","message"] 
    }
 
  }
  
}


output {
  file {
    path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY}"
  }
}

@/etc/logstash/conf.d/metricbeat.conf

#logstash for metricbeat
input {
  beats {
    port => 5051
    client_inactivity_timeout => 0
  }
}

filter {

  if "_jsonparsefailure" in [tags] {
    drop { }
  }

  mutate {
    add_field => { "[@metadata][index_prefix]" => "%{agent}-%{env}-%{service}" }
    remove_field => ["agent", "env", "service", "type"]
  }

}

output {

  if "system" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY-MM}"
    }
  }

  if "apache" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY}"
    }
  }

  if "mysql" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY}"
    }
  }

  if "nginx" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY}"
    }
  }

  if "postgresql" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY-MM}"
    }
  }

  if "haproxy" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY-MM}"
    }
  }

  if "vsphere" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY}"
    }
  }

}
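As an aside, since the tag branches in this output section differ only in the date pattern of the file path, they could arguably be collapsed; a hedged sketch of the same routing (same paths and patterns as above):

```conf
output {
  # Monthly rotation for these modules
  if "system" in [tags] or "postgresql" in [tags] or "haproxy" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY-MM}"
    }
  # Yearly rotation for the rest
  } else if "apache" in [tags] or "mysql" in [tags] or "nginx" in [tags] or "vsphere" in [tags] {
    file {
      path => "/tmp/%{[@metadata][index_prefix]}-%{+YYYY}"
    }
  }
}
```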
