Same request processed by 2 config files

I'm trying to understand why every request I send is processed by both of my Logstash config files, even though I am using two different ports in the http input plugins:

Config file 1:

input {
  http {
    host => "0.0.0.0" # default: 0.0.0.0
    port => 52001
    tags => ["gps_globalsat"]
  }
}

filter {

}

output {

  if "gps_globalsat" in [tags] {
    elasticsearch {
     hosts => ['localhost:9200']
     index => 'gps@elm'
     document_type => '_doc'
     user => 'elastic'
     password => 'element'
    }
  }

  #stdout { codec => rubydebug }
  file {
    codec => rubydebug
    path => '/var/log/logstash/elk-gps_gsat.log'
  }

}

Config file 2:

input {
  http {
    host => "0.0.0.0" # default: 0.0.0.0
    port => 52003
    tags => ["gps_osm"]
  }
}

filter {

}

output {

  if "gps_osm" in [tags] {
    elasticsearch {
     hosts => ['localhost:9200']
     index => 'gps@elm'
     document_type => '_doc'
     user => 'elastic'
     password => 'element'
    }
  }

  #stdout { codec => rubydebug }
  file {
    codec => rubydebug
    path => '/var/log/logstash/elk-gps_osm.log'
  }

}

I can see every document inserted into Elasticsearch when checking in Kibana, and the two different log files show the same request as well:

==> /var/log/logstash/elk-gps_gsat.log <==

{
    "@timestamp" => 2018-10-17T00:18:42.000Z,
          "type" => "push",
          "host" => "127.0.0.1",
          "tags" => [
        [0] "gps_globalsat"
    ],
      "@version" => "1",
        "report" => "82",
      "customer" => "elm",
            "rx" => {
            "gwrx" => [
            [0] {
                "lsnr" => -8.8,
                "time" => "2018-10-17 11:18:42",
                "chan" => 1,
                "rssi" => -119,
                "rfch" => 0
            }
        ],
         "moteeui" => "665365B",
        "userdata" => {
             "motetx" => {
                "modu" => "LoRa",
                "codr" => "4/5",
                "datr" => "SF12BW125",
                "freq" => 868300000
            },
              "seqno" => 36,
               "port" => 2,
            "payload" => "MDA4MjU0ZmVhYzI2NDAwOWViYWFmOQ=="
        }
    },
       "battery" => 84,
        "client" => "element",
      "activite" => "tracking",
      "location" => {
        "lat" => -22.272447,
        "lon" => 166.439673
    }
}

==> /var/log/logstash/elk-gps_osm.log <==

{
    "@timestamp" => 2018-10-17T00:18:42.000Z,
          "type" => "push",
          "host" => "127.0.0.1",
          "tags" => [
        [0] "gps_globalsat"
    ],
      "@version" => "1",
        "report" => "82",
      "customer" => "elm",
            "rx" => {
            "gwrx" => [
            [0] {
                "lsnr" => -8.8,
                "time" => "2018-10-17 11:18:42",
                "chan" => 1,
                "rssi" => -119,
                "rfch" => 0
            }
        ],
         "moteeui" => "665365B",
        "userdata" => {
             "motetx" => {
                "modu" => "LoRa",
                "codr" => "4/5",
                "datr" => "SF12BW125",
                "freq" => 868300000
            },
              "seqno" => 36,
               "port" => 2,
            "payload" => "MDA4MjU0ZmVhYzI2NDAwOWViYWFmOQ=="
        }
    },
       "battery" => 84,
        "client" => "element",
      "activite" => "tracking",
      "location" => {
        "lat" => -22.272447,
        "lon" => 166.439673
    }
}

I started Logstash in debug mode and made a test HTTP request:

[2018-10-17T11:30:48,712][DEBUG][logstash.util.decorators ] inputs/LogStash::Inputs::Http: adding tag {"tag"=>"gps_osm"}
[2018-10-17T11:30:48,814][DEBUG][logstash.pipeline        ] filter received {"event"=>{"message"=>"", "@version"=>"1", "@timestamp"=>2018-10-17T00:30:48.709Z, "tags"=>["gps_osm"], "headers"=>{"accept_language"=>"fr-FR,fr;q=0.9,en-US;q=0.8,en;q=0.7", "cache_control"=>"no-cache", "accept_encoding"=>"gzip, deflate", "request_path"=>"/?lat=-22.264154&lon=166.47124&timestamp=1539689715889&hdop=48.0&altitude=162.85074&speed=0.0", "http_user_agent"=>"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36", "http_host"=>"elk.element.nc:52003", "postman_token"=>"c6767544-6d6b-7026-b69f-62cbb55bc4df", "http_version"=>"HTTP/1.1", "connection"=>"keep-alive", "http_accept"=>"*/*", "request_method"=>"GET", "content_length"=>"0"}, "host"=>"61.5.210.22"}}
[2018-10-17T11:30:48,816][DEBUG][logstash.filters.json    ] Running json filter {:event=>#<LogStash::Event:0x7c715d1e>}
[2018-10-17T11:30:48,822][DEBUG][logstash.filters.json    ] Event after json filter {:event=>#<LogStash::Event:0x7c715d1e>}
[2018-10-17T11:30:48,908][ERROR][logstash.filters.ruby    ] Ruby exception occurred: undefined method `unpack' for nil:NilClass
[2018-10-17T11:30:48,909][DEBUG][logstash.filters.grok    ] Running grok filter {:event=>#<LogStash::Event:0x7c715d1e>}
[2018-10-17T11:30:48,911][DEBUG][logstash.filters.grok    ] Event now:  {:event=>#<LogStash::Event:0x7c715d1e>}
[2018-10-17T11:30:48,914][DEBUG][logstash.pipeline        ] output received {"event"=>{"message"=>"", "@version"=>"1", "@timestamp"=>2018-10-17T00:30:48.709Z, "tags"=>["gps_osm", "_rubyexception", "_grokparsefailure"], "json"=>nil, "host"=>"61.5.210.22"}}
[2018-10-17T11:30:48,922][DEBUG][logstash.outputs.file    ] File, writing event to file. {:filename=>"/var/log/logstash/elk-gps_gsat.log"}
[2018-10-17T11:30:48,925][DEBUG][logstash.outputs.file    ] Starting stale files cleanup cycle {:files=>{"/var/log/logstash/elk-gps_gsat.log"=>#<IOWriter:0x3f8be026 @active=true, @io=#<File:/var/log/logstash/elk-gps_gsat.log>>}}
[2018-10-17T11:30:48,928][DEBUG][logstash.outputs.file    ] 0 stale files found {:inactive_files=>{}}
[2018-10-17T11:30:48,936][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_ffc11d0a-760a-4ab7-bb84-c75333428138"
[2018-10-17T11:30:48,937][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-10-17T11:30:48,937][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-10-17T11:30:48,939][DEBUG][logstash.codecs.json     ] config LogStash::Codecs::JSON/@id = "json_d32512c7-e3a5-4564-a6a3-c8969dea5030"
[2018-10-17T11:30:48,939][DEBUG][logstash.codecs.json     ] config LogStash::Codecs::JSON/@enable_metric = true
[2018-10-17T11:30:48,939][DEBUG][logstash.codecs.json     ] config LogStash::Codecs::JSON/@charset = "UTF-8"
[2018-10-17T11:30:49,027][DEBUG][logstash.outputs.file    ] File, writing event to file. {:filename=>"/var/log/logstash/elk-gps_osm.log"}
[2018-10-17T11:30:49,027][DEBUG][logstash.outputs.file    ] Starting stale files cleanup cycle {:files=>{"/var/log/logstash/elk-gps_osm.log"=>#<IOWriter:0x3b3d15ec @active=true, @io=#<File:/var/log/logstash/elk-gps_osm.log>>}}
[2018-10-17T11:30:49,028][DEBUG][logstash.outputs.file    ] 0 stale files found {:inactive_files=>{}}

Any idea?

All configuration files in a Logstash pipeline are concatenated and count as one, so all filters and outputs apply to all events unless you use conditionals. This is an extremely common misconception, so you should easily find multiple old threads about this.
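Concretely, in your configs the elasticsearch output is guarded by a tag conditional but the file output is not, so the file output in each file receives events from both inputs. A sketch of a fix (reusing the names and paths from your second file) is to move the file output inside the same conditional:

output {
  if "gps_osm" in [tags] {
    elasticsearch {
      hosts => ['localhost:9200']
      index => 'gps@elm'
      document_type => '_doc'
      user => 'elastic'
      password => 'element'
    }
    # Now only events tagged by the port-52003 input reach this file
    file {
      codec => rubydebug
      path => '/var/log/logstash/elk-gps_osm.log'
    }
  }
}

Alternatively, since Logstash 6.0 you can keep the two configs fully isolated by declaring them as separate pipelines in pipelines.yml (each entry with its own pipeline.id and path.config), so events from one input never reach the other file's outputs at all.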

Thanks for your answer, I will have a look.
