Getting logs in other index

Hi Guys,

I am facing a kind of weird issue! I have written two config files and created two different indices, which index my Bind and Apache logs.

However, while my Bind logs are getting parsed successfully, the logstash-apache index is also showing Bind logs, and I am not sure why.

Here are the configs.

Bind config:

input {
  ##############STDIN
  stdin {
    type => "dnsmaldef"
  }
  ##########STDIN

  file {
    path => ["/var/log/named/queries.*"]
    start_position => "beginning"
    type => "dnsmaldef"
  }
}

output {
  if [type] == "dnsmaldef" {
    stdout {
      codec => rubydebug
    }

    elasticsearch {
      hosts => ["dnsdf.isnlab.in:9200"]
      index => "logstash-dnsmaldef-%{+YYYY.MM.dd}"
      template => "/etc/logstash/logstash-template.json"
    }
  }
}


Apache config:

input {
  # add necessary input parameters
  stdin {
  }

  file {
    path => ["/var/log/httpd/access*"]
    start_position => "beginning"
  }
}

output {
  # add necessary output parameters
  stdout {
    codec => rubydebug
  }

  elasticsearch {
    hosts => ["dnsdf.isnlab.in:9200"]
    index => "logstash-apache-%{+YYYY.MM.dd}"
    template => "/etc/logstash/logstash-template.json"
  }
}

See Logstash sending to all output hosts. Since that post was written, Logstash 6 has been released, which supports multiple pipelines (if explicitly configured).
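For reference, separate pipelines are declared in pipelines.yml; with something like the sketch below, each config file gets its own pipeline and events no longer cross over (the pipeline IDs and file paths here are assumptions — adjust them to your actual install):

```yaml
# pipelines.yml (typically under /etc/logstash/) — hypothetical IDs and paths
- pipeline.id: bind
  path.config: "/etc/logstash/conf.d/bind.conf"
- pipeline.id: apache
  path.config: "/etc/logstash/conf.d/apache.conf"
```

With this in place, each pipeline has its own inputs and outputs, so the conditionals become unnecessary for keeping the indices separate.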

I am sorry, but I did not get anything out of that. I am still not sure what went wrong with my pipeline.

Can you please elaborate more on this?

You have a conditional wrapping the elasticsearch output in your Bind configuration file:

if [type] == "dnsmaldef" {
  ...
}

You don't have that kind of conditional wrapping the elasticsearch output in your Apache configuration file. Since Logstash concatenates all configuration files into a single pipeline, events from the other file will therefore also be routed to that output.
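As an illustration, the Apache output could be wrapped the same way (this is only a sketch — it assumes the Apache file input tags its events with type => "apache", which your posted config does not yet do):

```conf
output {
  if [type] == "apache" {
    stdout {
      codec => rubydebug
    }

    elasticsearch {
      hosts => ["dnsdf.isnlab.in:9200"]
      index => "logstash-apache-%{+YYYY.MM.dd}"
      template => "/etc/logstash/logstash-template.json"
    }
  }
}
```

With both outputs guarded by mutually exclusive type conditionals, each event only reaches the output for its own index.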

OK - I disabled it, restarted Logstash, and recreated the index pattern, but it is still the same :frowning:

Please show your configuration and an example of an event that ended up in the wrong index. You can copy/paste the raw JSON document from Kibana's JSON tab.

Here is my config for apache.conf:

input {
  # add necessary input parameters
  stdin {
  }

  file {
    path => ["/var/log/httpd/access_*"]
    start_position => "beginning"
    type => "apache"
  }
}

filter {
  if [type] == "apache" {
    # for Apache access logs
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}

output {
  # add necessary output parameters
  stdout {
    codec => rubydebug
  }

  if [type] == "apache" {
    elasticsearch {
      hosts => ["dnsdf.isnlab.in:9200"]
      index => "logstash-apache-%{+YYYY.MM.dd}"
    }
  }
}

And here is a log line:
92.168.5.102 - - [11/Dec/2017:08:58:57 +0530] "GET / HTTP/1.1" 200 44546 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36"

Please answer my second question (the one about an example event). Also, what other configuration files do you have?

Well, you know, I was playing with the configuration, and logs are not even getting indexed with the said config file; however, stdin & stdout are showing the proper indexes.

Please answer all questions.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.