Event routing to indexes not working

Hi,
I am shipping Windows and Unix logs to Logstash and want these events ingested into two different indexes. Below is my config file.
input {
  beats {
    port => "5042"
  }
}
filter {
  if "winlogbeat" in [agent][type] {
    mutate {
      add_tag => ["windows_event"]
    }
  }
  if "audit.log" in [log][file][path] {
    mutate {
      add_tag => ["audit_logs"]
    }
  }
  if "Security" in [winlog][channel] {
    if [event][code] not in [ 472,529,530,531,532,533,534,535,536,537,539,4625,4656,4673,4771 ] {
      drop {}
    }
  }
  if "Application" in [winlog][channel] {
    if [event][code] not in [ 2,4,90,18456,4363 ] {
      drop {}
    }
  }
  if "System" in [winlog][channel] {
    if [event][code] not in [ 6013 ] {
      drop {}
    }
  }
}
output {
  if "unix_logs" in [tags] {
    elasticsearch {
      hosts => ["http://x.x.x.x:9250", "http://x.x.x.x:9250"]
      user => "elastic"
      password => "${ES_PWD}"
      index => "unix_logs"
    }
  }
  if "windows_logs" in [tags] or "windows_event" in [tags] {
    elasticsearch {
      hosts => ["http://x.x.x.x:9250", "http://x.x.x.x:9250"]
      user => "elastic"
      password => "${ES_PWD}"
      index => "windows_logs"
    }
  }
  else {
    elasticsearch {
      hosts => ["http://x.x.x.x:9250", "http://x.x.x.x:9250"]
      user => "elastic"
      password => "${ES_PWD}"
      index => "misc"
    }
  }
}

All data coming from Unix systems has the tag "linux_logs", and data from Windows has "windows_logs".
But the data is not routing properly to Elasticsearch: some events with the tag "unix_logs" end up in the "misc" index. For Windows it works fine.

Is there anything in the config file that I am missing?

Thanks in advance !

Your conditionals are not chained: the else belongs only to the second if. An event with "unix_logs" in [tags] matches the first conditional and is written to the unix_logs index, but it also fails the second conditional, so the else branch fires and writes the same event to the misc index. Events with unix_logs in [tags] will therefore appear in two indexes. Perhaps you want that to be

  if "unix_logs" in [tags] {
  } else if "windows_logs" in [tags] or "windows_event" in [tags] {
  } else {
  }
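Applied to your output section, the chained version would look something like this (keeping your hosts, credentials, and index names exactly as in your original config):

  output {
    if "unix_logs" in [tags] {
      elasticsearch {
        hosts => ["http://x.x.x.x:9250", "http://x.x.x.x:9250"]
        user => "elastic"
        password => "${ES_PWD}"
        index => "unix_logs"
      }
    } else if "windows_logs" in [tags] or "windows_event" in [tags] {
      elasticsearch {
        hosts => ["http://x.x.x.x:9250", "http://x.x.x.x:9250"]
        user => "elastic"
        password => "${ES_PWD}"
        index => "windows_logs"
      }
    } else {
      elasticsearch {
        hosts => ["http://x.x.x.x:9250", "http://x.x.x.x:9250"]
        user => "elastic"
        password => "${ES_PWD}"
        index => "misc"
      }
    }
  }

With the else if chain, each event matches exactly one branch, so nothing is written to more than one index.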

Thanks, it is working now.