Documents not in Elasticsearch

Hello Logstash and Elasticsearch users,

Once again I did something without fully understanding how it works or what I did.

I have a topic in the Logstash forum where I got some help with a problem of events not ending up in the right index. One part is working perfectly now and one part is not: my syslog events all arrive in my syslog index, but my OSSEC events are not showing up anywhere. Yet if I run the config manually with either of the debug configs below, it works perfectly.

At this moment the config is this:

input {
  lumberjack {
    port => 5003
    type => "lumberjack"
    ssl_certificate => "/etc/logstash/logstash-forwarder.crt"
    ssl_key => "/etc/logstash/logstash-forwarder.key"
    codec => json
    tags => ["ossec"]
  }
}

filter {
  if "ossec" in [tags] {
    geoip {
      source => "srcip"
      target => "geoip"
      database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
    }
    date {
      match => ["timestamp", "YYYY MMM dd HH:mm:ss"]
      target => "@timestamp"
    }
    mutate {
      convert => [ "[geoip][location]", "float" ]
      rename => [ "hostname", "AgentName" ]
      rename => [ "geoip", "GeoLocation" ]
      rename => [ "file", "AlertsFile" ]
      rename => [ "agentip", "AgentIP" ]
      rename => [ "[rule][comment]", "[rule][description]" ]
      rename => [ "[rule][level]", "[rule][AlertLevel]" ]
      remove_field => [ "timestamp" ]
    }
  }
}

output {
  if "ossec" in [tags] {
    elasticsearch {
      hosts => ["bcksrv16"]
      index => "ossec-%{+YYYY.MM.dd}"
      #document_type => "ossec"
      #template => "/etc/logstash/elastic-ossec-template.json"
      #template_name => "ossec"
      #template_overwrite => true
    }
  }
}
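
One way to narrow this down is to catch everything that does not match the conditional. This is a minimal sketch, untested, and the file path is just an example; it writes any event that arrives without the "ossec" tag to a file, so I can see whether events are reaching the pipeline but being mis-tagged:

output {
  if "ossec" in [tags] {
    elasticsearch {
      hosts => ["bcksrv16"]
      index => "ossec-%{+YYYY.MM.dd}"
    }
  } else {
    # Events missing the "ossec" tag land here instead of disappearing.
    # /tmp/logstash-untagged.log is an example path, not from my real setup.
    file { path => "/tmp/logstash-untagged.log" }
  }
}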

So I tried this:

input {
  stdin {
    codec => json
    tags => ["ossec"]
  }
}


filter {
  if "ossec" in [tags] {
    geoip {
      source => "srcip"
      target => "geoip"
      database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
    }
    date {
      match => ["timestamp", "YYYY MMM dd HH:mm:ss"]
      target => "@timestamp"
    }
    mutate {
      convert => [ "[geoip][location]", "float" ]
      rename => [ "hostname", "AgentName" ]
      rename => [ "geoip", "GeoLocation" ]
      rename => [ "file", "AlertsFile" ]
      rename => [ "agentip", "AgentIP" ]
      rename => [ "[rule][comment]", "[rule][description]" ]
      rename => [ "[rule][level]", "[rule][AlertLevel]" ]
      remove_field => [ "timestamp" ]
    }
  }
}

output {
  if "ossec" in [tags] {
    elasticsearch {
      hosts => ["bcksrv16"]
      index => "ossec-%{+YYYY.MM.dd}"
      document_type => "ossec"
      template => "/etc/logstash/elastic-ossec-template.json"
      template_name => "ossec"
      template_overwrite => true
    }
  }
}
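
To test this one, I start Logstash with this config and paste a single JSON event on stdin. For example, a made-up line that only uses the field names the filter expects (all values are invented, and the timestamp matches the "YYYY MMM dd HH:mm:ss" pattern in the date filter):

{"srcip":"203.0.113.5","timestamp":"2016 Mar 02 13:37:00","hostname":"agent01","file":"/var/ossec/logs/alerts/alerts.log","agentip":"10.0.0.5","rule":{"comment":"Example alert","level":7}}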

and:

input {
  lumberjack {
    port => 5003
    type => "lumberjack"
    ssl_certificate => "/etc/logstash/logstash-forwarder.crt"
    ssl_key => "/etc/logstash/logstash-forwarder.key"
    codec => json
    tags => ["ossec"]
  }
}

filter {
  if "ossec" in [tags] {
    geoip {
      source => "srcip"
      target => "geoip"
      database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
    }
    date {
      match => ["timestamp", "YYYY MMM dd HH:mm:ss"]
      target => "@timestamp"
    }
    mutate {
      convert => [ "[geoip][location]", "float" ]
      rename => [ "hostname", "AgentName" ]
      rename => [ "geoip", "GeoLocation" ]
      rename => [ "file", "AlertsFile" ]
      rename => [ "agentip", "AgentIP" ]
      rename => [ "[rule][comment]", "[rule][description]" ]
      rename => [ "[rule][level]", "[rule][AlertLevel]" ]
      remove_field => [ "timestamp" ]
    }
  }
}
output {
   if "ossec" in [tags] {
   stdout { codec => rubydebug }
}

Both test cases do what you would expect: one fills up my screen quite fast because of the volume of events, and the other takes the event from stdin and indexes it into Elasticsearch. So why doesn't the automated one work?

Does the last one really work? As far as I can tell, a closing curly brace is missing from the output block. Is that a copy-and-paste error?

You are right: in what I copied, the last curly brace is missing. With it, it works.
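
For completeness, the debug output block with the closing brace restored:

output {
  if "ossec" in [tags] {
    stdout { codec => rubydebug }
  }
}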

Hi Guys,

I could really use some help here. If this is the wrong place for this topic, please let me know and I'll create a new one in the Elasticsearch forum.

Thanks
Meaglin