Documents from Logstash not showing up in Elasticsearch

Hello Logstash and Elasticsearch users,

Once again I did something without fully knowing how it works or what I did.

I have a topic in the Logstash section where I got some help with a problem of events not ending up in the right index. One part is working perfectly now and one part is not: my syslog events are now all arriving in my syslog index, but my ossec events are not showing up anywhere. Yet if I run the config manually with either of the debug configs below, it works perfectly.

At this moment the config is this:

input {
  lumberjack {
    port => 5003
    type => "lumberjack"
    ssl_certificate => "/etc/logstash/logstash-forwarder.crt"
    ssl_key => "/etc/logstash/logstash-forwarder.key"
    codec => json
    tags => ["ossec"]
  }
}

filter {
   if "ossec" in [tags] {
   geoip {
      source => "srcip"
      target => "geoip"
      database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][location]", "%{[geoip][latitude]}"  ]
    }
    date {
        match => ["timestamp", "YYYY MMM dd HH:mm:ss"]
        target => "@timestamp"
    }
    mutate {
      convert => [ "[geoip][location]", "float"]
      rename => [ "hostname", "AgentName" ]
      rename => [ "geoip", "GeoLocation" ]
      rename => [ "file", "AlertsFile" ]
      rename => [ "agentip", "AgentIP" ]
      rename => [ "[rule][comment]", "[rule][description]" ]
      rename => [ "[rule][level]", "[rule][AlertLevel]" ]
      remove_field => [ "timestamp" ]
    }
  }
}

output {
   if "ossec" in [tags] {
    elasticsearch {
         hosts => ["bcksrv16"]
         index => "ossec-%{+YYYY.MM.dd}"
         #document_type => "ossec"
         #template => "/etc/logstash/elastic-ossec-template.json"
         #template_name => "ossec"
         #template_overwrite => true
    }
  }
}
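
To check whether any documents reach the index at all, Elasticsearch can be asked directly. A minimal check, assuming the default HTTP port 9200 on the configured host:

# List all ossec-* indices with their document counts:
curl 'http://bcksrv16:9200/_cat/indices/ossec-*?v'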

So I tried this:

input {
        stdin {
                codec => json
                tags => ["ossec"]
        }
}


filter {
   if "ossec" in [tags] {
   geoip {
      source => "srcip"
      target => "geoip"
      database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][location]", "%{[geoip][latitude]}"  ]
    }
    date {
        match => ["timestamp", "YYYY MMM dd HH:mm:ss"]
        target => "@timestamp"
    }
    mutate {
      convert => [ "[geoip][location]", "float"]
      rename => [ "hostname", "AgentName" ]
      rename => [ "geoip", "GeoLocation" ]
      rename => [ "file", "AlertsFile" ]
      rename => [ "agentip", "AgentIP" ]
      rename => [ "[rule][comment]", "[rule][description]" ]
      rename => [ "[rule][level]", "[rule][AlertLevel]" ]
      remove_field => [ "timestamp" ]
    }
  }
}

output {
   if "ossec" in [tags] {
    elasticsearch {
         hosts => ["bcksrv16"]
         index => "ossec-%{+YYYY.MM.dd}"
         document_type => "ossec"
         template => "/etc/logstash/elastic-ossec-template.json"
         template_name => "ossec"
         template_overwrite => true
    }
  }
}

and:

input {
  lumberjack {
    port => 5003
    type => "lumberjack"
    ssl_certificate => "/etc/logstash/logstash-forwarder.crt"
    ssl_key => "/etc/logstash/logstash-forwarder.key"
    codec => json
    tags => ["ossec"]
  }
}

filter {
   if "ossec" in [tags] {
   geoip {
      source => "srcip"
      target => "geoip"
      database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][location]", "%{[geoip][latitude]}"  ]
    }
    date {
        match => ["timestamp", "YYYY MMM dd HH:mm:ss"]
        target => "@timestamp"
    }
    mutate {
      convert => [ "[geoip][location]", "float"]
      rename => [ "hostname", "AgentName" ]
      rename => [ "geoip", "GeoLocation" ]
      rename => [ "file", "AlertsFile" ]
      rename => [ "agentip", "AgentIP" ]
      rename => [ "[rule][comment]", "[rule][description]" ]
      rename => [ "[rule][level]", "[rule][AlertLevel]" ]
      remove_field => [ "timestamp" ]
    }
  }
}
output {
  if "ossec" in [tags] {
    stdout { codec => rubydebug }
  }
}

Both test cases do what you would expect: one fills up my screen quite fast because of the volume of events, and the other takes the event from stdin and enters it into Elasticsearch. So why doesn't the automated one work?
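
For reference, the manual runs look roughly like this; the file paths and the sample event are illustrative, not the exact ones used:

# Feed one sample ossec-style event to the stdin test config
# (field names match what the filter expects; values are made up):
echo '{"srcip":"203.0.113.5","hostname":"agent01","timestamp":"2016 Jan 05 12:34:56"}' \
  | /opt/logstash/bin/logstash -f /etc/logstash/test-stdin.conf

# Run the lumberjack test config and watch events arrive on stdout:
/opt/logstash/bin/logstash -f /etc/logstash/test-lumberjack.conf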

This doesn't seem like an Elasticsearch question?

Hi Mark,

I've also asked the question in the Logstash section but haven't had a response for 2 weeks.

Because of that I thought I'd try here; maybe someone has an idea.

Greetings, Richard

Do they show up if you use stdout?

When using stdout they are sent to the screen, as expected. And when I use stdin with Elasticsearch as the endpoint, it also works.

Any errors in ES or LS logs?
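
On a default package install they usually live in these locations (paths assume a standard package layout; adjust for your setup):

# Logstash log:
tail -f /var/log/logstash/logstash.log

# Elasticsearch log (named after the cluster, "elasticsearch" by default):
tail -f /var/log/elasticsearch/elasticsearch.log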