Enhance ELK data records with an external source?

Hello Logstash Gurus,

I have built a server that receives network data from external sources. That data goes into Logstash, then Elasticsearch, and is then displayed in Kibana. Everything works great. (I’m using ELK version 7.4.0 in Docker containers… yes, I know I have to upgrade.)

Right now, the network data flowing into ELK is pretty bare-bones. My boss would like to see it enriched with an external data source. I don’t have a specific source in mind yet, because first I need to think about how I might implement this enrichment within an ELK environment. There is no option to enrich the data before it arrives in Logstash.

The first idea that occurs to me is to use the "path" option of the ruby filter, essentially "bouncing" all data records through a local Ruby script. Here is the filter section of my Logstash config file:

filter {
  ruby {
    # Bounce all data records through this script:
    path => "/home/me/magicScript.rb"
  }
}

And then, that Ruby script might be:

def filter(event)
  # Parse event for key data, send that to external data source.
  # When the external data source replies, modify the event to
  # include new fields with the new information.
  return [event]
end
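To make the sketch concrete, here is a minimal version of what that script could look like. The enrichment key (`src_ip`), the new field (`threat_label`), and the lookup table are all hypothetical, and the external source is stood in for by a local hash; in practice that lookup would be an HTTP call or database query. The `Event` stub exists only so the script can be exercised outside Logstash and should be removed when running inside the real ruby filter, which supplies its own event object with the same `get`/`set` API.

```ruby
# Stub standing in for Logstash's Event API (get/set only) so this
# script can be tested locally. Remove when running inside Logstash.
class Event
  def initialize(data)
    @data = data
  end

  def get(key)
    @data[key]
  end

  def set(key, value)
    @data[key] = value
  end
end

# Hypothetical external data source; in production this would be an
# API call or database query rather than a local hash.
THREAT_LABELS = { "10.0.0.5" => "known-scanner" }.freeze

# Logstash calls filter(event) once per record and indexes whatever
# events are returned in the array.
def filter(event)
  ip = event.get("src_ip")
  # Enrich the event with a new field based on the lookup result.
  event.set("threat_label", THREAT_LABELS.fetch(ip, "unknown"))
  [event]
end
```

One thing to note with this approach: the script runs synchronously per event, so a slow external call will throttle the whole pipeline.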

In this way, every data record flowing through Logstash would be “bounced” through Ruby and enhanced with the extra data my boss is requesting.

I think this approach would likely work, but it may not be optimal, and I have to think about performance. What other options would you recommend? Is there a way to insert a module or a database lookup into either Logstash or Elasticsearch? Many thanks!

The dns, elasticsearch, geoip, http, jdbc_static, jdbc_streaming, and memcached filters can all be used to enrich events.
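Of those, the http filter is probably the closest drop-in for the "bounce through an external source" idea, without a custom script. A minimal sketch, assuming a hypothetical lookup service and a hypothetical src_ip field on each event:

```
filter {
  http {
    # Call an external enrichment service once per event (hypothetical URL):
    url => "http://enrich.example.com/lookup"
    verb => "GET"
    query => { "ip" => "%{src_ip}" }
    # Store the decoded response body in a new field on the event:
    target_body => "[enrichment]"
  }
}
```

This keeps the enrichment in Logstash configuration rather than Ruby, though it still costs one HTTP round trip per event; a cached or local source (memcached, jdbc_static) may scale better for high event rates.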

Awesome, thank you, this is a great jumping-off point for my research. Really appreciate the assist! :slight_smile:
