Lookups with memcached or translate filter

Hi,

I have a question about lookups. Recently I implemented a pipeline using the Memcached filter plugin, but it does not work well with json values. This is what I needed to do at work:

    memcached {
        hosts => ["localhost"]
        get => {
            "%{ip}" => "[host_detail_tmp]"
        }
        add_tag => ["key_found"]
    }

    if "key_found" in [tags] {
        dissect {
            mapping => {
                "host_detail_tmp" => "%{hostname}::%{type}::%{vendor}::%{model}"
            }
        }
    }

The translate filter plugin looks like it handles json values better. I chose Memcached because I need the keys to always be up to date.

Performance-wise, which filter is better? And how can I keep the json file updated when using the translate plugin?

Thanks.

Paulo

You can keep your keys updated with the translate filter as well: point it to a file with your external dictionary and set a refresh interval, and Logstash will reload the keys and values whenever they change.
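For example, a minimal sketch, assuming a recent version of the plugin (older versions use field and destination instead of source and target) and a hypothetical dictionary path:

    translate {
        source => "[ip]"
        target => "[host_detail]"
        dictionary_path => "/etc/logstash/host_details.json"
        # re-read the file every 300 seconds (the default)
        refresh_interval => 300
    }

The dictionary file itself is just a flat JSON object, e.g. { "10.0.0.1": "server01::switch::cisco::c9300" }, so any process can rewrite that file and Logstash will pick up the changes on the next refresh. I think newer versions of the plugin also accept nested objects as values, which would let you skip the dissect step.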

If you want an example, I wrote a blog post about the translate filter a while ago.

About performance, both are pretty fast, but since the translate filter looks up the key-value pairs in the memory of the Logstash process, it will be a little faster. However, it has some limitations on the size of the dictionary.

I also wrote an older blog post about some of the differences between translate and memcached.

The main advantage of memcached is that you can have larger dictionaries, and multiple Logstash instances can connect to it if needed.
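For example (a sketch, with hypothetical host and field names), one pipeline could write entries with the filter's set option while the others read them back with get, the way you already do:

    memcached {
        hosts => ["memcached-host:11211"]
        # store this event's detail string under a key derived from its ip
        set => {
            "[host_detail]" => "%{ip}"
        }
    }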

If your memcached is on the same host as your Logstash, you could also configure it to listen on a unix socket and point the filter at that socket. This is not in the documentation, but it works. I made a PR to add it to the documentation, but it is still waiting for a review from someone at Elastic.
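It would look something like this, assuming memcached was started with -s /var/run/memcached/memcached.sock:

    memcached {
        # path to the unix socket instead of "host:port"; undocumented,
        # but supported by the underlying client
        hosts => ["/var/run/memcached/memcached.sock"]
        get => {
            "%{ip}" => "[host_detail_tmp]"
        }
    }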
