Adding fields from a CSV to a log event


#1

Hi,

I'm using Logstash to forward a static CSV file, and another Logstash instance to send log files.

I'm able to get both the CSV and the logs into the same index. They both have the same value in the "key" field.

Basically, what I want to do is: if the key in the CSV equals the key in a log event, add the other fields from that CSV row to the log event, so I can view all the fields in the same event in Kibana.

Another way to look at it:
Server 1 sends a CSV with:
key, username, email, company
h39hs, smith123, smith123@smith.com, smith inc

Server 2 sends a log with:
id - key - ip - - time GET query
0 - h39hs - 127.0.0.1 -- [22/Sep/2015:19:03:15 +0000] /some/query/here

In Kibana I want to see:
0 - h39hs - 127.0.0.1 -- [22/Sep/2015:19:03:15 +0000] /some/query/here smith123, smith123@smith.com, smith inc

The events match on the key, so I can see which email goes with which requests.

Is there a way to do this? I'm using the latest stable builds of Logstash, Elasticsearch, and Kibana.


(Mark Walkom) #2

This might be what you want: https://github.com/alcanzar/logstash-filter-memorize
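Another option, sticking to stock plugins, is the translate filter. It only supports two-column CSV dictionaries (key, value), so one workaround is to reshape the CSV so the remaining columns are joined into a single value, then split them back apart after the lookup. A minimal sketch (the dictionary path and field names are assumptions, adjust to your setup):

```
filter {
  # Look up the event's "key" in a two-column CSV dictionary.
  # Each dictionary line is "key,value", where the value is the
  # original row's other columns joined with e.g. semicolons:
  #   h39hs,smith123;smith123@smith.com;smith inc
  translate {
    field           => "key"
    destination     => "user_info"
    dictionary_path => "/etc/logstash/users.csv"   # assumed path
  }

  # If the lookup matched, split the combined value into fields.
  if [user_info] {
    mutate {
      split => { "user_info" => ";" }
    }
    mutate {
      add_field => {
        "username" => "%{[user_info][0]}"
        "email"    => "%{[user_info][1]}"
        "company"  => "%{[user_info][2]}"
      }
    }
  }
}
```

With this approach the enrichment happens at index time on the Logstash side that receives the log events, so the CSV no longer needs to be shipped into Elasticsearch as separate documents.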
