From Logstash, how to look up data in an Elasticsearch index?

I need some support.

I have created a lookup index in Elasticsearch:
index - lookup
Fields: Username;Identifier;First name;Last name

In Logstash, I need to use the Elasticsearch index lookup in a filter to enrich the incoming input data, then store the result to an Elasticsearch index.

Questions / suggestions are welcome

You would use the elasticsearch filter plugin. Your use case seems pretty simple, am I missing something more?
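A minimal sketch of that filter (the event field opid, the target field username, and the connection details are assumptions based on the description above; adjust them to your data):

```
filter {
  elasticsearch {
    # Assumed cluster address and lookup index name
    hosts  => ["localhost:9200"]
    index  => "lookup"
    # Match the event's identifier (assumed to be in [opid]) against the lookup index
    query  => "Identifier:%{[opid]}"
    # Copy Username from the matched document into a new event field
    fields => { "Username" => "username" }
  }
}
```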

Hey, thanks @rugenl... Yes, it looks simple, but I haven't been able to implement it.

Let me detail the Logstash conf and the data used.

elastic index : lookup1

Identifier:l9012 Username:booker12 First name:Rachel Last name:Booker _id:R_ESQHsB2a-Go4z9V2H3 _type:_doc _index:lookup1 _score:0
Identifier:l2070 Username:grey07 First name:Laura Last name:Grey _id:SPESQHsB2a-Go4z9V2H3 _type:_doc _index:lookup1 _score:0
Identifier:l4081 Username:johnson81 First name:Craig Last name:Johnson _id:SfESQHsB2a-Go4z9V2H3 _type:_doc _index:lookup1 _score:0
Identifier:l9346 Username:jenkins46 First name:Mary Last name:Jenkins _id:SvESQHsB2a-Go4z9V2H3 _type:_doc _index:lookup1 _score:0
Identifier:l5079 Username:smith79 First name:Jamie Last name:Smith _id:S_ESQHsB2a-Go4z9V2H3 _type:_doc _index:lookup1 _score:0

data : sample.log


logstash conf : logstash-elastic-lookup.conf

input {
  file {
    path => "/home/csk/elk/logstash/sample.log"
    start_position => "beginning"
  }
}

filter {
  mutate {
    # Strip the "Testlog-" prefix so the remainder is valid JSON
    gsub => [ "message", "Testlog-", "" ]
  }
  if [message] =~ /^{.*}$/ {
    json {
      source => "message"
      target => "message_extract"
    }
    elasticsearch {
      hosts  => ["localhost:9200"]
      index  => "lookup1"
      query  => "Identifier:%{[message_extract][opid]}"
      fields => { "Username" => "test" }
    }
  } else {
    mutate {
      add_tag => [ "_testlogparsefailure" ]
    }
  }
}

output {
  file {
    path => "/home/csk/elk/logstash/sample_look.log"
    codec => json
  }
  stdout { codec => rubydebug }
}

The output is written to the file sample_look.log:


Expected output


Am I doing anything wrong? Please suggest.

The tag "_elasticsearch_lookup_failure" shows you're getting to the elasticsearch filter, but it's not working. Unless it's a typo here, you are looking in "lookup1", but said you created "lookup".

Is there anything in the logstash or elastic logs? Check the elastic audit log too if you have it. You can reproduce this query with curl to see if it works. Something like this:

curl -XGET "http://localhost:9200/index1/_search" -H 'Content-Type: application/json' -d'
{
  "query": {
    "match": {
      "terms": {
        "Identifier": "19012"
      }
    }
  }
}'
Does your Elastic stack require https?

Thanks @rugenl, it works on HTTP only.

I also executed the command in the console and am getting an error.
Data is available in the index lookup1.

I can also see the data in Kibana.

My requirement is:
When data comes to Logstash, I need to use a lookup table, which is an index (lookup1) in Elasticsearch, add a new field, and populate the username.

Could you please help here? I hope the logic is clear.

I don't use curl queries often, and constructing one I can't test is harder. Try this query; I'm pretty sure my first one is wrong. You can try it in Dev Tools to get it debugged too. I still think you are close.
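Something along these lines in Dev Tools (a sketch; the index name lookup1 and the sample value l9012 are taken from the lookup data above, and it assumes Identifier is indexed so an exact term match works):

```
GET lookup1/_search
{
  "query": {
    "term": {
      "Identifier": "l9012"
    }
  }
}
```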

Thanks @rugenl ...Yes i got the response for the query but how to incorporate same in logstash..

Are you sure that there is nothing in the logs for either logstash or elasticsearch? We need to know what query the logstash filter is sending, so (I hope this is a linux host) you should be able to capture the net traffic with something like this:

tcpdump -i lo -s 1500 -w filter.pcap port 9200

Start tcpdump, run your logstash test, stop tcpdump and look at the captured file with something like wireshark. You should see a http request similar to the curl test. Hopefully there isn't a lot of other traffic to localhost:9200.

In the Logstash log I see an error:

[2021-08-18T20:51:03,117][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"lookup1", :error=>"[400] {\"error\":{\"root_cause\":[{\"type\":\"query_shard_exception\",\"reason\":\"No mapping found for [@timestamp] in order to sort on\",\"index_uuid\":\"t3tT7cyoRF6IXoG0WzBdew\",\"index\":\"lookup1\"}],\"type\":\"search_phase_execution_exception\",\"reason\":\"all shards failed\",\"phase\":\"query\",\"grouped\":true,\"failed_shards\":[{\"shard\":0,\"index\":\"lookup1\",\"node\":\"TzAW1Nu9REufdUwdXFjXlQ\",\"reason\":{\"type\":\"query_shard_exception\",\"reason\":\"No mapping found for [@timestamp] in order to sort on\",\"index_uuid\":\"t3tT7cyoRF6IXoG0WzBdew\",\"index\":\"lookup1\"}}]},\"status\":400}"}

You'll have to look in the filter.pcap file to see what logstash sent to elasticsearch.

Ok, I finally see the error "no mapping found for @timestamp". The doc section for templates seems to imply that the simple query is adding size = 1 and a sort on @timestamp. You don't have @timestamp in your lookup index.

I think you may need to create a template and omit the sort.
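One way to do that (a sketch based on the filter's query_template option; the file path is an assumption) is to put the full query DSL, with a size but no sort clause, in a template file:

```
{
  "size": 1,
  "query": {
    "query_string": { "query": "Identifier:%{[message_extract][opid]}" }
  }
}
```

and then reference that file from the filter instead of the query option:

```
elasticsearch {
  hosts          => ["localhost:9200"]
  index          => "lookup1"
  query_template => "/home/csk/elk/logstash/lookup-template.json"
  fields         => { "Username" => "test" }
}
```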

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.