Filter Plugin: Elasticsearch

Hello,
My main goal is to get a tag (TRANSFER ID) from a previous log event and include it in my current log event. I decided to use the "Elasticsearch Filter" plugin, but I'm having trouble with it:

Failed to query elasticsearch for previous event { <
........
.......
SOME LOGS EVENTS
.......
.......
 >}>>, :error=>#<Elasticsearch::Transport::Transport::Errors::MovedPermanently: [301] <HTML>
<HEAD><TITLE>Redirection</TITLE></HEAD>
<BODY><H1>Redirect</H1></BODY>
>, :level=>:warn} 

Here is my code:

input {
    file {
        path => ["/home/christian/Documents/ELK/logs-interop/oma-interop-app.log-20160207"]
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
    stdin {}
}
filter {
    grok {
        patterns_dir => "./patterns"
        match => ["message", "%{DEFAULT}"]
    }
    elasticsearch {
        hosts => ["127.0.0.1"]
        query => "session_id:2"
        fields => ["TRANSFER ID", "test_id"]
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        hosts => "127.0.0.1"
        index => "logstash-interop-app"
    }
}

Could you help me, please?

Sincerely,
Chris

Looks like something is proxying Elasticsearch, which the plugin cannot handle.

Yes, with the proxy disabled it works. Is there a way to configure the plugin to handle the proxy?

Not according to the docs.
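If the proxy comes from the http_proxy/https_proxy environment variables, one thing you could try (just a guess, I haven't tested it) is excluding local addresses by setting no_proxy=127.0.0.1,localhost in the environment before starting Logstash. Otherwise, pointing hosts at a node the proxy doesn't sit in front of should work, as you've seen.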

Thank you warkolm for your answers.

However, I've got another issue with this plugin.

For example, I have these two log events in a file:

2016-02-06T04:45:48.186Z INFO      iop_mg_1454733936.741_433 task            Interop\TransferService >> OUTGOING REQUEST to telma MMITransaction  {"transaction_Id":"iop_mg_1454733915.236_784"}
2016-02-06T04:46:01.092Z INFO      iop_mg_1454733936.741_433 task            Sending SMS to=000000000, encoding=UTF-8, text=ok. 

I want the second event ("Sending SMS") to retrieve the "transaction_Id" tag from the previous event, using the shared session_id (iop_mg_1454733936.741_433).
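For reference, my DEFAULT pattern lives in ./patterns; a rough sketch of an equivalent grok filter for these lines (it would still need extra patterns to pull out classname and transaction_Id) could look like this:

grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel}%{SPACE}%{NOTSPACE:session_id}%{SPACE}%{WORD:component}%{SPACE}%{GREEDYDATA:logmessage}"]
}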

Here is my filter:

elasticsearch {
    hosts => ["127.0.0.1"]
    query => "session_id:%{session_id} AND _exists_:transaction_Id"
    fields => ["transaction_Id", "verif_id"]
}
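(If I read the docs correctly, fields copies transaction_Id from the matched document into verif_id on the current event.)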

When I run Logstash, the transaction_Id is not found. I think this is because Logstash takes some time to index the first event into Elasticsearch: when it filters the second log event, the query finds nothing because the first event has not been saved in Elasticsearch yet.

I tried using stdin{} instead of a file as input, inserting my log events one by one with an interval of 2 seconds, and it works. But with a shorter interval, for example 1 second, it doesn't.

How can I solve this ?

Sorry for my English; I hope you have understood my issue.

Sincerely,
Chris.

I found a solution for retrieving data from a previous event using a Ruby filter.

if [transaction_Id] {
    ruby {
        init => "@@map = {}"
        code => "
            # remember the transaction_Id seen for this session
            @@map[event['session_id']] = event['transaction_Id']
        "
    }
} else if [classname] == "SendingSMS" {
    ruby {
        code => "
            # copy the transaction_Id recorded for the same session, if any
            if @@map.key?(event['session_id'])
                event['transaction_Id'] = @@map[event['session_id']]
            end
        "
    }
}
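Note that @@map is a class variable shared across Ruby filter instances, so this only works reliably with a single filter worker (run Logstash with -w 1); with several workers the two events may be processed on different threads or out of order.

The aggregate filter can do the same correlation without class variables. A sketch, assuming the logstash-filter-aggregate plugin is installed and session_id is the correlation key:

if [transaction_Id] {
    aggregate {
        task_id => "%{session_id}"
        # store the transaction_Id in the map for this session
        code => "map['transaction_Id'] = event['transaction_Id']"
        map_action => "create_or_update"
    }
} else if [classname] == "SendingSMS" {
    aggregate {
        task_id => "%{session_id}"
        # copy the stored transaction_Id onto the current event
        code => "event['transaction_Id'] = map['transaction_Id']"
        map_action => "update"
    }
}

Like the Ruby approach, aggregate needs a single filter worker (-w 1).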

Sincerely,
Chris.

Did you make it work?

I did the same thing and I get this error:

[logstash.agent ] Error in reactor loop escaped: Bad file descriptor - Bad file descriptor (Errno::EBADF)

Could you help me?

And sorry for my English... I'm learning.