Logstash filter: updating nested document event by querying ES

Hi,

I have a requirement where I need to join two indexes.

"test-a-2017-02-22" index doc structure:

{
   "event": {
          "a": "A",
          "b": "B"
    }
}

"test-b" doc structure:

{
   "event": {
        "a": "A",
         "x": "X"
    }
}

Now I need to be able to show, in Kibana, all the documents in the "test-a-2017-02-22" index that are missing a matching entry in "test-b". I wasn't able to do this with aggregations in Visualize, even though the two indexes start with the same pattern, as suggested in another post. I won't be able to put the two in the same index under different types, since the first index is a daily index and the second is static.

The approach I am taking now is to add the "event.x" field from the second index to each document in the first index before it is indexed, using the Logstash elasticsearch filter. Here are the different filters that I tried.

Option 1:

filter {
  json {
    source => "message"
    target => "event"
  }

  elasticsearch {
      hosts => ["xxxxx.com:9200"]
      user => "xxx"
      password => "xxx"
      index => "test-b"
      query => "event.a.keyword: %{[event][a]}"
      fields => { "event.x" => "[event][x]" }
      add_field => { "[event][x]" => "%{event.x}" }

   }

}

Option 2:

filter {

  json {
    source => "message"
    target => "event"
  }

  elasticsearch {
      hosts => ["xxxxx.com:9200"]
      user => "xxx"
      password => "xxx"
      index => "test-b"
      query => "[event][a]: %{[event][a]}"
      fields => { "[event][x]" => "[event][x]" }
      add_field => { "[event][x]" => "%{[event][x]}" }

   }

}

Neither option adds or updates the field. What am I doing wrong? I have installed the plugin.
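From my reading of the filter docs, my best guess at a working combination is Option 1's query syntax (Elasticsearch dot notation in the query string) together with Option 2's fields mapping (Logstash bracket notation), dropping add_field entirely since fields already copies the value onto the event. This is an untested sketch with the same placeholder hosts/credentials as above:

```
filter {
  json {
    source => "message"
    target => "event"
  }

  elasticsearch {
      hosts => ["xxxxx.com:9200"]
      user => "xxx"
      password => "xxx"
      index => "test-b"
      # query is an Elasticsearch query_string, so it uses dot notation
      query => "event.a.keyword:%{[event][a]}"
      # fields maps hit fields onto the event using Logstash bracket
      # notation; no separate add_field should be needed
      fields => { "[event][x]" => "[event][x]" }
  }
}
```

Is that the right way to combine the two syntaxes, or is something else off?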

Also, if "event.x" is mapped as both keyword and text, does it affect how I read "x"?
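For context, by "both keyword and text" I mean the default dynamic multi-field mapping that Elasticsearch creates for string values, something like:

```
"x": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
}
```

With this mapping, "event.x" is analyzed text while "event.x.keyword" holds the exact value, which is why I'm unsure which one the filter query and fields options should reference.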

Thanks
