Trouble using elasticsearch filter plugin

Hi,

I'm a bit new to SSL/TLS. I have managed to get Logstash working with Search Guard/Elasticsearch (elasticsearch output) on a test server, but when I try to use the elasticsearch filter plugin I get a ConnectionFailed warning in Logstash and nothing from Elasticsearch.

logstash.conf

filter {
...
    elasticsearch {
      hosts => ["https://127.0.0.1:9200"]
      index => "logstash-index-ref"
      user => "logstash"
      password => "*****"
      ssl => true
      query => "BusinessEmail:%{user}"
      fields => { "FirstName" => "FirstName" }
    }
...
}
...
output {
  elasticsearch {   
    hosts => ["https://127.0.0.1:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    user => "logstash"
    password => "*****"
    ssl => true
    ssl_certificate_verification => false
    truststore =>  "/etc/elasticsearch/truststore.jks" 
    truststore_password =>  "changeit"
  }
  stdout { codec => rubydebug }
}

sg_roles.yml
sg_logstash:
  cluster:
    - indices:admin/template/get
    - indices:admin/template/put
    - indices:data/write/bulk*
  indices:
    'logstash-*':
      '*':
        - CRUD
        - CREATE_INDEX
    '*beat*':
      '*':
        - CRUD
        - CREATE_INDEX

I'm using the Search Guard demo. Security works across Logstash -> ES -> Kibana, except that I just can't get the elasticsearch filter plugin to work.

logstash-plain.log
[2017-09-10T13:31:19,688][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"logstash-index-ref", :query=>"BusinessEmail:-", :event=>2017-09-10T12:30:54.824Z ubuntu 2017-09-07T04:37:47.805788Z .***.***. -
2017-09-07T04:37:47.808586Z 3.798ms
HTTP/1.1 401 Unauthorized
Content-Type: application/json;charset=UTF-8

{
  "responseCode": 401
}
2017-09-07T04:37:47.809702Z 4.914ms


, :error=>#<Faraday::ConnectionFailed>}
[2017-09-10T13:31:19,736][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"logstash-index-ref", :query=>"BusinessEmail:test@test.com", :event=>2017-09-10T12:30:54.824Z ubuntu 2017-09-07T04:37:48.216151Z ***.***.***.*** test@test.com
2017-09-07T04:37:48.852590Z 637.439ms
HTTP/1.1 200 OK
Content-Type: application/json;charset=UTF-8


{
  "responseCode": 200
}
2017-09-07T04:37:48.853740Z 638.589ms


, :error=>#<Faraday::ConnectionFailed>}

Does anyone have an idea? The elasticsearch filter plugin docs don't list the other security options that the elasticsearch output plugin has.

Cinto

Don't you need to put the security certificate path in the filter? Have you tried the ca_file setting?
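
Something along these lines, perhaps (just a sketch based on your config above; the ca_file path is a placeholder for wherever your CA certificate lives, in PEM format):

filter {
  elasticsearch {
    hosts => ["https://127.0.0.1:9200"]
    index => "logstash-index-ref"
    user => "logstash"
    password => "*****"
    ssl => true
    # placeholder path: point this at your CA certificate (PEM)
    ca_file => "/path/to/ca.pem"
    query => "BusinessEmail:%{user}"
    fields => { "FirstName" => "FirstName" }
  }
}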

Thanks. I have been trying the following ca_file options by trial and error (I'm not sure which one to use, so I just tried each of them):

Using self-signed certificates (listed in keystore.jks)
ca_file => "/dir/server.cer"
ca_file => "/dir/server.pem"

And the jks files from the demo
ca_file => "/etc/elasticsearch/trustore.jks"
ca_file => "/etc/elasticsearch/keystore.jks"
ca_file => "/etc/elasticsearch/kirk.jks"

Nothing has worked; each one produces errors similar to the ones pasted above.

Adding some details:
- Elastic Stack 5.5.0
- Ubuntu 16.04
- ES and Logstash are on the same machine
- I'm also able to curl --insecure 127.0.0.1 without specifying a certificate file

Still stuck, still hopeful. Could this issue be in any way related to this changelog entry?

Fixes in master and 6.0
Elasticsearch Filter: Support ca_file setting when using https URI in hosts parameter (#58).

You seem to have figured it out! The links do look relevant.

However, I'd ask someone more senior on this to comment.

Is there a viable workaround for enriching the data (without JDBC)? For example, would using two or three translate filters as lookups instead be safe in terms of performance?
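
For reference, the sort of thing I have in mind is roughly this (a rough sketch only; the dictionary path, field names and fallback value are made up, and it assumes a YAML dictionary keyed by email address):

filter {
  translate {
    # hypothetical dictionary file mapping BusinessEmail -> FirstName
    dictionary_path => "/etc/logstash/firstnames.yml"
    field => "user"
    destination => "FirstName"
    # value written to FirstName when the email is not in the dictionary
    fallback => "unknown"
  }
}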

Or should I just wait for ES 6?

Hi @cito.ets, I am facing the same issue with the elasticsearch filter. I compiled a gem from the master branch of logstash-filter-elasticsearch, which contains the bug fix.
I used a config with SSL set to false and ca_file pointing to the PEM file of my CA, but the configuration still fails with the error below:
[2017-09-27T13:16:01,437][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"logstash-XXX-task-his-*", :query=>{"query"=>{"bool"=>{"must_not"=>{"exists"=>{"field"=>"srvr_status"}}, "must"=>[{"match"=>{"srvr_user_name.keyword"=>"XXX"}}]}}, "_source"=>["row_id", "srvr_start_ts", "srvr_end_ts"]}, :event=>2017-09-27T12:16:00.378Z %{host} %{message}, :error=>#<Faraday::SSLError>

On the Elasticsearch server I get the error below:
[2017-09-27T12:39:01,315][WARN ][o.e.x.s.t.n.SecurityNetty4HttpServerTransport] [ES-DEV-NODE-1] caught exception while handling client http traffic, closing connection [id: 0x6a69b4ce, L:0.0.0.0/0.0.0.0:9203 ! R:/10.33.15.194:38707] io.netty.handler.codec.DecoderException: javax.net.ssl.SSLException: Received fatal alert: certificate_unknown

Will keep you posted if I make progress on getting it to work.

Hi @cito.ets, my configuration is finally working with the new gem file that I built. The SSL error was resolved by setting ca_file to the path of a .cer file that contains the chained certificates for my intermediate and root CA. Previously I was using only the intermediate CA certificate.
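
For anyone who finds this later, the relevant part of my filter block now looks roughly like this (host, port, index, credentials and query are placeholders; the important line is ca_file, where chain.cer is the intermediate CA certificate followed by the root CA certificate in a single PEM-encoded file):

filter {
  elasticsearch {
    # https URI in hosts, as supported by the fix referenced above
    hosts => ["https://my-es-host:9203"]
    index => "logstash-xxx-task-his-*"
    user => "logstash"
    password => "*****"
    query => "srvr_user_name.keyword:xxx"
    # chain.cer = intermediate CA certificate followed by the root CA
    # certificate, concatenated into a single PEM-encoded file
    ca_file => "/etc/logstash/certs/chain.cer"
  }
}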

