Not able to index documents using App Search (7.6.1)

I was able to successfully install App Search on Linux (Ubuntu) with Nginx as a reverse proxy in front of App Search. My configs are:

Nginx:

upstream kombare-search {
    server 127.0.0.1:3002;
    keepalive 15;
}

server {
    server_name kombare-search.com;
    listen 80;
    location / {
      proxy_pass http://kombare-search;
      }
}
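
(As an aside, a common variant of such a proxy block forwards the original Host header and uses HTTP/1.1 toward the upstream so keepalive actually takes effect. This is a general Nginx reverse-proxy pattern, not a confirmed fix for the issue in this thread:)

```nginx
server {
    server_name kombare-search.com;
    listen 80;
    location / {
      proxy_pass http://kombare-search;
      proxy_http_version 1.1;
      proxy_set_header Host $host;
      # Clear the Connection header so upstream keepalive works.
      proxy_set_header Connection "";
    }
}
```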

app-search.yml:

allow_es_settings_modification: true
elasticsearch.username: "{{ elasticsearch_username }}"
elasticsearch.password: "{{ elasticsearch_password }}"
elasticsearch.ssl.enabled: true
elasticsearch.ssl.verify: false
app_search.external_url: "{{ app_search_external_url }}"
app_search.auth.source: elasticsearch-native

When I index documents (I tried the Python client, curl, and the "Paste JSON" option), they are not getting created.
E.g. with the "Paste JSON" option:
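
For reference, the curl and Python attempts were equivalent to posting against the App Search documents endpoint. A minimal sketch with Python's standard library (the base URL, engine name `sp`, and API key below are placeholders, not values from this thread):

```python
import json
import urllib.request

def build_index_request(base_url, engine, api_key, documents):
    """Build a POST request for the App Search documents endpoint."""
    url = f"{base_url}/api/as/v1/engines/{engine}/documents"
    body = json.dumps(documents).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# A trimmed version of the parks payload seen in the logs.
docs = [{"id": "park_rocky-mountain", "title": "Rocky Mountain"}]
req = build_index_request("http://kombare-search.com", "sp", "private-xxxx", docs)
# urllib.request.urlopen(req) would actually send it; skipped here because
# the host and API key are placeholders.
```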


result:

Possibly relevant bits from the log:

  1. Failed:
[2020-03-10T09:07:34.032+00:00][8913][2290][app-server][WARN]: Failed to claim job cb95b7d95a1c24fcaa243d5591cf7711c13c3c20, claim conflict occurred
[2020-03-10T09:07:34.032+00:00][8913][2286][app-server][WARN]: Failed to claim job cb95b7d95a1c24fcaa243d5591cf7711c13c3c20, claim conflict occurred
  2. Error:
[2020-03-10T09:07:34.191+00:00][8913][2288][app-server][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Re-queueing Work::Engine::IndexAdder for engine 5e674d49f1f1792522921019 document ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"] in 60 seconds. Reason: Transient HTTP error

Detailed log generated in response to "Paste JSON":

[2020-03-10T09:07:33.021+00:00][8913][2310][action_controller][INFO]: [5c2b4ba7-aa47-4d58-9fef-5af44147fcf0] Processing by LocoMoco::DocumentsController#create as JSON
[2020-03-10T09:07:33.023+00:00][8913][2310][action_controller][INFO]: [5c2b4ba7-aa47-4d58-9fef-5af44147fcf0]   Parameters: {"documents"=>[{"id"=>"park_rocky-mountain", "title"=>"Rocky Mountain", "description"=>"Bisected north to south by the Continental Divide, this portion of the Rockies has ecosystems varying from over 150 riparian lakes to montane and subalpine forests to treeless alpine tundra. Wildlife including mule deer, bighorn sheep, black bears, and cougars inhabit its igneous mountains and glacial valleys. Longs Peak, a classic Colorado fourteener, and the scenic Bear Lake are popular destinations, as well as the historic Trail Ridge Road, which reaches an elevation of more than 12,000 feet (3,700 m).", "nps_link"=>"https://www.nps.gov/romo/index.htm", "states"=>["Colorado"], "visitors"=>4517585, "world_heritage_site"=>false, "location"=>"40.4,-105.58", "acres"=>265795.2, "square_km"=>1075.6, "date_established"=>"1915-01-26T06:00:00Z"}, {"id"=>"park_saguaro", "title"=>"Saguaro", "description"=>"Split into the separate Rincon Mountain and Tucson Mountain districts, this park is evidence that the dry Sonoran Desert is still home to a great variety of life spanning six biotic communities. Beyond the namesake giant saguaro cacti, there are barrel cacti, chollas, and prickly pears, as well as lesser long-nosed bats, spotted owls, and javelinas.", "nps_link"=>"https://www.nps.gov/sagu/index.htm", "states"=>["Arizona"], "visitors"=>820426, "world_heritage_site"=>false, "location"=>"32.25,-110.5", "acres"=>91715.72, "square_km"=>371.2, "date_established"=>"1994-10-14T05:00:00Z"}], "dry_run"=>true, "host"=>"kombare-search.com", "protocol"=>"http", "engine_slug"=>"sp"}
[2020-03-10T09:07:33.128+00:00][8913][2310][app-server][INFO]: [5c2b4ba7-aa47-4d58-9fef-5af44147fcf0] Engine[5e674d49f1f1792522921019]: Adding a batch of 2 documents to the index asynchronously
[2020-03-10T09:07:33.137+00:00][8913][2310][app-server][INFO]: [5c2b4ba7-aa47-4d58-9fef-5af44147fcf0] [ActiveJob] Enqueueing a job into the '.app-search-esqueues-me_queue_v1_index_adder' index. {"job_type"=>"ActiveJob::QueueAdapters::EsqueuesMeAdapter::JobWrapper", "payload"=>{"args"=>[{"job_class"=>"Work::Engine::IndexAdder", "job_id"=>"cb95b7d95a1c24fcaa243d5591cf7711c13c3c20", "queue_name"=>"index_adder", "arguments"=>["5e674d49f1f1792522921019", ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"]], "locale"=>:en, "executions"=>1}]}, "status"=>"pending", "created_at"=>1583831253136, "perform_at"=>1583831253136, "attempts"=>0}
[2020-03-10T09:07:33.159+00:00][8913][2310][active_job][INFO]: [5c2b4ba7-aa47-4d58-9fef-5af44147fcf0] [ActiveJob] [2020-03-10 09:07:33 UTC] enqueued Work::Engine::IndexAdder job (cb95b7d95a1c24fcaa243d5591cf7711c13c3c20) on `index_adder`
[2020-03-10T09:07:33.161+00:00][8913][2310][active_job][INFO]: [5c2b4ba7-aa47-4d58-9fef-5af44147fcf0] [ActiveJob] Enqueued Work::Engine::IndexAdder (Job ID: cb95b7d95a1c24fcaa243d5591cf7711c13c3c20) to EsqueuesMe(index_adder) with arguments: "5e674d49f1f1792522921019", ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"]
[2020-03-10T09:07:33.165+00:00][8913][2310][action_controller][INFO]: [5c2b4ba7-aa47-4d58-9fef-5af44147fcf0] Completed 200 OK in 140ms (Views: 1.1ms)
[2020-03-10T09:07:34.032+00:00][8913][2290][app-server][WARN]: Failed to claim job cb95b7d95a1c24fcaa243d5591cf7711c13c3c20, claim conflict occurred
[2020-03-10T09:07:34.032+00:00][8913][2286][app-server][WARN]: Failed to claim job cb95b7d95a1c24fcaa243d5591cf7711c13c3c20, claim conflict occurred
[2020-03-10T09:07:34.042+00:00][8913][2288][active_job][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Performing Work::Engine::IndexAdder from EsqueuesMe(index_adder) with arguments: "5e674d49f1f1792522921019", ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"]
[2020-03-10T09:07:34.043+00:00][8913][2288][app-server][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Bulk-indexing 2 documents...
[2020-03-10T09:07:34.076+00:00][8913][2288][app-server][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Adding documents ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"] to index for engine 5e674d49f1f1792522921019
[2020-03-10T09:07:34.191+00:00][8913][2288][app-server][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Re-queueing Work::Engine::IndexAdder for engine 5e674d49f1f1792522921019 document ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"] in 60 seconds. Reason: Transient HTTP error
[2020-03-10T09:07:34.194+00:00][8913][2288][app-server][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Enqueueing a job into the '.app-search-esqueues-me_queue_v1_index_adder' index. {"job_type"=>"ActiveJob::QueueAdapters::EsqueuesMeAdapter::JobWrapper", "payload"=>{"args"=>[{"job_class"=>"Work::Engine::IndexAdder", "job_id"=>"cb95b7d95a1c24fcaa243d5591cf7711c13c3c20", "queue_name"=>"index_adder", "arguments"=>["5e674d49f1f1792522921019", ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"]], "locale"=>:en, "executions"=>1}]}, "status"=>"pending", "created_at"=>1583831254193, "perform_at"=>1583831314192, "attempts"=>0}
[2020-03-10T09:07:34.200+00:00][8913][2288][app-server][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Ignoring duplicate job class=Work::Engine::IndexAdder, id=cb95b7d95a1c24fcaa243d5591cf7711c13c3c20, args=["5e674d49f1f1792522921019", ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"]]
[2020-03-10T09:07:34.200+00:00][8913][2288][active_job][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] [2020-03-10 09:07:34 UTC] enqueued Work::Engine::IndexAdder job (cb95b7d95a1c24fcaa243d5591cf7711c13c3c20) on `index_adder`
[2020-03-10T09:07:34.202+00:00][8913][2288][active_job][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Enqueued Work::Engine::IndexAdder (Job ID: cb95b7d95a1c24fcaa243d5591cf7711c13c3c20) to EsqueuesMe(index_adder) at 2020-03-10 09:08:34 UTC with arguments: "5e674d49f1f1792522921019", ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"]
[2020-03-10T09:07:34.203+00:00][8913][2288][active_job][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] [2020-03-10 09:07:34 UTC] completed Work::Engine::IndexAdder job (cb95b7d95a1c24fcaa243d5591cf7711c13c3c20) on `index_adder`
[2020-03-10T09:07:34.204+00:00][8913][2288][active_job][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Performed Work::Engine::IndexAdder from EsqueuesMe(index_adder) in 161.42ms
[2020-03-10T09:07:34.205+00:00][8913][2288][app-server][INFO]: Deleting: {:index=>".app-search-esqueues-me_queue_v1_index_adder", :type=>nil, :id=>"cb95b7d95a1c24fcaa243d5591cf7711c13c3c20", :if_primary_term=>1, :if_seq_no=>19}
[2020-03-10T09:07:35.735+00:00][8913][2294][app-server][INFO]: [6c2fa469-cd71-45b2-bd20-4c2e3ef709c9] Started POST "/as/engines/sp/documents.json?query=" for 127.0.0.1 at 2020-03-10 09:07:35 +0000
[2020-03-10T09:07:35.739+00:00][8913][2294][action_controller][INFO]: [6c2fa469-cd71-45b2-bd20-4c2e3ef709c9] Processing by LocoMoco::DocumentsController#index as JSON
[2020-03-10T09:07:35.739+00:00][8913][2294][action_controller][INFO]: [6c2fa469-cd71-45b2-bd20-4c2e3ef709c9]   Parameters: {"page"=>{"current"=>1}, "query"=>"", "host"=>"kombare-search.com", "protocol"=>"http", "engine_slug"=>"sp"}
[2020-03-10T09:07:35.853+00:00][8913][2294][action_controller][INFO]: [6c2fa469-cd71-45b2-bd20-4c2e3ef709c9] Completed 200 OK in 113ms (Views: 0.5ms)
[2020-03-10T09:07:41.012+00:00][8913][2312][app-server][INFO]: [40bd5f48-d72e-4b47-abe2-c76aaa0d518e] Started POST "/as/engines/sp/documents.json?query=" for 127.0.0.1 at 2020-03-10 09:07:41 +0000

Hi @Jaaved_Ali_Khan. What version of App Search are you using?

Version 7.6.1 @orhantoy

Does the exact same thing happen if you create a new engine and try to index some documents in that? I'm just wondering why you got the "Transient HTTP error" error in the first place.

Yes.
I created a new engine and tried adding documents in three different ways:

  1. by using "paste json" method
  2. with curl command
  3. using python API

I did not get any errors with any of the methods, but the documents were not created in the index, and I was getting the same logs as the ones I shared. @orhantoy

I wonder if the documents actually aren't created, or if the UI is just not updated.

Can you try querying your index with the REST API to see if you get any results back? https://swiftype.com/documentation/app-search/api/search
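
For example, something along these lines, sketched with Python's standard library (the base URL, engine name, and search key are placeholders; an empty query returns all documents, so `total_results` in the response tells you whether anything was actually indexed):

```python
import json
import urllib.request

def build_search_request(base_url, engine, api_key, query):
    """Build a POST request for the App Search search endpoint."""
    url = f"{base_url}/api/as/v1/engines/{engine}/search"
    body = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

req = build_search_request("http://kombare-search.com", "sp", "search-xxxx", "")
# urllib.request.urlopen(req) would send it; with no documents indexed, the
# response's meta.page.total_results would be 0.
```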

@Jaaved_Ali_Khan But are you seeing "Transient HTTP error" in the logs for the other methods as well?

@JasonStoltz Even the REST API search results are empty:

{"meta":{"alerts":[],"warnings":[],"page":{"current":1,"total_pages":0,"total_results":0,"size":10},"engine":{"name":"spwithspecs","type":"default"},"request_id":"6e543246-adc2-4039-9ae8-3fbd61deb62f"},"results":[]}

Yes @orhantoy

@orhantoy and @JasonStoltz, do you need more information about the issue? Please help.

So far I haven't been able to reproduce this error. The transient HTTP error should only happen if the connection to ES fails or times out, but I don't see how that is happening when you seem to have a fine ES connection.

Could you show your ES configuration?

- name: Install elasticsearch
  hosts: kombare_es
  roles:
    - role: elastic.elasticsearch
  vars:
    es_enable_xpack: true
    es_version: 7.6.1
    es_heap_size: 4g
    es_enable_http_ssl: true
    es_api_basic_auth_username: elastic
    es_api_basic_auth_password: changme
    es_ssl_keystore: "files/certs/my-keystore.p12"
    es_ssl_truststore: "files/certs/ca-public-private.p12"
    es_validate_certs: no
    es_config:
      # network.host: 0.0.0.0
      # network.publish_host: 0.0.0.0
      network.host: 127.0.0.1
      network.publish_host: 127.0.0.1
     # http.port: 8001
      node.name: e1
#      cluster.initial_master_nodes:
#        - kombare-es.com
      discovery.type: single-node
      http.cors.enabled: true
      http.cors.allow-origin: "*" # todo make it specific
      xpack:
        security:
          authc:
            realms:
              native:
                native1:
                  order: 0

  become: yes

  tasks:
    - name: add nginx server config
      import_tasks: _elasticsearch_nginx.yml
      tags: nginx

I am using the official Ansible playbook: https://github.com/elastic/ansible-elasticsearch

Is the issue related to Nginx? Is this related to the topic "Issue with app search (download) release -- not quite a connection issue"?

Following this article, I moved App Search to the same server, so it communicates with Elasticsearch directly without the Nginx reverse proxy, and changed app-search.yml to:

allow_es_settings_modification: true
elasticsearch.username: "{{ elasticsearch_username }}"
elasticsearch.password: "{{ elasticsearch_password }}"
elasticsearch.ssl.enabled: true
elasticsearch.ssl.verify: false
app_search.external_url: https://localhost:9200
app_search.auth.source: elasticsearch-native

Then I was getting this error:

unable to find valid certification path to requested target 

even when

elasticsearch.ssl.verify: false

@orhantoy

Elasticsearch config:

discovery.type: single-node
http.cors.allow-origin: '*'
http.cors.enabled: true
network.host: 127.0.0.1
network.publish_host: 127.0.0.1
node.name: e1
xpack:
  security:
    authc:
      realms:
        native:
          native1:
            order: 0
cluster.name: elasticsearch
#################################### Paths ####################################
# Path to directory containing configuration (this file and logging.yml):
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
action.auto_create_index: true
xpack.security.enabled: true
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: "/etc/elasticsearch/certs/my-keystore.p12"
xpack.security.http.ssl.truststore.path: "/etc/elasticsearch/certs/ca-public-private.p12"

@orhantoy

@orhantoy @JasonStoltz should I give up on this?

Mar 17 21:29:14 kombare-es_1 app-search[18906]: [2020-03-17T20:29:14.891+00:00][18906][2286][app-server][INFO]: [ActiveJob] [Work::Engine::IndexAdder] [cb95b7d95a1c24fcaa243d5591cf7711c13c3c20] Re-queueing Work::Engine::IndexAdder for engine 5e674d49f1f1792522921019 document ["5e674fd9f1f179252292101f", "5e674fd9f1f1792522921020"] in 60 seconds. Reason: Transient HTTP error

Hey @Jaaved_Ali_Khan,

Do you have the same issue when you disable SSL? It'd be great to understand whether your issues are related to your SSL configuration or not.

As mentioned in your other post, elasticsearch.ssl.verify: false may not work correctly right now in your local development environment.

It works fine when I don't use SSL, i.e. when I use plain HTTP. @JasonStoltz

@JasonStoltz @orhantoy finally I was able to resolve this issue.

The reason for the issue was the Nginx reverse proxy. I moved both Elasticsearch and App Search to the same machine and connected App Search to Elasticsearch directly at https://localhost:9200.

This configuration worked:

elasticsearch.host: "https://127.0.0.1:9200"
elasticsearch.ssl.enabled: true
elasticsearch.ssl.certificate_authority: "{{ ssl_certs_dir_path }}/ca.crt"
elasticsearch.ssl.verify: true

I think the bug here is that, with elasticsearch.ssl.verify: false, App Search was still verifying the server's public certificate against the default Java trust store. I set elasticsearch.ssl.verify: true and pointed it at the CA, and it worked. I described the issue here: App search not able to connect to elasticsearch directly over https.
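
As a rough illustration of the difference (using Python's `ssl` module as an analogy; App Search runs on the JVM, so this is not its actual code): disabling verification should mean no certificate checks at all, whereas the working config keeps verification on but trusts an explicit CA instead of the default trust store.

```python
import ssl

# What `elasticsearch.ssl.verify: false` should mean: skip hostname and
# certificate checks entirely.
no_verify = ssl.create_default_context()
no_verify.check_hostname = False
no_verify.verify_mode = ssl.CERT_NONE

# What the working config does instead: keep verification required, but
# trust an explicit CA bundle rather than the default trust store.
with_ca = ssl.create_default_context()
# with_ca.load_verify_locations("/path/to/ca.crt")  # placeholder CA path
```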

I spent a lot of time on this; thanks for your help, @orhantoy @JasonStoltz.


@Jaaved_Ali_Khan I'm so glad you got this working. Thank you for sticking with it; your answers here in this forum will hopefully help others with the same struggles in the future.

In the meantime, we've logged a bug internally for the SSL verification issues.
