Query cuts off part of document, plus operator issues, in Rails app

edit: I've narrowed the truncated-results issue down to the use of highlighting. I tried each of the different highlighter methods, but none of them returned the entire document.
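From what I've gathered in the highlighting docs, highlighted fields come back as fragments (about 100 characters each by default), so I suspect the "cut-off" text is really just the first fragment. If I'm reading the docs right, setting `number_of_fragments: 0` on a field should return the whole field content highlighted instead of a fragment. Here's a sketch of the highlight options I think that would mean (just building the request hash, untested against my cluster):

```ruby
# Sketch: highlight options that, per my reading of the docs, should
# return entire fields rather than ~100-character fragments.
# number_of_fragments: 0 is documented as "don't fragment; highlight
# and return the whole field content".
def highlight_options(field_names)
  field_opts = field_names.to_h { |f| [f, { number_of_fragments: 0 }] }
  {
    pre_tags: ['<em>'],
    post_tags: ['</em>'],
    fields: field_opts
  }
end

opts = highlight_options(%i[definition etymology1])
```

The idea would be to drop this hash in place of the `highlight:` section in my search method below.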

Hi all,

I'm a web development and elasticsearch newbie, so please forgive any glaring mistakes. I've spent a lot of time reading through documentation and searching for answers, but I'm not making much progress.

In my app I have a bunch of Spanish terms whose meanings and origins I define and explain. The first image below shows the full explanation of the term "abuelo"; the second shows that querying "abuelo" cuts off the rest of the explanation. I've noticed this with some other queries as well, but it doesn't happen with every query.

My second issue has two parts. First, changing the default operator to 'and' simply has no effect for me. Second, if I keep the operator as 'or', it doesn't always return results for every occurrence of a word. For instance, searching for just "Asturian" shows all the results (first image below), but searching for "Asturian down" shows only a couple of results (second image).

Asturian:

Asturian down:
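To make the operator question concrete, here's roughly the request body I believe each setting produces (a hypothetical helper with a trimmed field list, just for illustration). My understanding, which may be wrong, is that with operator 'and' *every* term must appear in a single field of a document, which would explain why "Asturian down" matches fewer documents than "Asturian" alone:

```ruby
# Hypothetical helper: builds the multi_match body I think each operator
# setting sends. With 'and', all query terms must occur together in one
# of the fields; with 'or', any single term is enough to match.
def multi_match_body(query, operator)
  {
    query: {
      multi_match: {
        query: query,
        fields: ['name^10', 'definition^9'],  # trimmed list for the example
        operator: operator
      }
    }
  }
end

and_body = multi_match_body('Asturian down', 'and')
or_body  = multi_match_body('Asturian down', 'or')
```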

Below is all the code in my model. I apologize for the long post and if I violated any forum rules. I'm racking my brain right now. Any help is much appreciated!


require 'elasticsearch/model'

class Term < ActiveRecord::Base
  include Elasticsearch::Model
  include Elasticsearch::Model::Callbacks

  settings index: { number_of_shards: 1, number_of_replicas: 1 },
           analysis: {
             filter: {
               english_stop: {
                 type: 'stop',
                 stopwords: ['_english_']
               },
               spanish_stop: {
                 type: 'stop',
                 stopwords: ['_spanish_']
               },
               spanish_stemmer: {
                 type: 'stemmer',
                 language: 'light_spanish'
               },
               synonym: {
                 type: 'synonym',
                 synonyms_path: 'synonyms.txt',
                 ignore_case: 'true'
               }
             },
             analyzer: {
               stops_and_synonyms: {
                 type: 'custom',
                 tokenizer: 'standard',
                 filter: [
                   'english_stop',
                   'spanish_stop',
                   'spanish_stemmer',
                   'synonym',
                   'lowercase'
                 ]
               },
               spanish: {
                 tokenizer: 'standard',
                 filter: 'synonym',
                 language: 'spanish'
               }
             }
           } do

    mappings dynamic: 'false' do
      indexes :id, index: :not_analyzed
      indexes :name, analyzer: 'spanish', index_options: 'offsets'
      indexes :gender, index: :not_analyzed
      indexes :part_of_speech, index: :not_analyzed
      indexes :definition, index_options: 'offsets'
      indexes :etymology1, analyzer: 'stops_and_synonyms', index_options: 'offsets'
      indexes :etymology2, analyzer: 'stops_and_synonyms', index_options: 'offsets'
      indexes :uses, analyzer: 'stops_and_synonyms', index_options: 'offsets'
      indexes :romance_cognates, analyzer: 'stops_and_synonyms', index_options: 'offsets'
      indexes :notes1, analyzer: 'stops_and_synonyms', index_options: 'offsets'
      indexes :notes2, analyzer: 'stops_and_synonyms', index_options: 'offsets'
      indexes :quote1, analyzer: 'stops_and_synonyms', index_options: 'offsets'
      indexes :quote2, analyzer: 'stops_and_synonyms', index_options: 'offsets'
    end
  end

  def self.search(query)
    __elasticsearch__.search(
       {
        query: {
          multi_match: {
            query: query,
            fields: ['name^10', 'definition^9', 'etymology1', 'etymology2', 'uses', 'romance_cognates', 'notes1', 'notes2', 'quote1', 'quote2'],
            operator: 'and',
          }
        },
        highlight: {
          pre_tags: ['<em>'],
          post_tags: ['</em>'],
          fields: {
            name: {},
            definition: {},
            etymology1: {},
            etymology2: {},
            uses: {},
            romance_cognates: {},
            notes1: {},
            notes2: {},
            quote1: {},
            quote2: {}
          }
        }
      }
    )
  end
end


# Delete the index if it already exists (ignore the error if it doesn't)
Term.__elasticsearch__.client.indices.delete index: Term.index_name rescue nil


# Recreate the index with the settings and mappings above, then reindex all records
Term.__elasticsearch__.client.indices.create \
  index: Term.index_name,
  body: { settings: Term.settings.to_hash, mappings: Term.mappings.to_hash }

Term.import(force: true)
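One debugging step I'm planning to try, based on the indices `_analyze` API (not sure I have the call exactly right), is to check which tokens my custom analyzer actually produces for the failing query, since stop words or stemming could be eating "down":

```ruby
# Builds the request hash I'd pass to client.indices.analyze to inspect
# how a query string is tokenized by a given analyzer. The index name
# 'terms' is a placeholder; the real app would use Term.index_name.
def analyze_request(index_name, analyzer, text)
  { index: index_name, body: { analyzer: analyzer, text: text } }
end

req = analyze_request('terms', 'stops_and_synonyms', 'Asturian down')
# In the app this would be sent as:
#   Term.__elasticsearch__.client.indices.analyze(req)
# and the response's 'tokens' array should show what actually got indexed.
```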