Elasticsearch picks wrong data type

I have an ActiveRecord object that looks like:

ResourceGroup id: 1, boolean_col1: true, boolean_col2: false

The two boolean columns were added at different times. When boolean_col1 was added, Elasticsearch picked up the proper BOOLEAN data type. Later, when boolean_col2 was added, it picked up the LONG data type.

However, I didn't explicitly specify the data type for either column in my code base.
When I deleted the whole index from Elasticsearch and reindexed all the records, it picked up the proper data type (BOOLEAN) for both columns.

Below is the code that declares searchkick:

```ruby
searchkick callbacks: false, merge_mappings: true, mappings: {
  default: {
    dynamic_templates: [],
    properties: {
      term: {
        type: 'string',
        index: 'not_analyzed'
      }
    }
  }
}
```
I am puzzled how a different data type was picked up for each column.

Tip: Wrap code blocks in ``` and they'll show up easier to read. Like on github.

Tip 2: You are much more likely to get specific help if you can reproduce the problem with curl and bash. It's the only way you can be sure everyone knows what you did. Personally, all I know about searchkick is that it is a Ruby library that can talk to Elasticsearch, so I'm having to make a lot of guesses.
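For example, a minimal curl repro of the dynamic-mapping behaviour might look something like this (the index, type, and field names here are made up, and a local cluster on port 9200 is assumed):

```shell
# First document arrives with the flag serialized as 1 rather than true;
# dynamic mapping will infer "long" for that field.
curl -XPUT 'localhost:9200/repro_a/doc/1' -d '{"boolean_col2": 1}'
curl -XGET 'localhost:9200/repro_a/_mapping'

# In a fresh index where the first value is a real JSON boolean,
# the same field gets mapped as "boolean".
curl -XPUT 'localhost:9200/repro_b/doc/1' -d '{"boolean_col2": true}'
curl -XGET 'localhost:9200/repro_b/_mapping'
```

If the first repro shows `"type": "long"` and the second shows `"type": "boolean"`, that would confirm the type was inferred from whatever value happened to be indexed first.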

By chance, does searchkick spit out 0 and 1 instead of false and true sometimes? As useful as the automatic schema inference in Elasticsearch is, I tend to want to configure everything myself. I set dynamic: false and manually add the new fields. I treat Elasticsearch's mapping like a relational schema and make building it the responsibility of my application's maintenance scripts. You'll have to index all the documents with the new field again after modifying the mapping.
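As a sketch of that approach (index, type, and field names are hypothetical, not taken from your setup), the explicit mapping update might look like:

```shell
# Disable dynamic field addition and declare the boolean columns explicitly.
# "resource_groups"/"resource_group" are placeholder index/type names.
curl -XPUT 'localhost:9200/resource_groups/_mapping/resource_group' -d '{
  "dynamic": false,
  "properties": {
    "boolean_col1": {"type": "boolean"},
    "boolean_col2": {"type": "boolean"}
  }
}'
```

After a mapping change like this, existing documents still need to be reindexed before the new fields are searchable.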

You can avoid that with templates like you are doing. I've not done it myself, but I'm sure you should be able to set up a dynamic template that matches that `boolean_` prefix you have. Here are the docs.
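A sketch of what such a template could look like, assuming the `boolean_*` naming convention from the question (template and index names are invented):

```shell
# Hypothetical dynamic template: any new field whose name starts with
# "boolean_" gets mapped as boolean, regardless of the first value indexed.
curl -XPUT 'localhost:9200/resource_groups' -d '{
  "mappings": {
    "resource_group": {
      "dynamic_templates": [
        {
          "booleans_by_name": {
            "match": "boolean_*",
            "mapping": {"type": "boolean"}
          }
        }
      ]
    }
  }
}'
```

This would slot into the currently empty `dynamic_templates: []` array in the searchkick declaration above.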