Template is not being used, mapping not working

Hey there! I am analysing access logs for a project. I have everything set up: Logstash is doing what it is supposed to do, and so are Elasticsearch and Kibana (using 5.2). BUT:
Yesterday I decided to write my own template, since I wanted to use the Path Hierarchy Tokenizer. But whenever I index data, my template is just not being used correctly and I don't know why.

The output in my logstash.conf looks like this:

output {
        elasticsearch {
                hosts => ["localhost:9200"]
                manage_template => "true"
                template_overwrite => "false"
        }
}

I have tried different things: setting a template_name (no effect), setting manage_template to "false" (which did nothing), setting template_overwrite to "true" (which, of course, then overwrote my template with the Logstash one), all because I had no idea what to change.

My template looks like this; I used the Dev Tools in Kibana to write it.

PUT _template/logstash
{
  "template": "logstash",
  "index_patterns": "logstash-*",
  "settings": {
    "analysis": {
      "analyzer": {
        "custom_path_tree": {
          "tokenizer": "custom_hierarchy"
        },
        "custom_path_tree_reversed": {
          "tokenizer": "custom_hierarchy_reversed"
        }
      },
      "tokenizer": {
        "custom_hierarchy": {
          "type": "path_hierarchy",
          "delimiter": "/"
        },
        "custom_hierarchy_reversed": {
          "type": "path_hierarchy",
          "delimiter": "/",
          "reverse": "true"
        }
      }
    }
  },
  "mappings": {
    "logs": {
      "properties": {
        "object": {
          "type": "text",
          "fields": {
            "tree": {
              "type": "text",
              "analyzer": "custom_path_tree"
            },
            "tree_reversed": {
              "type": "text",
              "analyzer": "custom_path_tree_reversed"
            }
          }
        },
        "referral": {
          "type": "text",
          "fields": {
            "tree": {
              "type": "text",
              "analyzer": "custom_path_tree"
            },
            "tree_reversed": {
              "type": "text",
              "analyzer": "custom_path_tree_reversed"
            }
          }
        },
        "datetime": {
          "type": "date",
          "format": "time_no_millis"
        },
        "size": {
          "type": "integer"
        }
      }
    }
  }
}

GET _template/logstash shows me my template, just the way I want it. However, once I index data, nothing happens the way I expect. The standard Logstash template seems not to be used, because once I define an index pattern in Kibana, fields like geoip.latitude don't appear (as opposed to when I don't use a custom template). But fields like result.tree don't appear either, and datetime ends up being indexed as a keyword.

http://localhost:9200/_all/_mapping?pretty=1 yields this:

{
  "logstash-2019.07.03" : {
    "mappings" : {
      "logs" : {
        "properties" : {
          "@timestamp" : {
            "type" : "date"
          },
          "action" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "datetime" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            } ...

I am probably missing something very simple, but this is the first time I am working with the ELK stack and I am only asking because I don't know what to do anymore. I would be grateful if someone had an idea.

Try this. Mainly, you need to add ilm_enabled => false:

output {
        elasticsearch {
                hosts => ["http://ip:9200"]
                user => "$user_name"
                password => "$password"
                index => "index-name-here-%{+YYYY-MM-dd}"
                action => "index"
                document_id => "%{doc_id}"
                manage_template => true
                template => "$path_to_template/logstash_default_template.json"
                template_name => "$template_name"
                # https://github.com/elastic/logstash/issues/10687
                ilm_enabled => false
                template_overwrite => true
        }
}

Adding ilm_enabled => "false" resulted in my data not even getting indexed anymore.

[2019-07-12T12:46:48,419][ERROR][logstash.outputs.elasticsearch] Unknown setting 'ilm_enabled' for elasticsearch
[2019-07-12T12:46:48,425][ERROR][logstash.agent           ] Cannot create pipeline {:reason=>"Something is wrong with your configuration."}

I use version 5.2 (I have to, I can't upgrade to the current one). Maybe that's why?

template => "$path_to_template/logstash_default_template.json"
Is this mandatory? Like, do I need to save my template as .json?

Sorry, ilm_enabled is for the new version, I guess 7.

https://www.elastic.co/guide/en/logstash/5.2/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-manage_template

Setting manage_template to false disables this feature. If you require more control over template creation, (e.g. creating indices dynamically based on field names) you should set manage_template to false and use the REST API to apply your templates manually.
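If you go that route, applying the template by hand from the Kibana Dev Tools console would look something like this (a sketch; my_template and the pattern logstash-* are placeholders for your own template name and body):

```
PUT _template/my_template
{
  "template": "logstash-*",
  "settings": {
    "number_of_shards": 1
  }
}
```

Note that on 5.x the pattern goes in the "template" field of the body; with manage_template => false, Logstash then never touches this template again.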

This should work

output {
        elasticsearch {
                hosts => ["localhost:9200"]
                manage_template => false
                template_overwrite => false
        }
}

I tried that before, doesn't help. I restarted Elasticsearch and Logstash, but my template still isn't applied properly. Maybe something is wrong with my template? There seems to be an issue with it, since Elasticsearch just ignores my Analyzer and the mappings I wanted are simply overwritten.

Little update:

I saved my custom template as a .json file (it is valid, I checked).
The output part in my logstash configuration now looks like this:

output {
        elasticsearch {
                hosts => ["localhost:9200"]
                template => "/etc/logstash/conf.d/template.json"
                index => "beam-%{+YYYY.MM.dd}"
                manage_template => "false"
                template_overwrite => "true"
        }

        stdout {
                codec => rubydebug
        }
}

I tried setting manage_template to true and template_overwrite to false, with no difference.
I deleted any template already known to Elasticsearch before indexing documents. The indexing works just fine, except that my custom template is not being used; the standard Logstash one still is.
I also tried creating my index before actually putting data in it. If I do that, my custom template is applied, but after I index data I get the "Courier Fetch: 5 of 10 shards failed" error.

Please, does anyone have an idea?

So what I see is: your index template is ...

PUT _template/logstash
{
  "template": "logstash",
  "index_patterns": "logstash-*",   <------------
  "settings": {

Which affects / matches indices with names logstash-*

But then you are outputting into...

elasticsearch {
        hosts => ["localhost:9200"]
        template => "/etc/logstash/conf.d/template.json"
        index => "beam-%{+YYYY.MM.dd}"   <-----------
        manage_template => "false"
        template_overwrite => "true"

This writes to indexes named beam-* e.g. beam-2019.07.13

So your index template would need to look something like this, so that the template's index_patterns and the index you are writing to match:

PUT _template/logstash
{
  "template": "logstash",
  "index_patterns": "beam-*",   <------
  "settings": {

For better consistency I would use a convention like

PUT _template/beam
{
  "template": "beam",
  "index_patterns": "beam-*",
  "settings": {

because you may end up indexing many different indexes from logstash...

I believe if you do this correctly (just put in your index template and output to the correct index), you shouldn't need any of the other settings for basic indexing.

Hope that helps

Hi there! Sorry, I changed the template and my index name from logstash to beam because I read somewhere that the default names could mess with my desired outcome.

output {
        elasticsearch {
                hosts => ["localhost:9200"]
                template => "/etc/logstash/conf.d/template.json"
                index => "beam-%{+YYYY.MM.dd}"
                manage_template => "true"
                template_overwrite => "true"
                document_type => "beamlogs"
        }

        stdout {
                codec => rubydebug
        }
}

That is my latest output in the .conf

{
  "template": "beam_custom",
  "index_patterns": "beam-*",
  "order": 5,
  "settings": {
    "number_of_shards": 1,
    "analysis": {
      "analyzer": {
        "custom_path_tree": {
          "tokenizer": "custom_hierarchy"
        },
        "custom_path_tree_reversed": {
          "tokenizer": "custom_hierarchy_reversed"
        }
      },
      "tokenizer": {
        "custom_hierarchy": {
          "type": "path_hierarchy",
          "delimiter": "/"
        },
        "custom_hierarchy_reversed": {
          "type": "path_hierarchy",
          "delimiter": "/",
          "reverse": "true"
        }

And this is part of my custom template. The names match up, yet my template is still not being used and is overwritten by the default Logstash indexing template.
I really need the Path Hierarchy Analyzer to work, but when I index data, this is what I get when I check the mapping:

"object" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },

The object.tree field isn't even created.

So. What I just tried was creating the index template in Kibana:

PUT _template/beam_custom
(followed by what is in my template.json)

I then checked if the template was created.

GET _template/beam_custom

The output was this:

{
  "beam_custom": {
    "order": 100,
    "template": "beam_custom",
    "settings": {
      "index": {
        "analysis": {
          "analyzer": {
            "custom_path_tree_reversed": {
              "tokenizer": "custom_hierarchy_reversed"
            },
            "custom_path_tree": {
              "tokenizer": "custom_hierarchy"
            }
          },
          "tokenizer": {
            "custom_hierarchy": {
              "type": "path_hierarchy",
              "delimiter": "/"
            },
        ...

So I guess creating the template worked.

Then I created an index

PUT beam-2019.07.15

But when I checked the index, I got this:

{
  "beam-2019.07.15": {
    "aliases": {},
    "mappings": {},
    "settings": {
      "index": {
        "creation_date": "1563044670605",
        "number_of_shards": "5",
        "number_of_replicas": "1",
        "uuid": "rGzplctSQDmrI_NSlt47hQ",
        "version": {
          "created": "5061699"
        },
        "provided_name": "beam-2019.07.15"
      }
    }
  }
}

Shouldn't the index pattern have been recognized? I think this is the heart of the problem. I thought that my template would have been used and the output should have been something like this instead:

{
  "beam-2019.07.15": {
    "aliases": {},
    "mappings": {
      "logs": {
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "action": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },...

Why doesn't it recognize the pattern?

Couple thoughts.

I would try this all manually from the console first... instead of through logstash

First, have you tested the analyzer stand-alone, as documented, to make sure it actually works? If not, you need to debug that first.

Then, if that works, I would proceed and test by setting up the template, indexing a document, and looking at the results manually from the developer console, and only then move on to Logstash.
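For example, the tokenizer itself can be checked stand-alone with the _analyze API (a sketch; the sample path is made up):

```
POST _analyze
{
  "tokenizer": "path_hierarchy",
  "text": "/belletristik/romane"
}
```

With the default "/" delimiter this should return the tokens /belletristik and /belletristik/romane. To test your custom analyzer by name, run _analyze against a test index that defines it, since custom analyzers live in index settings.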

Manual: Clean up your indexes and templates. Be careful, DO NOT do this in production:

DELETE /_template/beam_custom

DELETE /beam-2019.07.13

You probably want to clean up any index patterns you created through Kibana as well.

Then POST in your template by hand

POST /_template/beam_custom
{...

then index a document manually

POST /beam-2019.07.13/_doc
{
.... your data in json here...
}

Then take a look at the resulting document. This needs to work before you move on to logstash

GET /beam-2019.07.13/_search

Then, if all that works, leave the template in place and try to ingest with Logstash using just the following:

output {

        elasticsearch { 
                hosts => ["localhost:9200"] 
                index => "beam-%{+YYYY.MM.dd}"   
        } 

        stdout {  
                codec => rubydebug 
        } 
}

Then take a look.... and see if it is correct.

Continue to iterate from there....

NOTE: I would also steer away from using document_type, as it is deprecated in future versions (starting with 6.0). Going forward all documents are of type _doc, so custom types will cause more work when you upgrade and are not a best practice.

Thank you so much for taking the time! But I just found a major mistake in my template!

When I looked up how to make my own template, I managed to look it up for the current version instead of 5.2.

The thing is that in 5.x, "index_patterns" does not exist. The pattern is actually defined in the "template" field.

So after I deleted the index_patterns line and instead wrote

"template": "beam-*"

I got it to work. At least somehow. After indexing my files I can see that the template is applied and my index pattern in Kibana includes all the fields I need, however my data doesn't show up anymore?
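For anyone else on 5.x, the head of the template then looks something like this (a sketch; the order and shard values are just examples), with the pattern in "template" instead of "index_patterns":

```
{
  "template": "beam-*",
  "order": 5,
  "settings": {
    "number_of_shards": 1
  }
}
```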

But this is something I hope to have fixed in no time. Any ideas, though? I took a look at the Logstash log and it said:

[2019-07-13T21:26:50,734][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"beam-2019.07.03", :_type=>"beamlogs", :_routing=>nil}, 2019-07-03T21:48:17.000Z %{host} the_data_no_one_has_to_see, :response=>{"index"=>{"_index"=>"beam-2019.07.03", "_type"=>"beamlogs", "_id"=>"AWvszV7LXUuoAQt7UGil", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [datetime]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"03/Jul/2019:23:48:17 +0200\" is malformed at \"/Jul/2019:23:48:17 +0200\""}}}}}

That looks like a date parsing exception, i.e. the date you are supplying and the date format do not match. Usually, if that is just a field in the data, the field will simply show up with a date parse failure; but if you are trying to use that date as the @timestamp, the document will not be indexed. Just debug the date formatting / parsing.
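In Logstash that usually means adding a date filter whose pattern matches the raw string (a sketch; I am assuming the field is called datetime, as in the mapping above):

```
filter {
        date {
                # "03/Jul/2019:23:48:17 +0200" matches this Joda-style pattern
                match => [ "datetime", "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
}
```

By default the parsed value is written to @timestamp; use the filter's target option if you want it written somewhere else.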

I ended up removing the datetime field as suggested here: https://discuss.elastic.co/t/invalid-format-for-timestamp-field/34793/5

After that, indexing worked fine. Yet the object.tree field is still not being created. If I search

GET beam-*/_search
{
  "query": {
    "term": {
      "object.tree": "/belletristik/"
    }
  }
}
I get nothing, though I should have a few hundred hits.

I am still unclear: if you indexed a single document by hand, do you see the correct results? I.e. if you simply index a single document, do the analyzers work? Or, even before that, simply test the analyzer: do you see the correct results?

https://www.elastic.co/guide/en/elasticsearch/reference/5.2/analysis-pathhierarchy-tokenizer.html

At this point I would suggest opening a different discussion with a proper title so that it attracts the right experts, seeing as this is no longer about the template and I am not an expert on analyzers.

I did index a few documents by hand. One of the analyzers seems to work, the other one does not. I'll try to find a solution. After getting this far I am confident I'll make this work, too.

Thank you so much for your help!

Even though this is not your version, there is a pretty complete sample here:

https://www.elastic.co/guide/en/elasticsearch/reference/7.2/analysis-pathhierarchy-tokenizer-examples.html

I think if you just converted to 5.2 syntax it might be a good starting place
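One difference to watch for when converting (an assumption based on the 5.x mapping format; the field name file_path follows the linked example): in 5.x, mappings are nested under a type name, so the example's mapping section would become something like:

```
"mappings": {
  "logs": {
    "properties": {
      "file_path": {
        "type": "text",
        "fields": {
          "tree": {
            "type": "text",
            "analyzer": "custom_path_tree"
          }
        }
      }
    }
  }
}
```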

Yup, agree, you are well on your way, and you're welcome... and now I am playing with path analyzers too LOL! :slight_smile:

That's the example I worked with while setting it all up :slight_smile:
It works perfectly, though there are problems once I try it with my documents. But that's a problem for tomorrow.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.