When I set up a template file and used Logstash to create the index, I got an error.

The execution environment is shown below.

I would like to have Logstash create an index whose mapping comes from an index template defined in a JSON file.
How can I do this?

logstash-7.13.0

Settings in the conf file

input {
    file {
        mode => "tail"
        path => ["/Users/UserName/logstash-7.13.0/testdata.csv"]
        start_position => "beginning"
        sincedb_path => "/Users/UserName/logstash-7.13.0/sincedb.txt"
        codec => plain { 
            charset => "UTF-8"
        }
    }
}

filter {
    csv {
        separator => ","
        columns => ["id","title","foreign_or_domestic","genre","actor","rate","director","imageUrl"]
        convert => {
            "id" => "integer"
        }
        skip_header => true
    }
}

output {
    elasticsearch {
        hosts => [ "https://4afbf336dc56.asia-northeast1.gcp.cloud.es.io:9243" ]
        index => "demo-item-test"
        template => "/Users/UserName/logstash-7.13.0/config/test-template.json"
        template_name => "test-template"
        template_overwrite => true
        user => "elastic"
        password => "*******"
    }
    stdout {
        codec => json
    }
}

Settings in test-template.json

{
  "test-template": {
    "mappings": {
      "properties": {
        "title": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "foreign_or_domestic": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "imageUrl": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "actor": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "director": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "rate": {
          "type": "long"
        },
        "genre": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "id": {
          "type": "long"
        }
      }
    }
  }
}

Logs

sudo bin/logstash -f config/test.conf
Using bundled JDK: /Users/UserName/logstash-7.13.0/jdk.app/Contents/Home
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to /Users/UserName/logstash-7.13.0/logs which is now configured via log4j2.properties
[2021-06-09T13:01:10,368][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/UserName/logstash-7.13.0/config/log4j2.properties
[2021-06-09T13:01:10,380][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.0", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [darwin-x86_64]"}
[2021-06-09T13:01:10,552][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-06-09T13:01:11,637][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-06-09T13:01:12,322][INFO ][org.reflections.Reflections] Reflections took 61 ms to scan 1 urls, producing 24 keys and 48 values
[2021-06-09T13:01:13,480][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243"]}
[2021-06-09T13:01:13,886][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/]}}
[2021-06-09T13:01:14,627][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/"}
[2021-06-09T13:01:14,766][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.4.2) {:es_version=>7}
[2021-06-09T13:01:14,769][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-06-09T13:01:14,907][INFO ][logstash.outputs.elasticsearch][main] Using mapping template from {:path=>"/Users/UserName/logstash-7.13.0/config/test-template.json"}
[2021-06-09T13:01:14,938][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/Users/UserName/logstash-7.13.0/config/test.conf"], :thread=>"#<Thread:0x72b5a0ac run>"}
[2021-06-09T13:01:15,002][INFO ][logstash.outputs.elasticsearch][main] Installing Elasticsearch template {:name=>"test-template"}
[2021-06-09T13:01:15,188][ERROR][logstash.outputs.elasticsearch][main] Failed to install template {:message=>"Got response code '400' contacting Elasticsearch at URL 'https://4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/_template/test-template'", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :backtrace=>["/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:306:in `perform_request_to_url'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:293:in `block in perform_request'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:382:in `with_connection'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:292:in `perform_request'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:300:in `block in Pool'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:390:in `template_put'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:84:in `template_install'", 
"/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:29:in `install'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:17:in `install_template'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch.rb:496:in `install_template'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch.rb:309:in `finish_register'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch.rb:279:in `block in register'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:145:in `block in after_successful_connection'"]}
[2021-06-09T13:01:15,952][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.01}
[2021-06-09T13:01:16,181][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-06-09T13:01:16,214][INFO ][filewatch.observingtail  ][main][d370e9c2ca333187908d622d7289e760793d2ca3cb939a53e2ad6f141894e826] START, creating Discoverer, Watch with file and sincedb collections
[2021-06-09T13:01:16,256][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

I'd start by posting that template directly to Elasticsearch and seeing what it says. I don't think that template is valid, as it doesn't contain things like an index pattern to match, for example.
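For what it's worth, the legacy `_template` API expects `index_patterns` at the top level of the body, without a wrapper key like `test-template` around it. A minimal sketch, trimmed to two fields (the pattern `demo-item-*` is illustrative, chosen to match the `index` name in the conf file; the custom analyzer would also need its `analysis` settings, as in the fuller template later in the thread):

```json
{
  "index_patterns": ["demo-item-*"],
  "mappings": {
    "properties": {
      "title": { "type": "text", "analyzer": "my_kuromoji_analyzer" },
      "id": { "type": "long" }
    }
  }
}
```

With a body in that shape, the same file should also be accepted when passed via the `template` option in the output config.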

I PUT the template first, and then ran Logstash to create the index.
The mapping of the created index contains fields that were not included in the template I PUT.
What could be the cause of this?
What can I do to avoid it?

{
    "index_patterns": "test-index*",
    "settings": {
      "index": {
        "number_of_shards": 1
      },
      "analysis": {
        "analyzer": {
          "my_kuromoji_analyzer": {
            "type": "custom",
            "tokenizer": "ja_tokenizer"
          }
        },
        "tokenizer": {
          "ja_tokenizer": {
            "type": "kuromoji_tokenizer",
            "mode": "normal"
          }
        }
      }
    },
    "mappings": {
      "properties": {
        "title": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "foreign_or_domestic": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "imageUrl": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "actor": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "director": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "rate": {
          "type": "long"
        },
        "genre": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "id": {
          "type": "long"
        }
      }
    }
}

Contents of the mapping of the created index

{
  "test-index-1": {
    "mappings": {
      "properties": {
        "imageUrl": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "title": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "foreign_or_domestic": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "@timestamp": {
          "type": "date"
        },
        "rate": {
          "type": "long"
        },
        "actor": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "director": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "host": {
          "fields": {
            "keyword": {
              "ignore_above": 256,
              "type": "keyword"
            }
          },
          "type": "text"
        },
        "genre": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "path": {
          "fields": {
            "keyword": {
              "ignore_above": 256,
              "type": "keyword"
            }
          },
          "type": "text"
        },
        "message": {
          "fields": {
            "keyword": {
              "ignore_above": 256,
              "type": "keyword"
            }
          },
          "type": "text"
        },
        "@version": {
          "fields": {
            "keyword": {
              "ignore_above": 256,
              "type": "keyword"
            }
          },
          "type": "text"
        },
        "id": {
          "type": "long"
        }
      }
    }
  }
}

Ok, that makes sense of the 400 then. If you did that, then you don't want Logstash to overwrite it (as per your output config).
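Since the template was PUT by hand in that case, a sketch of the output section with the plugin's template handling turned off via its `manage_template` option, so Logstash neither installs nor overwrites a template (the index name `test-index-1` is taken from the created index shown above):

```
output {
    elasticsearch {
        hosts => [ "https://4afbf336dc56.asia-northeast1.gcp.cloud.es.io:9243" ]
        index => "test-index-1"
        # Template already PUT manually; don't let Logstash install or overwrite one
        manage_template => false
        user => "elastic"
        password => "*******"
    }
}
```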

My explanation was not clear and seems to have misled you, so let me correct it.

I had not PUT the template at the time of the first question.
Instead, I set the file path of template.json in the conf file and had Logstash create the index first.
Then, after your first answer,

I'd start by posting that template directly to Elasticsearch and seeing what it says. I don't think that template is valid as it doesn't contain things like an index pattern to match (for eg).

I PUT the template, removed the template settings from the conf file, and had Logstash create the index.

The thread is getting confusing and difficult to follow, so let me restate what I want to know:

  1. Is it possible to create an index with Logstash by specifying template.json in the conf file, without PUTting the template first?

  2. If the method in question 1 does not work, is it correct to first PUT the template and then create the index from Logstash?
    In that case, how can I prevent fields that are not specified in the template from being included in the mapping of the created index?
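On point 2: the extra fields (`@timestamp`, `@version`, `host`, `path`, `message`) are added to every event by Logstash itself, and Elasticsearch's dynamic mapping then maps them. One way to keep them out, sketched here under the assumption that those fields are not needed downstream, is to drop them with a `mutate` filter:

```
filter {
    csv {
        separator => ","
        columns => ["id","title","foreign_or_domestic","genre","actor","rate","director","imageUrl"]
        convert => { "id" => "integer" }
        skip_header => true
    }
    mutate {
        # Drop the bookkeeping fields Logstash adds to every event
        remove_field => ["@version", "host", "path", "message"]
    }
}
```

Alternatively, setting `"dynamic": false` inside `"mappings"` in the template stops Elasticsearch from adding unmapped fields to the mapping (they remain in `_source` but are not indexed); note that `"dynamic": "strict"` would instead reject documents containing them.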

I apologize for my poor English, but I hope you can answer my question.

Yes, it is.

So how do you do that?
In my case, the result was as shown in the very first post ("Failed to install template").

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.