Specifying template and template_name in the Logstash Elasticsearch output fails with "Failed to install template"

I am running this against an Elasticsearch Service (Elastic Cloud) instance.
Do I need to PUT the template in advance?
(But if so, there would be no point in specifying the template JSON file and template_name in the conf file in the first place...)

Environment
MacBook Pro
logstash-7.13.0

Contents of the conf file

input {
    file {
        mode => "tail"
        path => ["/Users/UserName/logstash-7.13.0/testdata.csv"]
        start_position => "beginning"
        sincedb_path => "/Users/UserName/logstash-7.13.0/sincedb.txt"
        codec => plain { 
            charset => "UTF-8"
        }
    }
}

filter {
    csv {
        separator => ","
        columns => ["id","title","foreign_or_domestic","genre","actor","rate","director","imageUrl"]
        convert => {
            "id" => "integer"
        }
        skip_header => true
    }
}

output {
    elasticsearch {
        hosts => [ "https://4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243" ]
        index => "mynd-demo-item-test"
        template => "/Users/UserName/logstash-7.13.0/config/test-template.json"
        template_name => "test-template"
        template_overwrite => true
        user => "elastic"
        password => "*******"
    }
    stdout {
        codec => json
    }
}

Contents of test-template.json

{
  "test-template": {
    "mappings": {
      "properties": {
        "title": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "foreign_or_domestic": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "imageUrl": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "actor": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "director": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "rate": {
          "type": "long"
        },
        "genre": {
          "type": "text",
          "analyzer": "my_kuromoji_analyzer"
        },
        "id": {
          "type": "long"
        }
      }
    }
  }
}
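For reference, the error URL shows the file is sent as the body of `PUT _template/test-template`, i.e. the legacy index-template API, which expects `index_patterns`, `settings`, `mappings`, etc. at the top level rather than wrapped under the template name. A minimal sketch of that shape (the index pattern here is an assumption, and only one field is shown):

```json
{
  "index_patterns": ["mynd-demo-item-test*"],
  "mappings": {
    "properties": {
      "title": {
        "type": "text",
        "analyzer": "my_kuromoji_analyzer"
      }
    }
  }
}
```

Note also that `my_kuromoji_analyzer` is referenced but not defined anywhere in the file; a custom analyzer normally has to be declared under `settings.analysis` in the same template (or already exist on the target index) for indexing to succeed.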

Execution log

sudo bin/logstash -f config/test.conf
Using bundled JDK: /Users/UserName/logstash-7.13.0/jdk.app/Contents/Home
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to /Users/UserName/logstash-7.13.0/logs which is now configured via log4j2.properties
[2021-06-09T13:01:10,368][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/UserName/logstash-7.13.0/config/log4j2.properties
[2021-06-09T13:01:10,380][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.0", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [darwin-x86_64]"}
[2021-06-09T13:01:10,552][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-06-09T13:01:11,637][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-06-09T13:01:12,322][INFO ][org.reflections.Reflections] Reflections took 61 ms to scan 1 urls, producing 24 keys and 48 values
[2021-06-09T13:01:13,480][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243"]}
[2021-06-09T13:01:13,886][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/]}}
[2021-06-09T13:01:14,627][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/"}
[2021-06-09T13:01:14,766][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.4.2) {:es_version=>7}
[2021-06-09T13:01:14,769][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-06-09T13:01:14,907][INFO ][logstash.outputs.elasticsearch][main] Using mapping template from {:path=>"/Users/UserName/logstash-7.13.0/config/test-template.json"}
[2021-06-09T13:01:14,938][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/Users/UserName/logstash-7.13.0/config/test.conf"], :thread=>"#<Thread:0x72b5a0ac run>"}
[2021-06-09T13:01:15,002][INFO ][logstash.outputs.elasticsearch][main] Installing Elasticsearch template {:name=>"test-template"}
[2021-06-09T13:01:15,188][ERROR][logstash.outputs.elasticsearch][main] Failed to install template {:message=>"Got response code '400' contacting Elasticsearch at URL 'https://4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/_template/test-template'", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :backtrace=>["/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:306:in `perform_request_to_url'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:293:in `block in perform_request'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:382:in `with_connection'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:292:in `perform_request'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:300:in `block in Pool'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:390:in `template_put'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:84:in `template_install'", 
"/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:29:in `install'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch/template_manager.rb:17:in `install_template'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch.rb:496:in `install_template'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch.rb:309:in `finish_register'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/outputs/elasticsearch.rb:279:in `block in register'", "/Users/UserName/logstash-7.13.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.2-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:145:in `block in after_successful_connection'"]}
[2021-06-09T13:01:15,952][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.01}
[2021-06-09T13:01:16,181][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-06-09T13:01:16,214][INFO ][filewatch.observingtail  ][main][d370e9c2ca333187908d622d7289e760793d2ca3cb939a53e2ad6f141894e826] START, creating Discoverer, Watch with file and sincedb collections
[2021-06-09T13:01:16,256][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}