Execution log stops partway when creating an index in Elasticsearch Service from Logstash

As a test I tried to create an index with the conf file below, but the execution log stops partway through, and when I query the cluster to check whether the index was created, it does not exist. I cannot tell which setting is wrong.

I would appreciate any guidance.
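
For reference, this is roughly how I check whether the index exists (the endpoint and password here are the same placeholders as in the conf file below); test_index does not show up in the list:

curl -u elastic:my_elasticsearch_password "https://elasticsearch_endpoint_url/_cat/indices?v"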

Article I referenced

Contents of the conf file I used

input {
    file {
        mode => "tail"
        path => ["/Users/User_Name/logstash-7.13.0/testdata.csv"]
        start_position => "beginning"
        codec => plain { 
            charset => "UTF-8"
        }
    }
}

filter {
    csv {
        columns => ["id","title","foreign_or_domestic","genre","actor","rate","director","imageUrl"]
        convert => {
            "id" => "integer"
        }
        skip_header => true
    }
}

output {
    elasticsearch {
        hosts => [ "https://elasticsearch_endpoint_url" ]
        index => "test_index"
        user => "elastic"
        password => "my_elasticsearch_password"
    }
    stdout {
        codec => rubydebug
    }
}
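
For reference, my understanding is that a tail-mode file input keeps running and waiting for new lines, and that the generated sincedb remembers how far a file has already been read between runs. For a one-shot CSV load I would otherwise expect to write the input more like the sketch below (using the file input's documented mode, sincedb_path, and file_completed_action options; the completed.log path is just a placeholder), but I have not verified this in my environment:

input {
    file {
        mode => "read"                      # read the whole file once instead of tailing it
        path => ["/Users/User_Name/logstash-7.13.0/testdata.csv"]
        sincedb_path => "/dev/null"         # do not remember read positions between runs
        file_completed_action => "log"      # log the file name instead of deleting the file
        file_completed_log_path => "/Users/User_Name/logstash-7.13.0/completed.log"  # placeholder path
        codec => plain {
            charset => "UTF-8"
        }
    }
}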

Command I ran

sudo bin/logstash -f config/test.conf

Execution log

OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to /Users/UserName/logstash-7.13.0/logs which is now configured via log4j2.properties
[2021-06-08T12:38:20,145][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/UserName/logstash-7.13.0/config/log4j2.properties
[2021-06-08T12:38:20,156][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.0", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [darwin-x86_64]"}
[2021-06-08T12:38:20,237][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-06-08T12:38:20,896][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-06-08T12:38:22,777][INFO ][org.reflections.Reflections] Reflections took 237 ms to scan 1 urls, producing 24 keys and 48 values
[2021-06-08T12:38:24,640][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243"]}
[2021-06-08T12:38:24,933][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/]}}
[2021-06-08T12:38:25,939][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@4afbf336dc56486b9c3cf1924d0e1361.asia-northeast1.gcp.cloud.es.io:9243/"}
[2021-06-08T12:38:26,075][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.4.2) {:es_version=>7}
[2021-06-08T12:38:26,077][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-06-08T12:38:26,195][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2021-06-08T12:38:26,222][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/Users/UserName/logstash-7.13.0/config/test.conf"], :thread=>"#<Thread:0x2df41202 run>"}
[2021-06-08T12:38:27,309][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.08}
[2021-06-08T12:38:27,569][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/Users/UserName/logstash-7.13.0/data/plugins/inputs/file/.sincedb_70e668f40bb76241a4ce142f8dba4d7b", :path=>["/Users/UserName/logstash-7.13.0/testdata.csv"]}
[2021-06-08T12:38:27,586][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-06-08T12:38:27,635][INFO ][filewatch.observingtail  ][main][9457f4cd1ff3a93160a05a139ade7bdea2b446bba23c86ab749f4e152f978b46] START, creating Discoverer, Watch with file and sincedb collections
[2021-06-08T12:38:27,646][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
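
The log stops here and nothing further is printed. If more detail would help, I can rerun with verbose logging, for example using Logstash's standard --log.level flag:

sudo bin/logstash -f config/test.conf --log.level=debug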
