Stuck at "Successfully started Logstash API endpoint {:port=>9600}" and nothing happens when importing CSV

Hi. I have a problem importing a CSV file into Elasticsearch through Logstash.

1,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,I have never had a feeling as pure a,0.03,14.479
2,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,proud as completing a mission all you,8.69,5.819
3,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,everything we've done in the last 17,14.54,5.92
4,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,years trying to make a difference only,16.89,5.059
5,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,never took it done,20.46,4.079
6,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,you've been found five times for your,21.949,4.781
7,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,country and you can't even afford to,24.539,4.771
8,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,send your kids to college I got a job,26.73,6.329
9,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,for you I'm retired fish I need a pilot,29.31,5.76

This is my CSV file for testing, just ten rows, and I tried to import it into Elasticsearch through Logstash. (This is the only way I know; if you know any other way to index my CSV file into an Elasticsearch index, please let me know.)
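
As an aside, one alternative to Logstash is pushing the rows straight at the Elasticsearch bulk API. The sketch below, using only the Python standard library, builds the NDJSON request body from CSV text with the same column layout as the sample above; actually sending it (e.g. with `urllib.request` or `curl -X POST "localhost:9200/_bulk"`) is left out so the example stays self-contained, and the `subtitles` index name is taken from the config in this post.

```python
# Sketch: turn CSV rows into an Elasticsearch _bulk NDJSON body,
# bypassing Logstash entirely. Assumes the six-column layout shown above.
import csv
import io
import json

COLUMNS = ["id", "title", "defaultLanguage", "caption", "start", "duration"]

def csv_to_bulk_body(csv_text: str, index: str = "subtitles") -> str:
    """Build the body for POST /_bulk: an action line then a document line per row."""
    lines = []
    for row in csv.reader(io.StringIO(csv_text)):
        doc = dict(zip(COLUMNS, row))
        # start/duration are numeric in the sample data
        doc["start"] = float(doc["start"])
        doc["duration"] = float(doc["duration"])
        lines.append(json.dumps({"index": {"_index": index, "_id": doc["id"]}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # the bulk API requires a trailing newline

sample = "1,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,I have never had a feeling as pure a,0.03,14.479\n"
print(csv_to_bulk_body(sample))
```

Note that `csv.reader` is used rather than `str.split(",")` so that quoted fields containing commas would still parse correctly.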

input {
    file {
        path => "/home/Elastic/logstash/data/subtitles.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        separator => ","
        skip_header => true
        columns => ["id", "title", "defaultLanguage", "caption", "start", "duration"]
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "subtitles"
        document_type => "captions"
    }
    stdout { codec => rubydebug }
}
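
For reference, the csv filter as written keeps every column as a string; Logstash would need a `convert => { "start" => "float", "duration" => "float" }` option on the csv filter to get numeric fields. A rough Python sketch of what the filter produces for one of the sample rows:

```python
# Rough Python equivalent of what the csv filter above does to one event.
# Without a convert option, every parsed value stays a string.
import csv
import io

columns = ["id", "title", "defaultLanguage", "caption", "start", "duration"]
line = "5,Triple Frontier Trailer #2 (2019) | Movieclips Trailers,en,never took it done,20.46,4.079"

row = next(csv.reader(io.StringIO(line)))
event = dict(zip(columns, row))
print(event["caption"])  # never took it done
print(event["start"])    # 20.46 -- still a string, not a float
```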

This is my 'subtitles.conf' file, and I run it like this:

bin/logstash -f config/subtitles.conf

and it shows:

OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/home/zlslsp54/Elastic/logstash/logstash-core/lib/jars/jruby-complete- to field
WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to /home/zlslsp54/Elastic/logstash/logs which is now configured via
[WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.0.1"}
[WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"subtitles", id=>"18db3fc1f542d1ea24ea196ec73ac9798d89acac1ff0589ad4759f2f1e206f00", hosts=>[//localhost:9200], document_type=>"captions", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_13ffd8d1-25cf-4a61-a72a-89c90f40e726", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[INFO ][logstash.outputs.elasticsearch] Using default mapping template
[INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x25bbb3b8 run>"}
[INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ][logstash.javapipeline    ] Pipeline started {""=>"main"}
[INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

As I said, it gets stuck at [2019-05-02T21:13:15,565][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600} and nothing further happens.

I checked my Elasticsearch indices with curl -X GET "localhost:9200/_cat/indices?v"

and the subtitles index I am trying to create is not there.

There are some WARN messages; I searched Google but could not find a solution for them.

Thank you for reading.

I would suggest downgrading to an older version of Java.

I am facing a similar issue.

Did it work after downgrading Java? If yes, what version are you on currently?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.