Logstash setup with Docker - exits after ~25s, not able to find the problem

Hi all,

I'm trying to insert a large CSV file into Elasticsearch using Logstash.
I managed to make it work, but after testing some different mutate transformations for a while it stopped working.
I even tried a minimal config file (see below), which doesn't work either.
The log file doesn't show any specific issue, only a generic one ("Expected one of [ \\t\\r\\n], \"#\", \"input\", \"filter\", \"output\" at line 1, column 1 (byte 1)"), which is shown in full in the logs below.

Every time I changed the config file, I removed the container and created it again. And while it was still working, I also deleted the index in Elasticsearch between runs (DELETE /g_a).

I checked other threads about the Logstash config file: I verified that the config file is UTF-8 encoded and that there are no syntax errors, and I even rewrote everything by hand to make sure no invisible characters had been introduced by copy-pasting from some webpage.
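In case it helps anyone doing the same checks: a BOM or other invisible byte can be spotted directly from the file's bytes. A minimal sketch using standard tools (the sample file here is created just for the demonstration; substitute the real logstash.conf):

```shell
# Create a sample config that starts with a UTF-8 BOM (bytes EF BB BF), the
# kind of invisible prefix an editor or copy-paste can introduce.
printf '\357\273\277input { stdin {} }\n' > /tmp/sample.conf

# Dump the first bytes: a BOM shows up as "ef bb bf" before the first
# visible character.
head -c 16 /tmp/sample.conf | od -A x -t x1z
```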

Docker commands:

docker network create elasticnet

docker run -d --name elasticsearch --net elasticnet -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" -v d:/elk/elasticsearch/data:/usr/share/elasticsearch/data elasticsearch:7.14.0

docker run -d --name kibana --net elasticnet -p 5601:5601 kibana:7.14.0

docker run -d --name logstash --net elasticnet -v d:/elk/logstash/pipeline:/usr/share/logstash/pipeline -v d:/si/gn:/home logstash:7.14.0

Logstash config file (logstash.conf):

input {
  file {
    path => "/home/all/all.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv{
      separator => "	"
      skip_header => "false"
      columns => ["gd","name","asciiname","alternatenames","latitude","longitude","feature class","feature code","country code","cc2","admin1 code","admin2 code","admin3 code","admin4 code","population","elevation","dem","timezone","modification date"]
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch", "http://elasticsearch:9200", "elasticsearch:9200", "http://localhost:9200"]
    index => "g_a"
  }
  stdout {}
}
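For completeness, the config can also be syntax-checked without starting the full pipeline, using Logstash's standard --config.test_and_exit flag. Something along these lines, assuming the same bind mount as above (the exact docker invocation may vary between image versions):

```shell
# Throwaway container that only parses the mounted pipeline directory and
# then exits; a syntax error is reported with its file, line and column.
docker run --rm \
  -v d:/elk/logstash/pipeline:/usr/share/logstash/pipeline \
  logstash:7.14.0 \
  logstash --config.test_and_exit
```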

Logstash logs:

Using bundled JDK: /usr/share/logstash/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-1.17.3/lib/bundler/rubygems_integration.rb:200: warning: constant Gem::ConfigMap is deprecated
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2021-08-12T13:05:56,512][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
[2021-08-12T13:05:56,522][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.14.0", "jruby.version"=>"jruby 9.2.19.0 (2.5.8) 2021-06-15 55810c552b OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-08-12T13:05:56,548][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2021-08-12T13:05:56,560][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[2021-08-12T13:05:56,835][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"e71a12c4-be48-428d-be6c-7f579c34ccaf", :path=>"/usr/share/logstash/data/uuid"}
[2021-08-12T13:05:57,522][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set xpack.monitoring.enabled: true in logstash.yml
[2021-08-12T13:05:57,524][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
Please configure Metricbeat to monitor Logstash. Documentation can be found at:
Collect Logstash monitoring data with Metricbeat | Logstash Reference [8.11] | Elastic
[2021-08-12T13:05:57,805][WARN ][deprecation.logstash.outputs.elasticsearch] Relying on default value of pipeline.ecs_compatibility, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2021-08-12T13:05:58,086][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2021-08-12T13:05:58,276][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2021-08-12T13:05:58,311][INFO ][logstash.licensechecker.licensereader] Elasticsearch version determined (7.14.0) {:es_version=>7}
[2021-08-12T13:05:58,312][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2021-08-12T13:05:58,386][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
[2021-08-12T13:05:58,387][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
[2021-08-12T13:05:58,666][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-08-12T13:05:59,079][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"input\", \"filter\", \"output\" at line 1, column 1 (byte 1)", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:391:in `block in converge_state'"]}
[2021-08-12T13:05:59,152][INFO ][org.reflections.Reflections] Reflections took 57 ms to scan 1 urls, producing 120 keys and 417 values
[2021-08-12T13:05:59,631][WARN ][deprecation.logstash.outputs.elasticsearchmonitoring] Relying on default value of pipeline.ecs_compatibility, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2021-08-12T13:05:59,670][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://elasticsearch:9200"]}
[2021-08-12T13:05:59,686][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2021-08-12T13:05:59,694][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2021-08-12T13:05:59,699][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch version determined (7.14.0) {:es_version=>7}
[2021-08-12T13:05:59,700][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2021-08-12T13:05:59,739][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set data_stream => true/false to disable this warning)
[2021-08-12T13:05:59,740][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set data_stream => true/false to disable this warning)
[2021-08-12T13:05:59,747][WARN ][logstash.javapipeline ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2021-08-12T13:05:59,792][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x5c245967 run>"}
[2021-08-12T13:06:00,330][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.54}
[2021-08-12T13:06:00,350][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2021-08-12T13:06:01,732][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[2021-08-12T13:06:02,500][INFO ][logstash.runner ] Logstash shut down.

Thanks!

See this as an example of the many threads about this.

Thank you @Badger
I didn't know that a file with a completely different name could cause this error.
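For future readers: Logstash reads and concatenates every file in the pipeline directory (here the bind-mounted d:/elk/logstash/pipeline), so a stray file there, whatever its name, is parsed as config and fails immediately at line 1, column 1. A minimal local simulation of the situation (the paths and file names here are made up for the demo):

```shell
# Simulate the mounted pipeline directory: one valid config plus a stray file.
mkdir -p /tmp/pipeline-demo
printf 'input { stdin {} }\noutput { stdout {} }\n' > /tmp/pipeline-demo/logstash.conf
printf 'just some notes\n' > /tmp/pipeline-demo/notes.txt   # the culprit

# Logstash would try to parse BOTH of these files as config; listing the
# directory is the quickest way to spot the stray one.
ls /tmp/pipeline-demo
```

The same check against the real container would be a simple `docker exec logstash ls /usr/share/logstash/pipeline`.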
