Problem with logstash-plain.log

Hi, I'm having problems with my logstash-plain.log file. When I open it with the `less` command, this is the output I see:

[2020-03-06T11:35:43,304][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-03-06T11:35:43,624][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-03-06T11:35:47,022][INFO ][org.reflections.Reflections] Reflections took 67 ms to scan 1 urls, producing 20 keys and 40 values
[2020-03-06T11:35:49,011][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-03-06T11:35:49,543][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-03-06T11:35:49,678][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-03-06T11:35:49,689][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-03-06T11:35:49,783][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2020-03-06T11:35:49,898][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-03-06T11:35:49,991][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-03-06T11:35:50,000][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/logstash-sample.conf"], :thread=>"#<Thread:0x53d572df run>"}
[2020-03-06T11:35:50,039][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-03-06T11:35:52,305][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-03-06T11:35:52,329][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-03-06T11:35:52,474][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-03-06T11:35:52,503][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
[2020-03-06T11:35:52,968][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-06T11:38:24,324][WARN ][logstash.runner          ] SIGINT received. Shutting down.
[2020-03-06T11:38:29,370][WARN ][logstash.runner          ] Received shutdown signal, but pipeline is still waiting for in-flight events
to be processed. Sending another ^C will force quit Logstash, but this may cause
data loss.
[2020-03-06T11:38:29,592][WARN ][org.logstash.execution.ShutdownWatcherExt] {"inflight_count"=>0, "stalling_threads_info"=>{"other"=>[{"thread_id"=>34, "name"=>"[main]<beats", "current_call"=>"[...]/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.4-java/lib/logstash/inputs/beats.rb:204:in `run'"}, {"thread_id"=>30, "name"=>"[main]>worker0", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:262:in `block in start_workers'"}, {"thread_id"=>31, "name"=>"[main]>worker1", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:262:in `block in start_workers'"}, {"thread_id"=>32, "name"=>"[main]>worker2", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:262:in `block in start_workers'"}, {"thread_id"=>33, "name"=>"[main]>worker3", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:262:in `block in start_workers'"}]}}
[2020-03-06T11:38:29,622][ERROR][org.logstash.execution.ShutdownWatcherExt] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.
[2020-03-06T11:38:30,951][INFO ][logstash.javapipeline    ] Pipeline terminated {"pipeline.id"=>"main"}
[2020-03-06T11:38:32,024][INFO ][logstash.runner          ] Logstash shut down.

Can someone help me solve the problem?

Thanks a lot!

@Fabio-sama @xeraa help me please

Hi Pablo,

can you give some more context for your problem? I can't tell what's wrong without knowing your architecture or your Logstash config file.

Received shutdown signal, but pipeline is still waiting for in-flight events
to be processed. Sending another ^C will force quit Logstash, but this may cause

This makes me think you terminated the process while it was running (maybe because you weren't receiving anything? Who knows...).

Then you have an error

The shutdown process appears to be stalled due to busy or blocked plugins.

most likely because you shut the process down while some plugins were still busy or loading.

So, what exactly is your problem? You can't simply post a messy log dump and shout "Please somebody help! Things are broken!".

Explain your EXACT problem (or at least what you think it is), provide useful info like configuration files and pipelines (properly spaced and formatted), and then wait for an answer before compulsively tagging other people.

Thank you

First of all, thanks for your quick answer.

Received shutdown signal, but pipeline is still waiting for in-flight events
to be processed. Sending another ^C will force quit Logstash, but this may cause

As for this part: I didn't stop the process.

Also, thanks for the advice. I'm relatively new to this forum and I'm still learning.

This is my logstash config file:

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
  }
}

#filter {
#
#}

output {
   #stdout {codec => rubydebug}
  elasticsearch {
    hosts => ["http://localhost:9200"]
    #index => "%{[@metadata][beat]}-%{[@metadata][ version]}"
    #index => "indiceprueba"
    #user => "elastic"
    #password => "changeme"
  }
   stdout {codec => rubydebug}
}

What I want is to connect Filebeat to Logstash and see the logs in Kibana. Right now I'm testing whether Logstash works correctly, and apparently it doesn't.

I don't know if that makes it any clearer.

No problem.

Anyway, it's strange that you didn't stop the process and yet it logs Sending another ^C will force quit Logstash. Are you sure you didn't quit it, maybe by mistake?

Also, just remove everything, even the elasticsearch section in the output. Simply leave the input from beats and the output to stdout, and launch Logstash so you can see in real time what is happening (bin/logstash -f /whatever_your_conf_file_is).
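In other words, a stripped-down debug pipeline would be your own config minus the elasticsearch block, something like this (the filename and path are whatever yours are):

```conf
# Minimal debug pipeline: receive from Beats, print every event to the console.
input {
  beats {
    port => 5044
  }
}

output {
  # rubydebug pretty-prints each event so you can verify data is arriving
  stdout { codec => rubydebug }
}
```

If events show up on stdout, the Filebeat → Logstash leg works and the problem is downstream, in the elasticsearch output.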

Finally, can you post your Beats YAML configuration here? Did you enable the output to Logstash in that YAML config file?
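For reference, the relevant sections of filebeat.yml usually look something like the sketch below (the log path is a placeholder); note that the elasticsearch output must be commented out when the logstash output is enabled, since Filebeat allows only one output at a time:

```conf
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /path/to/your/file.log   # placeholder: the file you want to ship

# output.elasticsearch:          # must stay disabled
#   hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]      # must match the port in the beats input
```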

Yes, I'm sure I'm not stopping the process.

> bin/logstash --path.settings /etc/logstash/ -f /etc/logstash/conf.d/logstash-sample.conf
> OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/usr/share/logstash/logstash-core/lib/jars/jruby-complete-9.2.9.0.jar) to method sun.nio.ch.NativeThread.signal(long)
> WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
> WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
> [2020-03-17T12:48:51,415][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
> [2020-03-17T12:48:51,643][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
> [2020-03-17T12:48:55,126][INFO ][org.reflections.Reflections] Reflections took 56 ms to scan 1 urls, producing 20 keys and 40 values
> [2020-03-17T12:48:56,608][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
> [2020-03-17T12:48:56,696][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/logstash-sample.conf"], :thread=>"#<Thread:0x6f7d594a run>"}
> [2020-03-17T12:48:59,205][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
> [2020-03-17T12:48:59,233][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
> [2020-03-17T12:48:59,458][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
> [2020-03-17T12:48:59,458][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
> [2020-03-17T12:49:00,005][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

I think it's running. But I have a question: do you know what the WARNING lines are? Is it because I'm not using the latest version?

The problem is that I'm trying to connect Logstash and Filebeat and see the logs in Kibana, and it's not working.
I did it with Filebeat alone and that works, but when I use Logstash I don't see anything in Kibana.
So I don't know what the problem could be.

And yes, I've enabled the Logstash output in my filebeat.yml, and I've also set the path of the file I want to extract the logs from.

Thanks for the help

I think it's running. But I have a question: do you know what the WARNING lines are?

This might be the cause. What version of Java are you running? And which version of logstash?

I'm using Java 11.0.6 and Logstash 7.6.0.

The solution proposed in the link you sent me isn't working for me.

But I think those WARNINGs shouldn't affect the correct functioning of Logstash. Am I wrong?

Yeah, I assumed so too. But they might be linked to the problem you're having.
This guy seemed to have had a very similar issue, but he cleverly decided not to post his solution here. You can try to contact him in private.

Anyway, you're sure your Filebeat is actually picking something up and sending it to Logstash, right? Did you try tcpdump to see the traffic between the two?

Yeah, I'm sure, because I see it in the terminal. But I don't see the data in Kibana (Discover). I don't know how to use tcpdump :sweat_smile:

There is an error saying the data isn't being indexed in Elasticsearch.
I'll show you the command I'm using to start Logstash and the error that appears when I start Filebeat.

# bin/logstash --path.settings /etc/logstash/ -f /etc/logstash/conf.d/logstash-sample.conf
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/usr/share/logstash/logstash-core/lib/jars/jruby-complete-9.2.9.0.jar) to method sun.nio.ch.NativeThread.signal(long)
WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-03-19T17:05:10,652][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-03-19T17:05:10,878][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-03-19T17:05:14,763][INFO ][org.reflections.Reflections] Reflections took 95 ms to scan 1 urls, producing 20 keys and 40 values
[2020-03-19T17:05:17,544][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-03-19T17:05:18,007][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-03-19T17:05:18,142][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-03-19T17:05:18,167][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-03-19T17:05:18,282][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2020-03-19T17:05:18,449][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-03-19T17:05:18,526][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-03-19T17:05:18,537][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/logstash-sample.conf"], :thread=>"#<Thread:0x5e74a8da run>"}
[2020-03-19T17:05:18,640][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-03-19T17:05:21,209][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-03-19T17:05:21,233][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-03-19T17:05:21,431][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
[2020-03-19T17:05:21,456][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-03-19T17:05:21,915][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-19T17:06:40,647][ERROR][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-%{[@metadata][ version]}", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x35f571de>], :response=>{"index"=>{"_index"=>"filebeat-%{[@metadata][ version]}", "_type"=>"_doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"invalid_index_name_exception", "reason"=>"Invalid index name [filebeat-%{[@metadata][ version]}], must not contain the following characters [ , \", *, \\, <, |, ,, >, /, ?]", "index_uuid"=>"_na_", "index"=>"filebeat-%{[@metadata][ version]}"}}}}
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated

PS: I have deleted some of the repeated ERROR lines, because they all say the same thing. After all of this output, as I said previously, I can see in the terminal all the data I'm sending, with the index structure from my Logstash config file. But not in Kibana.

Thanks a lot for "wasting your time" on trying to help me!

Hold on, why is there a space before version in your index %{[@metadata][beat]}-%{[@metadata][ version]}? Do you still get this error if in your elasticsearch output you specify the index like:

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "testing_index"
  }
  
  stdout {}
}

The issue could be with the brackets... In my experience today, the Logstash configuration file didn't accept square brackets ([]). Please remove them and try curly braces ({}) instead.

Square brackets are needed to reference nested fields. In this case he needs the version field nested in the metadata field, so square brackets are fine. The problem is that Logstash is seeing that string literally instead of interpolating the variable, so it's trying to create an index with literal square brackets in its name (which is not allowed). That is happening because the space before version shouldn't be there.
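Concretely, removing the stray space should be enough; based on the config posted earlier in the thread, the fixed elasticsearch output would look like:

```conf
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # no space inside the field reference: [version], not [ version]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
  }
}
```

With the space removed, Logstash interpolates the metadata fields and produces a valid index name like filebeat-7.6.0 instead of the literal string that triggered the invalid_index_name_exception.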

OMG, you are a genius! Thanks a lot, that was the problem! Thanks for your time!
Really appreciate your help!

No problem at all :slight_smile:

Now I'm having trouble with this part:

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "testing_index"
  }
  
  stdout {}
}

When I fixed the space error, the logs showed up in Kibana. But when I change my index to index => "testing_index", it's as if Elasticsearch isn't receiving any data.

Do you know what could be the problem?

I'm testing with a CSV file, and every time I stop Filebeat I delete the registry, because for now it's just a test.