Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:vpn, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", [A-Za-z0-9_-], '"', "'", [A-Za-z_], "-", [0-9], "[", "{" at line 4, column 43 (byte 108) after input {\n kafka {\n bootstrap_servers => "lc-helk.nic.in:9092"\n topics => ["vpn1","vpn2","vpn7","ise",", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:383:in block in converge_state'"]}
Here is my input file:
input {
  kafka {
    bootstrap_servers => "x.x.x.x:9092"
    topics => ["vpn1","vpn2","vpn7","ise"]
    decorate_events => true
    #codec => "json"
    max_poll_records => 100
    max_poll_interval_ms => 600000
    ############################# HELK Kafka Group Consumption #############################
    # Prevents Logstash from continuously restarting consumption of docs/logs it already has. If you do need it to restart, change the 'group_id' value to something else (e.g. a simple value like '100_helk_logstash')
    enable_auto_commit => true
    # During group_id or client_id changes, the Kafka client will consume from the earliest document so as not to lose data
    auto_offset_reset => "earliest"
    # If you have multiple Logstash instances, this is your ID so that each instance consumes a slice of the Kafka pie.
    # No need to change this unless you know what you're doing and have a specific need
    group_id => "vpn_user"
    # Change to the number of Kafka partitions; only change/set this if scaling in a large environment & you have customized your Kafka partitions
    # Default value is 1; read the documentation for more info: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html#plugins-inputs-kafka-consumer_threads
    consumer_threads => 3
    #heartbeat_interval_ms => 1000
    #poll_timeout_ms => 10000
    #session_timeout_ms => 120000
    #request_timeout_ms => 130000
  }
}
Are the input and output in separate files, or is that one file?
If they are in separate files, are there other files being pulled in? In other words, are there other files in the same conf directory?
Can you show the complete pipeline altogether?
It really helps us if you give us complete context at one time.
Basically, that error is a basic syntax error. What can make it a little hard to track down is that the reported line number is sometimes not the exact line, because the file is missing a closing bracket, quote, or similar later on.
I would recommend double-checking your config. If you copied it from somewhere else, delete it and write it again, and check your double quotes: depending on where you edit the file, they may not be the correct characters.
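As a quick check, Logstash can parse the config and exit without starting the pipeline, which surfaces the same syntax error. A minimal sketch, assuming a standard package install at /usr/share/logstash and configs in /etc/logstash/conf.d (adjust the paths for your setup):

```shell
# Validate the pipeline config and exit; any parse error is reported
# with the same line/column message as in the log above
/usr/share/logstash/bin/logstash --config.test_and_exit \
  -f /etc/logstash/conf.d/
```

If the files parse cleanly, Logstash reports the configuration as valid and exits; otherwise it prints the offending line and column, which you can compare against your file.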