Expected one of #, \", ', -, [, / at line 77, column 52 (byte 2039)

(Bao Thai) #1

Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, ", ', -, [, / at line 77, column 52 (byte 2039) after filter {\n\t#Extract key value pairs from fields\n\tkv {\n\t\tsource => "_fields"\n\t\tfield_split => ";"\n\t}\n # Unify the format of MessageType\n if ([operation] == "unmarshalRequest") {\n mutate {\n add_field => {"messageType" => "%{request}Request"}\n }\n }\n if ([messageType] == "CreateOrder" or [messageType] == "ChangeOrder") {\n if ([stage] == "end") {\n mutate {\n update => {"messageType" => "%{messageType}Response"}\n }\n }\n }\n\truby {\n\t code => "event.set('kafka_retry_time', (Time.now.getutc.to_i - event.get('@timestamp').to_i) / 60)"\n\t}\n\n # check the retry time and retry count\n\tif ([kafka_retry_count] and [kafka_retry_count] > ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

filter {
	#Extract key value pairs from fields
	kv {
		source => "_fields"
		field_split => ";"
	}

    # Unify the format of MessageType
    if ([operation] == "unmarshalRequest") {
        mutate {
            add_field => {"messageType" => "%{request}Request"}
        }
    }
    if ([messageType] == "CreateOrder" or [messageType] == "ChangeOrder") {
        if ([stage] == "end") {
            mutate {
                update => {"messageType" => "%{messageType}Response"}
            }
        }
    }

	ruby {
	    code => "event.set('kafka_retry_time', (Time.now.getutc.to_i - event.get('@timestamp').to_i) / 60)"
	}

    # check the retry time and retry count
	if ([kafka_retry_count] and [kafka_retry_count] > ${KAFKA_MAX_RETRY_COUNT} and [kafka_retry_time] > ${KAFKA_MAX_RETRY_TIME}) {
	    ruby {
	        code => "puts 'Max Retries Reached:' + event.get('kafka_retry_count').to_s + '---' + event.get('kafka_retry_time').to_s + '---' + event.get('orderOperationCorrelationID').to_s"
	    }
	}
}
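For reference, the `code` string in the ruby filter computes how many whole minutes have elapsed since the event's `@timestamp`. A minimal standalone Ruby sketch of that arithmetic (the timestamps here are made up for illustration; in Logstash they would come from `event.get('@timestamp')`):

```ruby
# Stand-ins for the event's @timestamp and the current time
event_timestamp = Time.utc(2024, 1, 1, 12, 0, 0)
now             = Time.utc(2024, 1, 1, 12, 45, 30)

# Same arithmetic as the ruby filter: elapsed seconds, integer-divided by 60,
# so partial minutes are truncated
kafka_retry_time = (now.getutc.to_i - event_timestamp.to_i) / 60
puts kafka_retry_time  # => 45
```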

I can build this on my local machine, but on a remote machine I keep getting this error message.

*edited to fix typos in the error message and code


And you say your configuration contains

if ([kafka_retry_count] and [kafka_retry_time] > ${KAFKA_MAX_RETRY_TIME}) {

That is not consistent (one is checking the retry_time and the other the retry_count), but it suggests it is failing when it tries to parse the environment substitution ${KAFKA_MAX_RETRY_TIME}.

The documentation says "You can set environment variable references in the configuration for Logstash plugins by using ${var}." You are trying to use an environment variable reference in the logic of the filter, which is not a plugin configuration. It is not, as far as I know, expected to work.
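To illustrate the distinction (this is a sketch, not a tested config; the variable name is the one from the post above):

```
filter {
  # Valid: ${VAR} inside a plugin option is substituted by Logstash
  mutate {
    add_field => { "max_retries" => "${KAFKA_MAX_RETRY_COUNT}" }
  }

  # Invalid: ${VAR} in conditional logic is not a plugin option,
  # so the config parser rejects it with "Expected one of #, ..."
  # if [kafka_retry_count] > ${KAFKA_MAX_RETRY_COUNT} { ... }
}
```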

(Bao Thai) #3


Thanks for your response. Regarding the inconsistency, that was my fault: I pasted the wrong code, which I had modified after the error message. Please see the latest edit.

Actually, I have tried using plain integers instead of the environment variables, and there still seems to be some hidden underlying issue (which I suspect is not the environment variables, since the same config builds on my local machine). I will get back to you after further investigation.


If you want to post a configuration, then paste it into the edit pane, select it, and click on </> in the toolbar above the edit pane. That will indent it, which results in the formatting and special characters being preserved.

(Bao Thai) #5

I was actually able to reproduce this issue on my local machine now, and it does seem to be an environment variable issue, since the error goes away once I hard-code a regular value... but this document suggests that environment variables work within filters?



It works within plugins.

filter {
    if "${KAFKA_MAX_RETRY_COUNT}" ...

is NOT inside a plugin. However, you could do

mutate { add_field => { "[@metadata][kafka_max_retry_count]" => "${KAFKA_MAX_RETRY_COUNT}" } }
mutate { convert => { "[@metadata][kafka_max_retry_count]" => "integer" } }
if [@metadata][kafka_max_retry_count] ...

and that will work. (Fields inside [@metadata] are not added to the event that is output, so it is handy for stashing temporary variables.)
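Putting that together with the original config, the retry check could be sketched roughly like this (untested; field and variable names are taken from the posts above):

```
filter {
  # Pull both limits from the environment inside plugin options,
  # where ${VAR} substitution is supported
  mutate {
    add_field => {
      "[@metadata][max_retry_count]" => "${KAFKA_MAX_RETRY_COUNT}"
      "[@metadata][max_retry_time]"  => "${KAFKA_MAX_RETRY_TIME}"
    }
  }
  mutate {
    convert => {
      "[@metadata][max_retry_count]" => "integer"
      "[@metadata][max_retry_time]"  => "integer"
    }
  }

  # Conditionals may then compare against the @metadata fields
  if [kafka_retry_count] and [kafka_retry_count] > [@metadata][max_retry_count] and [kafka_retry_time] > [@metadata][max_retry_time] {
    ruby {
      code => "puts 'Max Retries Reached:' + event.get('kafka_retry_count').to_s"
    }
  }
}
```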

(system) closed #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.