Logstash starting error with JSON codec plugin

Hi,

I'm trying to send logs from rsyslog in JSON format to my Logstash v8.6 server, but there seems to be a problem with my JSON codec.

Here is the error log:

[2023-01-27T11:31:47,917][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2023-01-27T11:31:47,923][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.6.0", "jruby.version"=>"jruby 9.3.8.0 (2.6.8) 2022-09-13 98d69c9461 OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-linux]"}
[2023-01-27T11:31:47,926][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-01-27T11:31:48,585][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-01-27T11:31:48,973][INFO ][org.reflections.Reflections] Reflections took 141 ms to scan 1 urls, producing 127 keys and 444 values
[2023-01-27T11:31:49,027][ERROR][logstash.plugins.registry] Unable to load plugin. {:type=>"codec", :name=>"\"json\""}
[2023-01-27T11:31:49,038][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (PluginLoadingError) **Couldn't find any codec plugin named '\"json\"'**. Are you sure this is correct? Trying to load the \"json\" codec plugin resulted in this error: Unable to load the requested plugin named \"json\" of type codec. The plugin is not installed.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:181)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:846)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1229)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:131)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.RubyClass.newInstance(RubyClass.java:911)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:549)", 
"org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:92)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:226)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:393)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:206)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:325)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:309)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:107)", "java.base/java.lang.Thread.run(Thread.java:833)"]}
[2023-01-27T11:31:49,060][INFO ][logstash.runner          ] Logstash shut down.
[2023-01-27T11:31:49,072][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit

The thing is, the JSON codec and filter are installed and up to date:

# /usr/share/logstash/bin/logstash-plugin list | grep json
logstash-codec-json
logstash-codec-json_lines
logstash-filter-json

Here is my rsyslog conf file:

# cat logstash-json.conf
template(name="json-template"
        type="list"
        option.json="on") {
                constant(value="{")
                        constant(value="\"@timestamp\":\"")     property(name="timereported" dateFormat="rfc3339")
                        constant(value="\",\"@version\":\"1")
                        constant(value="\",\"message\":\"")     property(name="msg")
                        constant(value="\",\"host\":\"")        property(name="hostname")
                        constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
                        constant(value="\",\"facility\":\"")    property(name="syslogfacility-text")
                        constant(value="\",\"programname\":\"") property(name="programname")
                        constant(value="\",\"procid\":\"")      property(name="procid")
                constant(value="\"}\n")
}

action(type="omfwd" target="elkglbvprd1.mediapost.fr" port="10514" protocol="tcp" template="json-template")
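
For reference (values are made up for illustration), a line emitted by this template should look roughly like:

{"@timestamp":"2023-01-27T11:31:47+01:00","@version":"1","message":" some log message","host":"myhost","severity":"info","facility":"daemon","programname":"myprog","procid":"1234"}

The option.json="on" setting makes rsyslog escape any quotes or backslashes inside the property values, so each line stays valid JSON.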

The Logstash configuration file:

# cat syslog-json.conf
input {
        tcp {
                port => 10514
                codec => "json"
                type => "rsyslog"
        }
}

filter {
        # This replaces the host field (UDP source) with the host that generated the message (sysloghost)
        if [sysloghost] {
                mutate {
                        replace => [ "host", "%{sysloghost}" ]
                        remove_field => "sysloghost" # prune the field after successfully replacing "host"
                }
        }
}

output {
        elasticsearch {
                hosts => "https://localhost:9200"
                ssl => true
                cacert => "/etc/logstash/certs/http_ca.crt"
                user => "logstash_writer"
                password => "logstash"
                index => "rsyslog-%{+YYYY.MM.dd}"
        }
}

Any idea on how to solve this is welcome.

Regards,

There is something wrong in your config: the quotes are being escaped, so Logstash is looking for a codec literally named \"json\" (with the backslashes) instead of json.

Did you create it manually or use any automation? Did you create it on the same system you are running Logstash on, or did you create it elsewhere and copy-paste the configuration into the file?

Can you try editing the file, deleting the double quotes, and typing them again to see if it works or gives a different error?
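
For comparison, a minimal input block that should load the codec correctly (both the bare name and the normally quoted name are accepted by the Logstash config parser; only the backslash-escaped form from your error message is wrong):

input {
        tcp {
                port => 10514
                codec => json
                type => "rsyslog"
        }
}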

Hi,

I just replaced my Logstash conf file with this one and it seems to work:

input {
        tcp {
                port => 10514
                codec => "json_lines"
        }
}

filter {
        # This replaces the host field (UDP source) with the host that generated the message (sysloghost)
        if [sysloghost] {
                mutate {
                        replace => [ "host", "%{sysloghost}" ]
                        remove_field => "sysloghost" # prune the field after successfully replacing "host"
                }
        }
}

output {
        elasticsearch {
                hosts => "https://localhost:9200"
                ssl => true
                cacert => "/etc/logstash/certs/http_ca.crt"
                user => "logstash_writer"
                password => "logstash"
                index => "rsyslog-%{+YYYY.MM.dd}"
        }
}

I just have these "errors", which I need to investigate:

[2023-01-27T14:35:32,122][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"rsyslog-%{+YYYY.MM.dd}"}
[2023-01-27T14:35:32,122][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
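
From what I understand, these two lines are only informational: the output is not eligible for data streams because an explicit index name is set, so auto-detection resolves to false. If writing to a classic index is intentional, making the choice explicit with the data_stream setting of the elasticsearch output should make the messages go away (sketch, with the other settings unchanged):

output {
        elasticsearch {
                hosts => "https://localhost:9200"
                data_stream => false
                index => "rsyslog-%{+YYYY.MM.dd}"
        }
}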

Could you quickly explain some best practices, and whether data streams are needed or not?

Thx
