Logstash logs containing huge nested JSON objects

Hey guys,

Since we upgraded our stack components to version 8.10.2, Logstash's internal logging behaviour has changed.

For example, after all pipelines were started, Logstash logs the following message:

{
  "level": "INFO",
  "loggerName": "logstash.agent",
  "timeMillis": 1699364877155,
  "thread": "Agent thread",
  "logEvent": {
    "message": "Pipelines running",
    "count": 21,
    "running_pipelines": [
      {
        "id": 18957,
        "type": {
          "id": 26,
          "idTest": {
            "varargsCollector": false
          },
          "baseName": "Symbol",
          "methods": {
            "next": {
              "implementationClass": {
                "id": 26,
                "idTest": {
                  "varargsCollector": false
                },
                "baseName": "Symbol",
                "methods": {
                  "next": {
                    "implementationClass": {
                      "id": 26,
                      "idTest": {
                        "varargsCollector": false
                      },
                      "baseName": "Symbol",
                      "methods": {
                        "next": {
                          "implementationClass": {
                            "id": 26,
                            "idTest": {
                              "varargsCollector": false
                            },
  ### REPEATING SO OFTEN THAT MY BROWSER CRASHED WHILE TRYING TO ENTER IT HERE ###

          "methods": {
  "next": {
    "implementationClass": {
      "running_pipelines": "[OUR_RUNNING_PIPELINES]",
      "non_running_pipelines": []
     }
  }
}

Another example:

{
  "level": "WARN",
  "loggerName": "logstash.filters.translate",
  "timeMillis": 1699365074524,
  "thread": "Converge PipelineAction::Create<solr>",
  "logEvent": {
    "message": "You are using a deprecated config setting \"field\" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `source` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc.",
    "name": "field",
    "plugin": {
      "var7": {
        "dictionary": {
          "/replication": "replication",
          "/update": "update",
          "/select": "search"
        },
        "id": "3b66853647748445bf34d14a5a6117ad0a5c88ea1621ebd206d9efcdb986f3b3",
        "field": "[solr][path]",
        "target": "[event][action]"
      },
      "var9": {
        "nativeClassIndex": "OBJECT",
        "runtime": {
          "nilPrefilledArray": [
            {
              "nativeClassIndex": "NIL",
              "immediate": true,
              "singletonClass": {
                "id": 15,
                "idTest": {
                  "varargsCollector": false
                },
                "baseName": "NilClass",
                "methods": {
                  "=~": {
                    "implementationClass": {
                      "id": 15,
                      "idTest": {
                        "varargsCollector": false
                      },
                      "baseName": "NilClass",
                      "methods": {
                        "=~": {
                          "implementationClass": {
                            "id": 15,
                            "idTest": {
                              "varargsCollector": false
                            },
  ### TRUNCATED HERE AS WELL, THE SAME RECURSION CONTINUES ###

Logstash runs in a container and produces JSON logs with this log4j2 config (only the relevant sections are shown):

appender.json_console.type = Console
appender.json_console.name = json_console
appender.json_console.layout.type = JSONLayout
appender.json_console.layout.compact = true
appender.json_console.layout.eventEol = true

appender.json_rolling.type = RollingFile
appender.json_rolling.name = json_rolling
appender.json_rolling.fileName = ${sys:ls.logs}/logstash-json.log
appender.json_rolling.filePattern = ${sys:ls.logs}/logstash-json-%d{yyyy-MM-dd}-%i.log.gz
appender.json_rolling.policies.type = Policies
appender.json_rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling.policies.time.interval = 1
appender.json_rolling.policies.time.modulate = true
appender.json_rolling.layout.type = JSONLayout
appender.json_rolling.layout.compact = true
appender.json_rolling.layout.eventEol = true
appender.json_rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.json_rolling.policies.size.size = 100MB
appender.json_rolling.strategy.type = DefaultRolloverStrategy
appender.json_rolling.strategy.max = 30
appender.json_rolling.avoid_pipelined_filter.type = PipelineRoutingFilter

rootLogger.level = ${sys:ls.log.level}

# Select the logging format by commenting out two of the following four lines
#rootLogger.appenderRef.console.ref = plain_console
#rootLogger.appenderRef.rolling.ref = plain_rolling
rootLogger.appenderRef.console.ref = json_console
rootLogger.appenderRef.rolling.ref = json_rolling
rootLogger.appenderRef.routing.ref = pipeline_routing_appender
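
If it comes to that, the comment toggle above would let us fall back to the plain layout as a stopgap. A sketch, assuming the plain_console and plain_rolling appenders are still defined earlier in the file (not shown here):

rootLogger.appenderRef.console.ref = plain_console
rootLogger.appenderRef.rolling.ref = plain_rolling
#rootLogger.appenderRef.console.ref = json_console
#rootLogger.appenderRef.rolling.ref = json_rolling

We would much rather keep the JSON output, though.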

These huge JSON objects also lead to errors when Logstash tries to serialize the log events:

2023-11-07 13:51:15,210 Converge PipelineAction::Create<solr> ERROR com.fasterxml.jackson.core.JsonGenerationException: Can not write a field name, expecting a value com.fasterxml.jackson.core.JsonGenerationException: Can not write a field name, expecting a value
	at com.fasterxml.jackson.core.JsonGenerator._reportError(JsonGenerator.java:2849)
	at com.fasterxml.jackson.core.json.WriterBasedJsonGenerator.writeFieldName(WriterBasedJsonGenerator.java:153)
	at com.fasterxml.jackson.core.JsonGenerator.writeObjectField(JsonGenerator.java:2408)
	at org.logstash.log.CustomLogEventSerializer.serialize(CustomLogEventSerializer.java:53)
	at org.logstash.log.CustomLogEventSerializer.serialize(CustomLogEventSerializer.java:34)
	at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:479)
	at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:318)
	at com.fasterxml.jackson.databind.ObjectWriter$Prefetch.serialize(ObjectWriter.java:1572)
	at com.fasterxml.jackson.databind.ObjectWriter._writeValueAndClose(ObjectWriter.java:1273)
	at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:1114)
	at org.apache.logging.log4j.core.layout.AbstractJacksonLayout.toSerializable(AbstractJacksonLayout.java:344)
	at org.apache.logging.log4j.core.layout.JsonLayout.toSerializable(JsonLayout.java:292)
	at org.apache.logging.log4j.core.layout.AbstractJacksonLayout.toSerializable(AbstractJacksonLayout.java:292)
	at org.apache.logging.log4j.core.layout.JsonLayout.toSerializable(JsonLayout.java:70)
	at org.apache.logging.log4j.core.layout.AbstractJacksonLayout.toSerializable(AbstractJacksonLayout.java:52)
	at org.apache.logging.log4j.core.layout.AbstractStringLayout.toByteArray(AbstractStringLayout.java:282)
	at org.apache.logging.log4j.core.layout.AbstractLayout.encode(AbstractLayout.java:209)
	at org.apache.logging.log4j.core.layout.AbstractLayout.encode(AbstractLayout.java:37)
	at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.directEncodeEvent(AbstractOutputStreamAppender.java:197)
	at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.tryAppend(AbstractOutputStreamAppender.java:190)
	at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:181)
	at org.apache.logging.log4j.core.config.AppenderControl.tryCallAppender(AppenderControl.java:161)
	at org.apache.logging.log4j.core.config.AppenderControl.callAppender0(AppenderControl.java:134)
	at org.apache.logging.log4j.core.config.AppenderControl.callAppenderPreventRecursion(AppenderControl.java:125)
	at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:89)
	at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:542)
	at org.apache.logging.log4j.core.config.LoggerConfig.processLogEvent(LoggerConfig.java:500)
	at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:483)
	at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:417)
	at org.apache.logging.log4j.core.config.AwaitCompletionReliabilityStrategy.log(AwaitCompletionReliabilityStrategy.java:82)
	at org.apache.logging.log4j.core.Logger.log(Logger.java:161)
	at org.apache.logging.log4j.spi.AbstractLogger.tryLogMessage(AbstractLogger.java:2205)
	at org.apache.logging.log4j.spi.AbstractLogger.logMessageTrackRecursion(AbstractLogger.java:2159)
	at org.apache.logging.log4j.spi.AbstractLogger.logMessageSafely(AbstractLogger.java:2142)
	at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:2034)
	at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:1899)
	at org.apache.logging.log4j.spi.AbstractLogger.warn(AbstractLogger.java:2784)
	at org.logstash.log.LoggerExt.rubyWarn(LoggerExt.java:107)
	at usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.RUBY$block$config_init$5(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:126)
	at org.jruby.runtime.CompiledIRBlockBody.yieldDirect(CompiledIRBlockBody.java:151)
	at org.jruby.runtime.IRBlockBody.yieldSpecificMultiArgsCommon(IRBlockBody.java:110)
	at org.jruby.runtime.IRBlockBody.yieldSpecific(IRBlockBody.java:122)
	at org.jruby.runtime.Block.yieldSpecific(Block.java:175)
	at org.jruby.RubyHash$10.visit(RubyHash.java:1605)
	at org.jruby.RubyHash$10.visit(RubyHash.java:1599)
	at org.jruby.RubyHash.visitLimited(RubyHash.java:759)
	at org.jruby.RubyHash.visitAll(RubyHash.java:744)
	at org.jruby.RubyHash.iteratorVisitAll(RubyHash.java:1563)
	at org.jruby.RubyHash.each_pairCommon(RubyHash.java:1594)
	at org.jruby.RubyHash.each(RubyHash.java:1587)
	at usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.RUBY$method$config_init$0(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:121)
	at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:165)
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:185)
	at org.jruby.ir.targets.indy.InvokeSite.failf(InvokeSite.java:404)
	at usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.RUBY$method$initialize$0(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:141)
	at usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.RUBY$method$initialize$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:139)
	at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:139)
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:112)
	at org.jruby.ir.runtime.IRRuntimeHelpers.unresolvedSuper(IRRuntimeHelpers.java:1427)
	at org.jruby.ir.runtime.IRRuntimeHelpers.unresolvedSuperSplatArgs(IRRuntimeHelpers.java:1404)
	at org.jruby.ir.targets.indy.UnresolvedSuperInvokeSite.invoke(UnresolvedSuperInvokeSite.java:22)
	at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_mixin_minus_ecs_compatibility_support_minus_1_dot_3_dot_0_minus_java.lib.logstash.plugin_mixins.ecs_compatibility_support.selector.RUBY$block$initialize$0(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-mixin-ecs_compatibility_support-1.3.0-java/lib/logstash/plugin_mixins/ecs_compatibility_support/selector.rb:61)
	at org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:141)
	at org.jruby.runtime.MixedModeIRBlockBody.callDirect(MixedModeIRBlockBody.java:103)
	at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:64)
	at org.jruby.runtime.Block.call(Block.java:147)
	at org.jruby.RubyProc.call(RubyProc.java:373)
	at org.jruby.internal.runtime.methods.ProcMethod.call(ProcMethod.java:66)
	at org.jruby.runtime.Helpers.invokeSuper(Helpers.java:754)
	at org.jruby.runtime.Helpers.invokeSuper(Helpers.java:724)
	at org.logstash.plugins.factory.ContextualizerExt.initialize(ContextualizerExt.java:97)
	at org.logstash.plugins.factory.ContextualizerExt$INVOKER$s$0$0$initialize.call(ContextualizerExt$INVOKER$s$0$0$initialize.gen)
	at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:452)
	at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:92)
	at org.jruby.RubyClass.newInstance(RubyClass.java:931)
	at org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)
	at org.jruby.RubyClass.finvokeWithRefinements(RubyClass.java:522)
	at org.jruby.RubyClass.finvoke(RubyClass.java:510)
	at org.jruby.runtime.Helpers.invoke(Helpers.java:644)
	at org.jruby.RubyBasicObject.callMethod(RubyBasicObject.java:386)
	at org.logstash.plugins.factory.ContextualizerExt.initializePlugin(ContextualizerExt.java:80)
	at org.logstash.plugins.factory.ContextualizerExt.initializePlugin(ContextualizerExt.java:53)
	at org.logstash.plugins.factory.PluginFactoryExt.filterDelegator(PluginFactoryExt.java:73)
	at org.logstash.plugins.factory.PluginFactoryExt.plugin(PluginFactoryExt.java:250)
	at org.logstash.plugins.factory.PluginFactoryExt.buildFilter(PluginFactoryExt.java:147)
	at org.logstash.config.ir.CompiledPipeline.setupFilters(CompiledPipeline.java:187)
	at org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:117)
	at org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)
	at org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)
	at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)
	at org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)
	at org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)
	at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)
	at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)
	at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)
	at org.jruby.RubyClass.newInstance(RubyClass.java:931)
	at org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)
	at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)
	at org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)
	at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)
	at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)
	at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)
	at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)
	at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)
	at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)
	at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)
	at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)
	at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)
	at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)
	at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)
	at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)
	at org.jruby.runtime.Block.call(Block.java:143)
	at org.jruby.RubyProc.call(RubyProc.java:352)
	at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)
	at java.base/java.lang.Thread.run(Thread.java:833)

Why do these messages contain all these nested objects, and how do I get rid of them?
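
The only partial mitigation I can think of so far applies to the second example: since that WARN comes from the translate filter's deprecated `field` option, switching the filter to the `source` option should stop the deprecation warning (and its huge plugin payload) from being logged at all. Alternatively, the logger could be silenced in log4j2.properties; a sketch, where translate_deprecation is just a placeholder key I made up:

logger.translate_deprecation.name = logstash.filters.translate
logger.translate_deprecation.level = error

That does nothing for the first example from logstash.agent, though, which is logged at plain INFO, so I would still like to understand why these nested objects end up in the log events in the first place.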
