GELF with log4j 1.2

Hi all,

First post on this forum, so hi! And long life to Elastic! :vulcan_salute:

I'm here because I need some advice about GELF and log4j.

Until now, we used the "log4j" input to receive data from apps using log4j 1.2 with MDC.
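
(For context, here is roughly how our apps set MDC fields; the class and field names below are only illustrative:)

import org.apache.log4j.Logger;
import org.apache.log4j.MDC;

public class OrderService {
    private static final Logger log = Logger.getLogger(OrderService.class);

    public void handle(String orderId) {
        // MDC entries ride along with every log event emitted on this thread,
        // and the GELF appender ships them as additional fields
        MDC.put("orderId", orderId);
        log.info("processing order");
        MDC.remove("orderId");
    }
}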

We want to upgrade the ELK stack to the latest version, but the log4j input plugin is now deprecated, and no logs get parsed by it anymore.

Since we can't use Filebeat, because it doesn't handle the MDC fields (and we don't want more disk I/O), we are trying GELF.
It works, but I think we are losing logs.

Here is the config:
INPUT:

input {
  gelf {
    host => "10.x.x.x"
    type => "java_app"
    port_tcp => "4560"
    use_tcp => "true"
    use_udp => "false"
  }
}

OUTPUT:

output {
  if [type] == "java_app" {
    elasticsearch {
      index => "java_app-%{+YYYY.MM.dd}"
      hosts => [ "host1", "host2" ]
      manage_template => false
    }
  }
}

No filter for now; I'm keeping things as simple as possible for the test.

And the last piece is the log4j appender:

<appender name="SOCKET" class="biz.paluch.logging.gelf.log4j.GelfLogAppender">
  <param name="Threshold" value="DEBUG" />
  <param name="Host" value="tcp:logstash.hostname" />
  <param name="Port" value="4560" />
  <param name="ExtractStackTrace" value="true" />
  <param name="FilterStackTrace" value="true" />
  <param name="TimestampPattern" value="yyyy-MM-dd HH:mm:ss,SSS" />
  <param name="MaximumMessageSize" value="32768" />
  <param name="IncludeFullMdc" value="true" />
  <param name="additionalFields" value="application=${applicationName}" />
  <param name="additionalFieldTypes" value="application=String" />
  <GelfLayout compressionType="GZIP" compressionThreshold="1" />
</appender>
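
(Note: the ${applicationName} in additionalFields is resolved by log4j's config variable substitution from a JVM system property, e.g. the app is started with -DapplicationName=my-java-app; that property name is just our own convention.)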

As I said, everything works: I see logs parsed by Logstash and available in Kibana, but I get many WARN entries like this in Logstash:

[2018-02-22T20:42:12,196][WARN ][logstash.inputs.gelf ] Gelf (tcp): failed to parse a message. Skipping: eventIdt {:exception=>#<JSON::ParserError: unexpected token at 'eventIdt'>, :backtrace=>["json/ext/Parser.java:250:in `parse'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/json-1.8.6-java/lib/json/common.rb:155:in `parse'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-gelf-3.1.0/lib/logstash/inputs/gelf.rb:155:in `tcp_listener'"]}

And I don't know why, or how to solve it!
I tried setting the codec to "json" in the input, with no success.

Does somebody have an idea why?

I'll also take any advice or other experience, if there is a better way to send logs directly from an app using log4j 1.2 to Logstash.

Thanks for your help!
Mouglou

GELF doesn't seem to be the most used input plugin :wink:

I tried a few different ways to send the logs, with varying success.

1- Two Logstash servers.
I created a new Logstash server running version 2.4, listening with the log4j input plugin, then forwarding with the tcp output plugin and the codec set to "json" to the Logstash 5.6 server, which listens with the tcp input plugin and the "json" codec and then sends to Elasticsearch.

No success. The flow is OK, I can see the packets arrive on the Logstash 5.6 server (tcpdump), but nothing happens: nothing is sent on to Elasticsearch, and there are no logs from Logstash, even in debug.

2- Logstash 2.4 only.
I checked the compatibility matrix to make sure it can work, then ran only the Logstash 2.4 version, with the log4j input plugin sending directly to Elasticsearch 5.6.
It works, logs are available. So it could be an option, but not a good one: we lose the latest Logstash features, such as X-Pack and the newest options...
And the matrix shows that the 6.x version of Elasticsearch is not compatible with Logstash 2.4...

Is nobody sending logs from Java with log4j 1.2 to a 5.x or 6.x version of Elasticsearch?

I hope somebody can give us an idea of how to do that! Without using Filebeat! Unless MDC could be injected :wink:

Mouglou

Hmm, it's getting better!

I added the json codec to the gelf input plugin, like this:

input {
  gelf {
    type => "java_app"
    port => "4560"
    use_tcp => "true"
    use_udp => "false"
    codec => json
  }
}

No more warnings in the Logstash log, and the flow seems OK.

But the MDC values are not all parsed by Logstash.

I followed the official documentation to configure the appender:
http://logging.paluch.biz/examples/log4j-1.2.x-json.html

Any idea why the MDC fields are not all parsed?
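
While digging, I noticed the gelf input exposes two options that affect field naming; here is a sketch of what I plan to test (the values shown are the plugin defaults, untested on my side):

input {
  gelf {
    port => "4560"
    use_tcp => "true"
    use_udp => "false"
    codec => json
    remap => true                      # move the GELF (short_)message into "message"
    strip_leading_underscore => true   # "_orderId" arrives as "orderId"
  }
}

GELF sends additional fields with a leading underscore on the wire, so the second option decides whether an MDC entry shows up as "_orderId" or "orderId" in the event.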

Any idea, too, why my configuration with two Logstash servers is not working? If GELF is not parsing all the MDC fields, that setup could be an option to keep the log flow going...

Mouglou

It works, for both of my solutions.

First, with GELF. The solution is to set the codec to "json", and logs now come in without warnings. But the MDC fields are not all effective, and I can't find out why; they don't have the names declared in the code... So it works, but not as it should.

Second, with the two Logstash servers. I set the first one up with version 2.4, with log4j as input and tcp as output, and set the codec to "json_lines".
The second Logstash, version 5.6, has tcp as input with the "json_lines" codec, and elasticsearch as output. Everything works fine, and all the MDC fields come through with the right names...
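
For reference, a minimal sketch of the two pipelines (the hostname and the inter-Logstash port are placeholders):

# Logstash 2.4 (receives from the app, relays to 5.6)
input {
  log4j {
    mode => "server"
    port => 4560
  }
}
output {
  tcp {
    host => "logstash56.hostname"
    port => 5000
    codec => json_lines
  }
}

# Logstash 5.6 (receives the relay, indexes)
input {
  tcp {
    port => 5000
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => [ "host1", "host2" ]
    index => "java_app-%{+YYYY.MM.dd}"
    manage_template => false
  }
}

My understanding is that json_lines delimits each event with a newline, which is what a raw TCP stream needs in order to split events; that is probably why the plain "json" codec let nothing through in my first try.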

This could be the solution for as long as we run with log4j 1.2...

I will mark the post as solved, but if somebody has a better solution for a Java app with log4j 1.2 and Logstash 5.x or 6.x, it would be very useful!

Mouglou
