Need help on logstash.conf for Wildfly logs

(Chandana Priya Sathavalli) #1

I converted the Wildfly logs to a Logstash-readable format (JSON).
Here is what my log looks like:

{"@timestamp":"2018-06-22T18:36:55.138+0530","@message":"2018-06-22 18:36:55,138 INFO [com.vcl.util.LogTracer] (default task-2) - [T:293148959518596_w4qpV6V4Oq|L:null|LU:null] - | / | ? | ? | 5559\r","@source":"stdout","@source_host":"xxx-laptop","@fields":{"timestamp":1529672815138,"level":"INFO","line_number":0,"class":"org.jboss.stdio.AbstractLoggingWriter","method":"write"},"@tags":["UNKNOWN"]}

I am trying to parse these logs using the below logstash.conf file:

input {
  file {
    path => "D:/elkstack/QfundTestLogs/*.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\] - \[%{DATA:message}\] - %{IP:ip} %{GREEDYDATA:message}" }
  }
  json {
    source => "message"
    target => "parsedJson"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash-%{+YYY.MM.dd}"
  }
}

It would be really great if someone could help me write a correct logstash.conf.

(Magnus Bäck) #2

As a start, use codec => json in your file input.
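For example, the file input from the original config would then look something like this (a sketch reusing the path from the question, not a verified configuration):

```
input {
  file {
    path => "D:/elkstack/QfundTestLogs/*.log"
    start_position => "beginning"
    # Parse each line as JSON so @message, @fields, etc. become event fields
    codec => json
  }
}
```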

(Chandana Priya Sathavalli) #3

Many thanks @magnusbaeck. I tried using codec => json, but no luck.
I don't see anything in Kibana; I think my log is not getting parsed.

(Magnus Bäck) #4

Use a stdout { codec => rubydebug } while debugging your inputs and your filters. Once that looks okay you can reenable your elasticsearch output.
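A minimal debugging output section might look like this (a sketch; the elasticsearch block stays commented out until the events look right):

```
output {
  # Print every event to the console in a readable Ruby-hash format
  stdout { codec => rubydebug }

  # Re-enable once the printed events look correct:
  # elasticsearch {
  #   hosts => "localhost:9200"
  # }
}
```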

(Chandana Priya Sathavalli) #5

@magnusbaeck I see the below output.

[2018-06-27T16:43:31,495][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"D:/elkstack/logstash-6.2.3/modules/fb_apache/configuration"}
[2018-06-27T16:43:31,515][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"D:/elkstack/logstash-6.2.3/modules/netflow/configuration"}
[2018-06-27T16:43:31,766][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-06-27T16:43:32,610][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-06-27T16:43:33,426][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-06-27T16:43:39,275][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-06-27T16:43:40,221][INFO ][logstash.pipeline        ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4a810897 sleep>"}
[2018-06-27T16:43:40,373][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
{
       "@tags" => [
    [0] "UNKNOWN"
],
        "path" => "D:/elkstack/QfundTestLogs/wildfly_logstash.log",
  "@timestamp" => 2018-06-22T13:06:49.717Z,
    "@message" => "2018-06-22 18:36:49,717 INFO  [com.vcl.util.LogTracer] (default task-2) -  [T:293148959518596_w4qpV6V4Oq|L:null|LU:null] - LookUp : select : Query Elapsed time0.02sec\r",
     "@source" => "stdout",
     "@fields" => {
    "line_number" => 0,
         "method" => "write",
      "timestamp" => 1529672809717,
          "class" => "org.jboss.stdio.AbstractLoggingWriter",
          "level" => "INFO"
},
    "@version" => "1",
        "tags" => [
    [0] "_grokparsefailure"
],
        "host" => "xxx-LAPTOP",
"@source_host" => "yyy-laptop"
}

(Magnus Bäck) #6

Okay, so Logstash is at least able to read from the file. If Logstash has problems sending to Elasticsearch, it'll tell you about it in its own logfile. Cranking up the log level will give additional details.

I just saw this in your configuration:

             index => "logstash-%{+YYY.MM.dd}"

There's a "Y" missing here.
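The corrected index option would read:

```
index => "logstash-%{+YYYY.MM.dd}"
```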

(Chandana Priya Sathavalli) #7

I added the Y there. Thank you.

It is still saying _grokparsefailure. How do I solve this? Do I have to make any changes to the pattern?

(Magnus Bäck) #8

To start with, the grok filter you posted earlier attempts to parse the message field, but your events have no field with that name.

(Chandana Priya Sathavalli) #9

I am a bit puzzled; are you talking about this, by any chance? I have a field named message in my logs.

(Magnus Bäck) #10

I have a field named message in my logs.

Nope. As you can see from the stdout output the field is named @message.

(Chandana Priya Sathavalli) #11

match => { "@message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\] %{DATA:text} - \[%{DATA:kvpairs}\] - %{GREEDYDATA:message}" }

I changed the match in my grok filter to this and am trying to parse the log:


{"@message":"2018-06-22 19:36:49,717 INFO [com.vcl.util.LogTracer] (default task-2) - [T:293148959518596_w4qpV6V4Oq|L:null|LU:null] - LookUp : select : No of rows returned:0\r","@source":"stdout","@source_host":"xxx-laptop","@fields":{"timestamp":1529672809717,"level":"INFO","line_number":0,"class":"org.jboss.stdio.AbstractLoggingWriter","method":"write"},"@tags":["UNKNOWN"]}

Am I missing anything with the pattern match? Can you please help me match the pattern?

(Magnus Bäck) #12

I can't spot anything offhand. Build your expression gradually, starting with the very simplest expression (^%{TIMESTAMP_ISO8601:timestamp}). Eventually it'll break, and then you've narrowed things down.
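The gradual approach could start like this (a sketch against the @message field from this thread; each step adds one token to the pattern):

```
filter {
  grok {
    # Step 1: just the anchored timestamp
    match => { "@message" => "^%{TIMESTAMP_ISO8601:timestamp}" }
    # Step 2: add the log level
    #   "^%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level}"
    # Step 3: add the bracketed class, and so on
    #   "^%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\]"
  }
}
```

The step at which events start getting the _grokparsefailure tag points at the part of the pattern that doesn't match.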

(Chandana Priya Sathavalli) #13

Many thanks @magnusbaeck :slight_smile:
I followed your suggestion, broke the pattern down, and succeeded in parsing the log.

(system) #14

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.