Need help on logstash.conf for Wildfly logs

I converted the WildFly logs to a Logstash-readable format (JSON).
Here is what my log looks like:

{"@timestamp":"2018-06-22T18:36:55.138+0530","@message":"2018-06-22 18:36:55,138 INFO [com.vcl.util.LogTracer] (default task-2) - [T:293148959518596_w4qpV6V4Oq|L:null|LU:null] - 192.168.3.51 | /login.do | ? | ? | 5559\r","@source":"stdout","@source_host":"xxx-laptop","@fields":{"timestamp":1529672815138,"level":"INFO","line_number":0,"class":"org.jboss.stdio.AbstractLoggingWriter","method":"write"},"@tags":["UNKNOWN"]}

I am trying to parse these logs using the logstash.conf file below:

input {
  file {
    path => "D:/elkstack/QfundTestLogs/*.log"
    start_position => beginning
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\] - \[%{DATA:message}\] - %{IP:ip} %{GREEDYDATA:message}" }
  }

  json {
    source => "message"
    target => "parsedJson"
    remove_field => ["message"]
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash-%{+YYY.MM.dd}"
  }
}

It would be really great if someone could help me write a correct logstash.conf.

As a start, use codec => json in your file input.
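
A minimal sketch of what that could look like, reusing the path from the config above (with the codec decoding each line at the input stage, the separate json filter should no longer be needed):

input {
  file {
    path => "D:/elkstack/QfundTestLogs/*.log"
    start_position => beginning
    # Decode each line as JSON so @message, @fields, etc. become event fields
    codec => json
  }
}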

Many thanks @magnusbaeck. I tried using codec => json, but no luck.
I don't see anything in Kibana; I think my log is not getting parsed.

Use a stdout { codec => rubydebug } output while debugging your inputs and your filters. Once that looks okay you can re-enable your elasticsearch output.
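
For example, swap the output for something like this while testing:

output {
  # Print every event to the console in a human-readable form
  stdout { codec => rubydebug }
}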

@magnusbaeck I see the output below.

[2018-06-27T16:43:31,495][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"D:/elkstack/logstash-6.2.3/modules/fb_apache/configuration"}
[2018-06-27T16:43:31,515][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"D:/elkstack/logstash-6.2.3/modules/netflow/configuration"}
[2018-06-27T16:43:31,766][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-06-27T16:43:32,610][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-06-27T16:43:33,426][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-06-27T16:43:39,275][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-06-27T16:43:40,221][INFO ][logstash.pipeline        ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4a810897 sleep>"}
[2018-06-27T16:43:40,373][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
{
       "@tags" => [
    [0] "UNKNOWN"
],
        "path" => "D:/elkstack/QfundTestLogs/wildfly_logstash.log",
  "@timestamp" => 2018-06-22T13:06:49.717Z,
    "@message" => "2018-06-22 18:36:49,717 INFO  [com.vcl.util.LogTracer] (default task-2) -  [T:293148959518596_w4qpV6V4Oq|L:null|LU:null] - LookUp : select : Query Elapsed time0.02sec\r",
     "@source" => "stdout",
     "@fields" => {
    "line_number" => 0,
         "method" => "write",
      "timestamp" => 1529672809717,
          "class" => "org.jboss.stdio.AbstractLoggingWriter",
          "level" => "INFO"
},
    "@version" => "1",
        "tags" => [
    [0] "_grokparsefailure"
],
        "host" => "xxx-LAPTOP",
"@source_host" => "yyy-laptop"
}

Okay, so Logstash is at least able to read from the file. If Logstash has problems sending to Elasticsearch, it'll tell you about it in its own logfile. Cranking up the loglevel will give additional details.
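
For example, in Logstash 6.x you can raise the log level from the command line (use bin\logstash.bat on Windows):

bin/logstash -f logstash.conf --log.level=debug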

I just saw this in your configuration:

             index => "logstash-%{+YYY.MM.dd}"

There's a "Y" missing here.
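
With the missing Y added it becomes the standard daily index pattern:

index => "logstash-%{+YYYY.MM.dd}"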

I added the Y there. Thank you.

Now it is tagging the events with _grokparsefailure. How do I solve this? Do I have to make any changes to the pattern?

To start with, the grok filter you posted earlier attempts to parse the message field, but your events have no field with that name.
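
That is, the key in grok's match option has to be the name of a field that actually exists on the event. A minimal sketch (the capture names here are illustrative):

grok {
  # Parse @message (the field the events actually carry), not message
  match => { "@message" => "^%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:level}%{SPACE}%{GREEDYDATA:rest}" }
}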

I am a bit puzzled; are you talking about this by any chance? I have a field named message in my logs.

> I have a field named message in my logs.

Nope. As you can see from the stdout output, the field is named @message.

I changed the match in grok to this and am trying to parse the log:

match => { "@message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\] %{DATA:text} - \[%{DATA:kvpairs}\] - %{GREEDYDATA:message}" }

Here is a sample line I am trying to match:

{"@message":"2018-06-22 19:36:49,717 INFO [com.vcl.util.LogTracer] (default task-2) - [T:293148959518596_w4qpV6V4Oq|L:null|LU:null] - LookUp : select : No of rows returned:0\r","@source":"stdout","@source_host":"xxx-laptop","@fields":{"timestamp":1529672809717,"level":"INFO","line_number":0,"class":"org.jboss.stdio.AbstractLoggingWriter","method":"write"},"@tags":["UNKNOWN"]}
Am I missing anything with the pattern match? Can you please help me match the pattern?

I can't spot anything offhand. Build your expression gradually by starting with the very simplest expression (^%{TIMESTAMP_ISO8601:timestamp}). Eventually it'll break, and then you've narrowed things down.
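
Concretely, that could look like the sketch below, uncommenting one step at a time (capture names are illustrative; note that literal square brackets have to be escaped in grok):

filter {
  grok {
    # Step 1: just the timestamp
    match => { "@message" => "^%{TIMESTAMP_ISO8601:timestamp}" }
    # Step 2: timestamp plus log level (%{SPACE} tolerates the double blank after INFO)
    # match => { "@message" => "^%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:level}" }
    # Step 3: add the logger class in escaped brackets, and so on
    # match => { "@message" => "^%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:level}%{SPACE}\[%{DATA:class}\]" }
  }
}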

Many thanks @magnusbaeck 🙂
I followed your suggestion, broke the pattern down step by step, and succeeded in parsing the log.
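
For anyone hitting the same WildFly format: a pattern along these lines matches the sample @message shown at the top of the thread (a sketch, not necessarily the exact pattern the poster ended up with):

filter {
  grok {
    match => {
      "@message" => "^%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:level}%{SPACE}\[%{DATA:class}\] \(%{DATA:thread}\) -%{SPACE}\[%{DATA:tracing}\] - %{GREEDYDATA:logmessage}"
    }
  }
}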
