Logstash config issue for a custom log file


(Kaushik Vankayala) #1

Hi There,

I am new to Logstash and am working on the grok pattern below. I have only been able to partially achieve my goal; please help with the issue described below.

My logs look like this:


[2018-06-28 11:49:41.257] INFO authentication [[appname].HTTP_Experience_Listener_Configuration.worker.01]: Basic Authentication Success For Partner.............
[2018-06-28 11:49:41.332] INFO Audit IN [[appname].HTTP_Experience_Listener_Configuration.worker.01]: {"transactionId": "e4f9d0oe99af7da2b639f5a8af001cbf","correlationId":"563c2ec1-76c9-11ej-af44-02cfdf6cebce","request": "POST /api/experience/member/product-status/cancel","timeStamp": "2018-06-28 17:19:41.303" ,"applicationName": "appname", "applicationVersion":"v1.0","environment":"dev", "sourceIP": "/137.226.212.183:20500", "partnerID": "partner"}
[2018-06-28 11:49:39.516] INFO com.mulesoft.agent.configuration.postconfigure.DefaultPostConfigureRunner [[appname].HTTP_Experience_Listener_Configuration.worker.01]: mule.agent.tracking.handler.cloudhub.source initialized successfully.
[2018-06-28 11:49:43.220] INFO Audit OUT [[appname].HTTP_Experience_Listener_Configuration.worker.01]: {"transactionId": "e4f9d0be9ba64djebp19p2a8af001cbf", "responseCode": "200" ,"timeStamp": "2018-06-28 17:19:43.220", "partnerID": "partner"}


Now, my Logstash conf looks like this:

input
{
	file
	{
		path => "C:\Users\M1045583\Downloads\mule-conf.log"
		start_position => "beginning"
	}
}

filter
{

	if "Audit IN" in [message] or "Audit OUT" in [message]
	{
		grok
		{
			match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:loglevel}" %{WORD:text}: %{(?<JSON>\{.*\}):json-data}" }
		}
		
		date 
		{
			match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss.SSS Z" , "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS" , "yyyy-MM-dd HH:mm:ss.S" ]
			target => "@timestamp"
			remove_field => "timestamp"
		}
	}
	else { drop { } }
}

output
{
	elasticsearch
	{
		hosts => ["localhost:9200"]
		index => "mule-log"
	}
}

Firstly, I only need the events that contain the string Audit IN or Audit OUT, which I have successfully achieved through the if/else condition.

The issue I am now facing is how to break the line apart in grok so that the timestamp, log level, and JSON fields end up in my Elasticsearch index and can be explored in Kibana's Discover.

Please help. I have given the lines of the conf file I am using above, but Elasticsearch is unable to read the log as required.

Also, when I delete the data from Elasticsearch via the Kibana console, I am then unable to push the logs again under the same index name. Please help!

Regards

Kaushik


#2

A hyphen in a field name does not work.

    grok {
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:loglevel} Audit (IN|OUT) %{DATA:text}: %{GREEDYDATA:jsondata}" }
    }
    json { source => "jsondata" }

If I understand this correctly, you might need this in your file input:

sincedb_path => "NUL"
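
For context, a minimal sketch of where that setting goes, reusing the path from the original post ("NUL" is the Windows null device, so Logstash forgets how far it has read and re-reads the file from the beginning on every run — useful while testing; forward slashes also work in Windows paths):

    input {
        file {
            path => "C:/Users/M1045583/Downloads/mule-conf.log"
            start_position => "beginning"
            # Discard read-position tracking so the file is re-read each run
            sincedb_path => "NUL"
        }
    }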

(Kaushik Vankayala) #3

@Badger: Thanks a lot for your response, but here are my findings:

I did not get the logs in the expected format as per the grok. Please see the Kibana screenshot below.

However, the sincedb_path setting did resolve the re-push issue when I started pushing the logs again with the same conf file after amendments. Also, a correction: the field value is as below;

sincedb_path => "/dev/null"


#4

I assumed that since you had

	path => "C:\Users\M1045583\Downloads\mule-conf.log"

you were running on Windows. If you are running on Linux it should indeed be "/dev/null".


(Kaushik Vankayala) #5

You are absolutely right. My bad! :man_facepalming: But for some reason it worked with a Linux path as well :stuck_out_tongue_winking_eye:

sincedb_path => "NUL" is the right one! (This helped me understand.)

However, any luck with the grok pattern?


#6

I don't know what to say about the _grokparsefailure. If I run

output { stdout { codec => rubydebug } }
input { generator { count => 1 message => '[2018-06-28 11:49:41.332] INFO Audit IN [[appname].HTTP_Experience_Listener_Configuration.worker.01]: {"transactionId": "e4f9d0oe99af7da2b639f5a8af001cbf","correlationId":"563c2ec1-76c9-11ej-af44-02cfdf6cebce","request": "POST /api/experience/member/product-status/cancel","timeStamp": "2018-06-28 17:19:41.303" ,"applicationName": "appname", "applicationVersion":"v1.0","environment":"dev", "sourceIP": "/137.226.212.183:20500", "partnerID": "partner"}' } }
filter {
    if "Audit IN" in [message] or "Audit OUT" in [message] {
        grok {
              match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:loglevel} Audit (IN|OUT) %{DATA:text}: %{GREEDYDATA:jsondata}" }
        }
    }
}   

then it groks it just fine:

  "jsondata" => "{\"transactionId\": \"e4f9d0oe99af7da2b639f5a8af001cbf\",\"correlationId\":\"563c2ec1-76c9-11ej-af44-02cfdf6cebce\",\"request\": \"POST /api/experience/member/product-status/cancel\",\"timeStamp\": \"2018-06-28 17:19:41.303\" ,\"applicationName\": \"appname\", \"applicationVersion\":\"v1.0\",\"environment\":\"dev\", \"sourceIP\": \"/137.226.212.183:20500\", \"partnerID\": \"partner\"}",
  "loglevel" => "INFO",
      "text" => "[[appname].HTTP_Experience_Listener_Configuration.worker.01]",

(Kaushik Vankayala) #7

@Badger : You are amazing :star_struck: . It worked!!! :ok_hand:

(The _grokparsefailure occurred because the gap between the log level and the word Audit was actually 4 spaces rather than 1.)
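
For reference, a sketch of the same grok with the literal space replaced by \s+, which tolerates any run of whitespace between the log level and the word Audit (assuming nothing else in the line format changes):

    grok {
        # \s+ matches one or more spaces, so 1 or 4 spaces both work
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:loglevel}\s+Audit (IN|OUT) %{DATA:text}: %{GREEDYDATA:jsondata}" }
    }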

I have one more request to ask;

In the jsondata, how can I strip the leading "/" and the ":portnumber" from sourceIP: /13.229.49.219:26270 so that I can map it to a geo_point?


#8
dissect { mapping => { "sourceIP" => "/%{ip}:%{port}" } } 
geoip { source => "ip" }
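
Putting the pieces together, the complete filter for the Audit events might look like the sketch below (assuming sourceIP is created by running the json filter over the jsondata field, as in the earlier posts):

    filter {
        if "Audit IN" in [message] or "Audit OUT" in [message] {
            grok {
                match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:loglevel}\s+Audit (IN|OUT) %{DATA:text}: %{GREEDYDATA:jsondata}" }
            }
            # Parse the JSON payload into top-level fields, including sourceIP
            json { source => "jsondata" }
            # Split "/13.229.49.219:26270" into separate ip and port fields
            dissect { mapping => { "sourceIP" => "/%{ip}:%{port}" } }
            # Look up the geo location of the extracted ip
            geoip { source => "ip" }
        } else {
            drop { }
        }
    }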

(Kaushik Vankayala) #9

@Badger: Thank you very much! All my use cases are now achieved. It was awesome getting help here. I will keep posting as new topics come up! :innocent::+1:


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.