If Else in logstash filter

Hey guys,
kindly help me solve this.

  • There are two types of messages that I am trying to parse:

    1. {"Code":"BOSIIF902","Message":"Backup stopped. Error during the backup :: SOAP-ERROR: P","time":1583470627,"userId":"*****************","businessUserId":"************","cloudId":4,"domainId":"603","additionalInfo":null} [] []
    
    2. plain message without any fields eg : "hello how are you"
    
  • So this was my config before:

input {
  file {
    path           => "/home/ubuntu/*"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "(?<jsonf>({.*}))" }
  }
  json {
    source => "jsonf"
  }
  mutate {
    remove_field => [ "message", "jsonf" ]
  }
}

output {
  amazon_es {
    hosts => ["*****************************"]
    region => "us-east-1"
    index => "test-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
  • So this was working fine for the first message, but for the second message it was tagging events with _grokparsefailure.

  • So the next thing I tried was putting in if conditions:

input {
  file {
    type           => "testlogs"
    path           => "/home/ubuntu/testlog.log"
    start_position => "beginning"
  }
}

filter {

  # starting if
  if [message] == "message" {

    # filter
    grok {
      match => { "message" => "(?<jsonf>({.*}))" }
    }
    json {
      source => "jsonf"
    }
    mutate {
      remove_field => [ "message", "jsonf" ]
    }

  }
}

# output logs to console and to elasticsearch
output {
  if [message] =~ "message" {
    amazon_es {
      hosts  => ["******************************"]
      region => "us-east-1"
      index  => "test-%{+YYYY.MM.dd}"
    }
  }
}
  • When I use this I didn't get any errors, but when I tried to push the logs no index was created.
  • I would request anyone to help me out with this; I'm totally not sure whether I'm using the conditions in the right place.

I would suggest that you use a json filter unconditionally, but tell it to delete the message field if it successfully parses it.

json { source => "message" remove_field => [ "message" ] }
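Spelled out as a full filter block, that suggestion would look something like this (a sketch; by default the json filter tags an unparseable event with _jsonparsefailure and leaves message intact, so plain-text lines pass through unharmed):

filter {
  json {
    source       => "message"
    remove_field => [ "message" ]
  }
}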

@Badger so you mean I should put the grok in conditions and not the json, right?

I see no reason to use a grok filter.

I am trying to parse this message:

{"Code":"BOSIIF902","Message":"Backup stopped. Error during the backup :: SOAP-ERROR: P","time":1583470627,"userId":"*****************","businessUserId":"************","cloudId":4,"domainId":"603","additionalInfo":null} [] []
  • If I want Code, Message, userId, businessUserId, cloudId as separate fields, I should use grok with json, right?

  • So without using grok I will get the whole message in a single field.

OK, I did not realize that there is additional text after the JSON. Try

filter {
    if [message] =~ /{.*}/ {
        grok { match => { "message" => "(?<[@metadata][json]>({.*}))" } }
        json { source => "[@metadata][json]" remove_field => [ "message" ] }
    }
}

Fields under [@metadata] exist in Logstash, but are not indexed into Elasticsearch. The remove_field only gets executed if the json filter successfully parses the message, so if there is a problem with the JSON you will be able to see what it is. For the plain text messages none of the filters will be applied, and they will just be sent to the elasticsearch output.
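As a side check, the core of that grok pattern, a greedy {.*} capture that grabs everything from the first { to the last }, can be exercised outside Logstash. A small Python sketch (Python is used here only for illustration; it is not part of the pipeline) against the sample line:

```python
import json
import re

# Sample line from the thread: JSON followed by the trailing "[] []" text
line = ('{"Code":"BOSIIF902","Message":"Backup stopped. Error during the backup :: SOAP-ERROR: P",'
        '"time":1583470627,"userId":"*****************","businessUserId":"************",'
        '"cloudId":4,"domainId":"603","additionalInfo":null} [] []')

# Greedy match: from the first "{" to the last "}", skipping the trailing "[] []"
m = re.search(r'\{.*\}', line)
assert m is not None

# The captured substring is valid JSON on its own
event = json.loads(m.group(0))
print(event["Code"])     # → BOSIIF902
print(event["cloudId"])  # → 4
```

Because the trailing text contains no closing brace, the greedy match stops exactly at the end of the JSON object, which is why extracting into a temporary field before the json filter works here.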

Thanks a lot @Badger this solved my problem