After adding the prune filter, logs are not going to Elasticsearch

Hi Team,
I am learning the ELK stack and doing a POC for my project. I am applying a KV filter to sample integration logs from my project, and I noticed a lot of extra fields in the result, so I applied a prune filter and whitelisted certain fields. I can see the logs being printed on the Logstash server, but they are not reaching Elasticsearch. If I remove the prune filter, they do reach Elasticsearch. Please advise how to debug this further.

filter {
  kv {
    field_split => "{},?\[\]"
    transform_key => "capitalize"
    transform_value => "capitalize"
    trim_key => "\s"
    trim_value => "\s"
    include_brackets => false
  }
  prune {
    whitelist_names => [
      "App_version", "Correlation_id", "Env", "Flow_name", "host",
      "Instance_id", "log_level", "log_thread", "log_timestamp",
      "message", "patient_id", "status_code", "type", "detail"
    ]
  }
}
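One thing worth checking here, as a hedged guess: prune's whitelist applies to every field on the event, including `@timestamp` and `@version`, and the output's date-math index name `mule-%{+YYYY.MM.dd}` needs `@timestamp` to resolve. If prune is dropping `@timestamp`, stdout would still print the events while the elasticsearch output fails. A minimal sketch, assuming that is the cause, is to whitelist those fields too:

```
prune {
  whitelist_names => [
    "App_version", "Correlation_id", "Env", "Flow_name", "host",
    "Instance_id", "log_level", "log_thread", "log_timestamp",
    "message", "patient_id", "status_code", "type", "detail",
    # keep Logstash's own metadata fields so the date-based
    # index name in the elasticsearch output can still resolve
    "@timestamp", "@version"
  ]
}
```

Checking the `rubydebug` output for a missing `@timestamp` field would confirm whether this is what is happening.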

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "mule-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
  stdout { codec => rubydebug }
}

I also need two more suggestions:

  1. I am also trying to use a grok filter on the initial log line to extract the log-level fields (time and log type) and send the remaining text to the KV filter. Is there any reference for this? Please share it if so. This is what I have tried, but I am getting a _grokparsefailure. I have passed msgbody to the kv filter with the source option.

     grok {
       match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:loglevel}\s+%{GREEDYDATA:msgbody}" }
       overwrite => [ "msgbody" ]
     }
    
  2. The sample logs contain a message field, as shown below. When the data reaches Kibana, I see two message fields: one with the full log line and one with the correct message (highlighted). Will mutate work for this case? Is there any way to rename the full-log field to something else?

[2020-02-10 11:20:07.172] INFO Mule.api [[MuleRuntime].cpuLight.04: [main-api-test].api-main.CPU_LITE @256c5cf5: [main-api-test].main-api-main/processors/0/processors/0.CPU_LITE @378f34b0]: event:00000003 {app_name=main-api-main, app_version=v1, env=Test, timestamp=2020-02-10T11:20:07.172Z, log={correlation_id=00000003, patient_id=12345678, instance_id=hospital, **message**=Start of System API, flow_name=main-api-main}}
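A possible sketch covering both points, assuming the sample line above is representative. For point 1, the timestamp sits inside literal square brackets, so `%{TIMESTAMP_ISO8601}` followed immediately by `\s+` cannot match (the closing `]` gets in the way); escaping the brackets in the pattern fixes the _grokparsefailure. For point 2, renaming the original full-log `message` field with mutate before the kv filter keeps it from colliding with the extracted message (the name `full_log` is just an illustrative choice):

```
filter {
  # Match the leading "[timestamp] LEVEL" explicitly; the square
  # brackets are literal characters and must be escaped.
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:log_timestamp}\]\s+%{LOGLEVEL:log_level}\s+%{GREEDYDATA:msgbody}" }
  }

  # Rename the full log line so only the extracted message field
  # remains under a message-like name in Kibana.
  mutate {
    rename => { "message" => "full_log" }
  }

  kv {
    source => "msgbody"
    # ... existing kv options here ...
  }
}
```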

Thanks!

Regards,
Vignesh G

Hi All,

I have fixed the prune issue. Can someone help me add the two fields in grok and convert the timestamp to a proper time format?

grok {
  match => { "message" => "%{LOGLEVEL:log_level}\s+\[%{TIMESTAMP_ISO8601:Log_timestamp}\]\s+%{GREEDYDATA:msgbody}" }
  add_field => {
    "%{log_level}" => "%{log_level}"
    "%{Log_timestamp}" => "%{Log_timestamp}"
  }
  overwrite => [ "msgbody" ]
}
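A sketch of how this could be wired, assuming the bracketed-timestamp layout from the sample log. Grok's named captures (`log_level`, `Log_timestamp`) already become event fields on their own, so no separate add_field block is needed; a date filter then parses the captured string into a real time value:

```
filter {
  # The named captures become event fields automatically,
  # so add_field is not required here.
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:Log_timestamp}\]\s+%{LOGLEVEL:log_level}\s+%{GREEDYDATA:msgbody}" }
    overwrite => [ "msgbody" ]
  }

  # Convert the captured string into a timestamp. By default the
  # parsed value replaces @timestamp; setting "target" keeps it
  # in its own field instead.
  date {
    match => [ "Log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "Log_timestamp"
  }
}
```

The format string `yyyy-MM-dd HH:mm:ss.SSS` matches a timestamp like `2020-02-10 11:20:07.172`; it would need adjusting if the real logs differ.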

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.