KV filter not working


(Shubham Munot) #1

config file:

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => {
        "received_at"   => "%{@timestamp}"
        "received_from" => "%{host}"
      }
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  if [message] =~ "^#" {
    drop {}
  }
  else {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{IPORHOST:serverip} %{WORD:method} %{URIPATH:stem} %{NOTSPACE:query} %{USERNAME:username} %{GREEDYDATA:referer} %{NUMBER:stat} %{NUMBER:scBytes} %{NUMBER:csBytes} %{NUMBER:totalTime}" }
    }
    mutate {
      convert => {
        "totalTime" => "float"
        "scBytes"   => "float"
        "csBytes"   => "float"
        "stat"      => "float"
      }
    }
    kv {
      source      => "stem"
      field_split => "/"
      prefix      => "stem_"
    }
  }
}

I want my uri_stem split into separate fields.
E.g. for /Services/UserPageSearchHelper.svc/GetPracticeConfigurationValue:
field 1: Services
field 2: UserPageSearchHelper.svc
field 3: GetPracticeConfigurationValue

LOG FILE:
[2017-05-24T12:49:37,074][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-05-24T12:49:37,081][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-05-24T12:49:37,241][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x5545205b URL:http://localhost:9200/>}
[2017-05-24T12:49:37,248][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x120074c URL://localhost:9200>]}
[2017-05-24T12:49:37,398][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-05-24T12:49:38,274][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2017-05-24T12:49:38,337][INFO ][logstash.pipeline ] Pipeline main started
[2017-05-24T12:49:38,485][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-05-24T12:49:42,323][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[http://localhost:9200/], :added=>[http://127.0.0.1:9200/]}}
[2017-05-24T12:49:42,327][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2017-05-24T12:49:42,340][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x549a2b58 URL:http://127.0.0.1:9200/>}

(Mark Walkom) #2

And what's not working?


(Christian Dahlqvist) #3

The kv filter is designed to parse a list of key-value pairs, e.g. key1=value1,key2=value2, which does not seem to be what you have in the field. You could use grok to separate out the different parts, or maybe even treat it as a CSV string with a '/' separator.
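Besides grok and csv, another option for a path like this is to split the field on "/" with the mutate filter. A sketch (not tested against this exact pipeline); two separate mutate blocks keep the gsub-before-split ordering explicit:

```
filter {
  # Strip the leading "/" first so the split doesn't produce an
  # empty first element.
  mutate {
    gsub => [ "stem", "^/", "" ]
  }
  # "stem" becomes an array, e.g.
  # ["Services", "UserPageSearchHelper.svc", "GetPracticeConfigurationValue"]
  mutate {
    split => { "stem" => "/" }
  }
}
```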


(Shubham Munot) #4

Yeah I figured that out later. Thanks.


(Shubham Munot) #5

If anyone is still looking for an answer:

grok {
  match => [
    "stem", "/(?<services>[^/]+)/(?<serviceCall>[^/]+)/(?<function>[^/]+)/(?<rest>[^/]+)",
    "stem", "/(?<services>[^/]+)/(?<serviceCall>[^/]+)/(?<function>[^/]+)",
    "stem", "/(?<services>[^/]+)/(?<serviceCall>[^/]+)"
  ]
}
</grok_replace_body>
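grok tries the patterns in order and stops at the first one that matches (break_on_match defaults to true), which is why the longest pattern comes first. For the example path from the question, the four-segment pattern fails (only three path segments) and the three-segment pattern matches, producing fields like these (illustrative):

```
# stem = "/Services/UserPageSearchHelper.svc/GetPracticeConfigurationValue"
# matched: /(?<services>[^/]+)/(?<serviceCall>[^/]+)/(?<function>[^/]+)
services    => "Services"
serviceCall => "UserPageSearchHelper.svc"
function    => "GetPracticeConfigurationValue"
```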

(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.