How to change a message into multiple keys and values in a Logstash filter


(Robin Guo) #1

original data:

expected fields in Kibana (I also want to remove the message field):

agent: metricbeat
env: production
dc: fr4
pgpool_active: 20
pgpool_idle: 8
pgpool_waiting: 100
...

(Christian Dahlqvist) #2

What does your Logstash config look like? Are you using a json filter or codec?


(Robin Guo) #3

I just sent JSON-based data to Logstash.

#json data

./pgpool.sh >/tmp/test

cat /tmp/test
{"tags":"tcpbeat","agent":"metricbeat","env":"production","dc":"fr4","os":"Linux","service":"pgpool","beat.name:"robinguo-HP-Z210-Workstation","pgpool_active":20,"pgpool_idle":80,"pgpool_waiting":100}
170816173601 root@robinguo-HP-Z210-Workstation python # nc logstashtest01.tls.ad 5050 <  /tmp/test

#logstash pipeline
170816173806 root@ conf.d # pwd
/etc/logstash/conf.d

cat tcpbeat.conf
input {
  tcp {
    port => 5050
  }
}

output {
  file {
    path => "/tmp/logstash-5050"
  }
}

(Christian Dahlqvist) #4

Try adding a json filter to the config in order to parse the JSON object.


(Robin Guo) #5

Would you please walk me through an example of how to do this? I'm a newbie to Logstash.

Thanks


(Christian Dahlqvist) #6

Add the following (taken from the example in the docs I linked to) ahead of the output section and see how it works:

filter {
  json {
    source => "message"
  }
}
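In case it helps to see what that filter does: json { source => "message" } parses the string held in the message field and, with no target set, merges the resulting keys into the event root. A rough Python sketch of that behavior (the event dict here is a hypothetical stand-in for a Logstash event; field values are taken from this thread):

```python
import json

# Hypothetical stand-in for a Logstash event after the tcp input:
# the raw line received on the socket lands in the "message" field
# as a plain string.
event = {
    "host": "10.65.186.63",
    "message": '{"agent": "metricbeat", "pgpool_active": 20}',
}

# json { source => "message" } parses that string; with no "target"
# set, the resulting keys are merged into the event root.
event.update(json.loads(event["message"]))
```

Note that the json filter by itself leaves the original message field in place, which is why a separate step is needed if you want it removed.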

(Robin Guo) #7

Seems it's not working well.

#logstash for tcpbeat
input {
  tcp {
    port => 5050
  }
}

filter {
  json {
    source => "message"
  }
}


output {
  file {
    path => "/tmp/logstash-5050"
  }
}

The results:
{"@timestamp":"2017-08-16T10:37:02.669Z","port":46369,"@version":"1","host":"10.65.186.63","message":"{\"tags\":\"tcpbeat\",\"agent\":\"metricbeat\",\"env\":\"production\",\"dc\":\"fr4\",\"os\":\"Linux\",\"service\":\"pgpool\",\"beat.name:\"robinguo-HP-Z210-Workstation\",\"pgpool_active\":20,\"pgpool_idle\":80,\"pgpool_waiting\":100}","tags":["_jsonparsefailure"]}


(Robin Guo) #8

Hi @Christian_Dahlqvist,
Finally, it works. The problem was a badly formatted JSON string.
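To show what was wrong: the original line was missing the closing quote on the beat.name key ("beat.name: instead of "beat.name":), which any JSON parser rejects. A quick Python check using that same string from earlier in the thread:

```python
import json

# The string sent to Logstash earlier in the thread; note the missing
# closing quote after beat.name, which is what triggered the
# _jsonparsefailure tag.
bad = ('{"tags":"tcpbeat","agent":"metricbeat","env":"production","dc":"fr4",'
       '"os":"Linux","service":"pgpool","beat.name:"robinguo-HP-Z210-Workstation",'
       '"pgpool_active":20,"pgpool_idle":80,"pgpool_waiting":100}')

try:
    json.loads(bad)
    parsed = True
except json.JSONDecodeError:
    parsed = False  # the malformed key makes parsing fail

# Restoring the quote yields valid JSON:
good = bad.replace('"beat.name:"', '"beat.name":"')
doc = json.loads(good)
```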

One more request: I want to remove an unnecessary key in the Logstash pipeline. That is, I want to remove the message {} or content {} wrapper but still keep the fields inside it. Is there a way to do that?

#logstash pipeline conf

input {
  tcp {
    port => 5050
  }
}

filter {
  json {
    source => "message"
    target => "content"
  }
  mutate {
    remove_field => ["message"]
  }
}

output {
  file {
    path => "/tmp/logstash-5050"
  }
}

The original output:

{"@timestamp":"2017-08-18T04:07:44.125Z","port":36600,"@version":"1","host":"10.65.186.63","content":{"pgpool_waiting":100,"agent":"metricbeat","beat_name":"robinguo-HP-Z210-Workstation","os":"Linux","service":"pgpool","pgpool_idle":80,"env":"production","dc":"fr4","pgpool_active":20}}

The result I need to output:

{"@timestamp":"2017-08-18T04:07:44.125Z","port":36600,"@version":"1","host":"10.65.186.63","pgpool_waiting":100,"agent":"metricbeat","beat_name":"robinguo-HP-Z210-Workstation","os":"Linux","service":"pgpool","pgpool_idle":80,"env":"production","dc":"fr4","pgpool_active":20}


(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.