Filebeat: Split message field in Kibana into several fields

Do I need Logstash to transform the log message below, received in Kibana from Filebeat,

[2019-08-25T19:23:07.489Z] "GET /health HTTP/1.1" 200 - "-" "-" 0 15 22 22 "-" "kube-probe/1.13" "68e1cfbd-345f-45cd-9b86-2b4b17ab3ade" "172.30.35.252:8080" "127.0.0.1:8061" inbound|8080|http-management|kube-appname-app-service.default.svc.cluster.local - 172.30.35.252:8080 10.135.231.188:22451 -

into several columns like these?

HTTP METHOD|DIRECTION|PORT|OPERATION|SERVICE_NAME
200|inbound|8080|http-management|kube-appname-app-service.default.svc.cluster.local

I know this is possible with Logstash grok + filter, but I'm not sure whether it can be done with Filebeat alone.

Much appreciated!

Hi @O_K,

You could probably do it using Filebeat processors. There is, for example, one to decode CSV fields, introduced in Filebeat 7.2, that can split a string using a custom separator. The dissect processor can also be helpful for separating the rest of the elements.
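For instance, a dissect processor in filebeat.yml could pull out the pipe-separated block at the end of the line. This is only a sketch: the tokenizer must match your log format exactly, and all field names here (direction, port, operation, service_name, etc.) are illustrative, not standard:

```yaml
# Sketch of a filebeat.yml fragment — field names are hypothetical,
# and the tokenizer must match the full message format for dissect to succeed.
processors:
  - dissect:
      field: "message"
      target_prefix: "envoy"
      tokenizer: '[%{timestamp}] "%{method} %{path} %{http_version}" %{response_code} %{flags} "%{x_envoy_fwd}" "%{x_envoy_err}" %{bytes_received} %{bytes_sent} %{duration} %{upstream_time} "%{x_forwarded_for}" "%{user_agent}" "%{request_id}" "%{authority}" "%{upstream_host}" %{direction}|%{port}|%{operation}|%{service_name} %{rest}'
```

With something like this, the parsed values would arrive in Elasticsearch as separate fields (e.g. `envoy.response_code`, `envoy.service_name`) instead of one long message string.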

If you want to use grok, you can also use Elasticsearch ingest nodes instead of Logstash. They allow you to define pipelines similar to the ones you could define with Logstash.
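As a rough sketch, an ingest pipeline with a grok processor could extract just the pipe-separated fields. The pipeline name and field names below are made up for illustration, and the pattern only grabs the block the question asks about rather than the whole line:

```
PUT _ingest/pipeline/split-envoy-log
{
  "description": "Sketch: extract direction|port|operation|service_name from the message",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{WORD:direction}\\|%{NUMBER:port}\\|%{NOTSPACE:operation}\\|%{HOSTNAME:service_name}"
        ]
      }
    }
  ]
}
```

Filebeat can then be pointed at this pipeline via the `pipeline` option of its Elasticsearch output, so the split happens at index time without Logstash in between.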


Thank you @jsoriano,

Does that mean Filebeat could be enough for some aggregations? Or do we still have to use Logstash/Fluentd?

What kind of aggregations?

Let's say, for the data below, I'd like to get the count of HTTP 200 responses for a particular SERVICE_NAME.

HTTP METHOD|DIRECTION|PORT|OPERATION|SERVICE_NAME
200|inbound|8080|http-management|kube-appname-app-service.default.svc.cluster.local

Oh, you could use Kibana to explore the data collected by Filebeat, which includes aggregations. Aggregations are done at query time.
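For example, once the message is split into fields, a query-time aggregation could count 200 responses per service. This assumes field names like `response_code` and `service_name` from the earlier split (they are illustrative, and `service_name` would need to be a keyword-type field to aggregate on):

```
GET filebeat-*/_search
{
  "size": 0,
  "query": {
    "term": { "response_code": "200" }
  },
  "aggs": {
    "per_service": {
      "terms": { "field": "service_name" }
    }
  }
}
```

Kibana visualizations build essentially this kind of request for you, so no extra aggregation step is needed in the shipping pipeline itself.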

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.