Split field data

Hello

We now have an ELK stack with Logstash, Elasticsearch, and Kibana.

The Logstash configuration is:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => [ "message", "%{COMMONAPACHELOG}" ]
  }
}

output {
  if [type] == "wineventlog" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "wineventlog-%{+YYYY-MM-dd}"
    }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
    }
  }
}

The index settings are the defaults.

Now we are getting data with Winlogbeat, and we want to split the "event_data.Binary" field shown in the Kibana table.

For example:

The original content of the event_data.Binary field is "000000000003".

And we want to split it into three fields in Kibana:

event_data.Binary1 0000

event_data.Binary2 0000

event_data.Binary3 0003

Could we use Logstash or Kibana to do this?

You can do it with a ruby filter in Logstash.
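A minimal sketch of such a ruby filter, assuming the value is always a 12-character string and lives at the nested field path [event_data][Binary] (adjust the path and chunk size to match your events):

```
filter {
  ruby {
    code => '
      bin = event.get("[event_data][Binary]")
      unless bin.nil?
        # Split into 4-character chunks: "000000000003" -> ["0000", "0000", "0003"]
        bin.scan(/.{4}/).each_with_index do |chunk, i|
          # Creates event_data.Binary1, event_data.Binary2, event_data.Binary3
          event.set("[event_data][Binary#{i + 1}]", chunk)
        end
      end
    '
  }
}
```

With the example value this yields event_data.Binary1 = 0000, event_data.Binary2 = 0000, and event_data.Binary3 = 0003. After the new fields are indexed, refresh the index pattern in Kibana so they show up in the table.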
