I have some logs, forwarded by Filebeat to Elasticsearch, that contain a JSON array:
{
"docs": [
{
"_source": {
...
"snapshot": [
{
"address": "127.0.0.1",
"interface": "lo",
"mac": "00:00:00:00:00:00"
},
{
"address": "::1",
"mac": "00:00:00:00:00:00",
"interface": "lo"
}
]
...
}
}
]
}
I need to use this data in Kibana, and since Kibana reportedly does not deal nicely with arrays (How to visualize JSON array in Kibana), I would like to store each item of the snapshot array as its own event in Elasticsearch, presumably looking something like the following.
{
"docs": [
{
"_source": {
...
"snapshot": {
"address": "127.0.0.1",
"interface": "lo",
"mac": "00:00:00:00:00:00"
}
...
}
},
{
"_source": {
...
"snapshot": {
"address": "::1",
"mac": "00:00:00:00:00:00",
"interface": "lo"
}
...
}
}
]
}
While this seems to be possible in Logstash using the split filter (https://stackoverflow.com/questions/31402997/how-to-split-a-json-array-inside-an-object), I can't find an obvious way to do the same thing with Elasticsearch ingest pipelines: the split processor there appears to operate only on strings.
Any help would be most appreciated.