Hi,
I am consuming data from Kafka with INPUT configuration:
input {
  kafka {
    codec => avro {
      schema_uri => "/etc/logstash/avro.avsc"
    }
    key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
  }
}
The Avro schema is:
{
  "namespace": "avro_data",
  "type": "record",
  "name": "event",
  "fields":
  [
    {"name": "timestamp", "type": "long"},
    {"name": "src", "type": "string"},
    {"name": "host_ip", "type": "string"},
    {"name": "rawdata", "type": "bytes"}
  ]
}
My problem is that the data in "rawdata" is nested and contains a lot of fields, and in Kibana I see all of it inside that single "rawdata" field.
Example content of the "rawdata" field:
{"timestamp":"Mon May 24 12:34:23 UTC 2021","src":"avro_syslog","hostT01":"device1","host_ip":"device1","tag":"hello","type":"syslog","source":"syslog","msg":"%Viptela-device1-ftmd-6-INFO-1400002: bfd-state-change severity-level:major host-name:device1 system-ip:10.2.0.14 src-ip:10.24.11.221 dst-ip:63.142.13.44 proto:ipsec src-port:12346 dst-port:12386 local-system-ip:10.2.0.14 local-color:custom1 remote-system-ip:10.0.0.6 remote-color:custom1 new-state:up deleted:false flap-reason:na","raw":"<190>FTMD[1285]: %Viptela-device1-ftmd-6-INFO-1400002: 2021-05-24 12:34:21 Notification: bfd-state-change severity-level:major host-name:device1 system-ip:10.2.0.14 src-ip:10.24.11.221 dst-ip:63.142.13.44 proto:ipsec src-port:12346 dst-port:12386 local-system-ip:10.2.0.14 local-color:custom1 remote-system-ip:10.0.0.6 remote-color:custom1 new-state:up deleted:false flap-reason:na"}
I want to see, for example, the "tag" value that is inside "rawdata" as a separate field in Kibana, not nested in "rawdata".
Is there a way to parse or split it?
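Would something like the json filter below be the right approach? This is only a sketch, assuming "rawdata" arrives as a JSON string after the avro codec decodes the message (the remove_field option is just illustrative):

filter {
  json {
    # parse the JSON string in "rawdata" into top-level fields such as "tag"
    source => "rawdata"
    # optionally drop the original blob once it has been parsed
    # remove_field => ["rawdata"]
  }
}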
Thanks