Choosing filter-kv or another filter for this type of logs

Hello everyone,

I have logs in docker-compose like this:

 2019-06-18 16:26:03.639 INFO  [kafka-producer-network-thread | testSrvProducer-1] [kafka.producer.impl.MessageProducerKafkaImpl] [] - delivered to topic=testTwoGateInQueue message='{"type":"BLA_BLA","sequence_id":"seq-1","data":{"version":0,"customer_id":"test-id-1","order_id":"order-id-1"}}' with offset=1 partition=0 key = kafka-test-key-1

What kind of filter do I need to use in Logstash?
logstash-filter-kv, logstash-filter-metrics, or grok?

I need to send to Elasticsearch logs that have the timestamp, the Kafka topic name, the type field, sequence_id, and the other fields inside the {}.

With 7.1.1 you can parse this using:

filter {
    dissect { mapping => { "message" => "%{[@metadata][ts]} %{+[@metadata][ts]} %{level->} [%{fieldA} | %{fieldB}] [%{fieldC}] [%{fieldD}] - %{restOfLine}" } }
    date { match => [ "[@metadata][ts]", "YYYY-MM-dd HH:mm:ss.SSS" ] }
    kv { source => "restOfLine" remove_field => "restOfLine" }
    mutate { gsub => [ "message", "\\\'", "" ] }
    json { source => "message" target => "fieldE" remove_field => "message" }
}

and you will get:

{
        "fieldB" => "testSrvProducer-1",
        "fieldD" => "",
     "partition" => "0",
         "topic" => "testTwoGateInQueue",
           "key" => "kafka-test-key-1",
        "fieldE" => {
               "data" => {
                "version" => 0,
               "order_id" => "order-id-1",
            "customer_id" => "test-id-1"
        },
               "type" => "BLA_BLA",
        "sequence_id" => "seq-1"
    },
         "level" => "INFO ",
    "@timestamp" => 2019-06-18T16:26:03.639Z,
        "fieldC" => "kafka.producer.impl.MessageProducerKafkaImpl",
        "offset" => "1",
        "fieldA" => "kafka-producer-network-thread"
}

The kv filter will skip junk like "delivered to" and "with". That may not work with older versions.
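If it helps to see how the stages fit together, here is a rough Python sketch of the same pipeline. The regexes are my own approximations of the dissect and kv behaviour (including kv skipping the non-key=value words), not what Logstash runs internally:

```python
import json
import re

# The sample log line from the question.
log = ("2019-06-18 16:26:03.639 INFO  [kafka-producer-network-thread | testSrvProducer-1] "
       "[kafka.producer.impl.MessageProducerKafkaImpl] [] - "
       "delivered to topic=testTwoGateInQueue message="
       "'{\"type\":\"BLA_BLA\",\"sequence_id\":\"seq-1\",\"data\":{\"version\":0,"
       "\"customer_id\":\"test-id-1\",\"order_id\":\"order-id-1\"}}' "
       "with offset=1 partition=0 key = kafka-test-key-1")

# Stand-in for the dissect mapping: timestamp, level, three bracketed
# sections, then the rest of the line.
head = re.match(
    r"(?P<ts>\S+ \S+) (?P<level>\S+)\s+\[(?P<fieldA>[^|]+) \| (?P<fieldB>[^\]]+)\] "
    r"\[(?P<fieldC>[^\]]*)\] \[(?P<fieldD>[^\]]*)\] - (?P<rest>.*)",
    log)
event = head.groupdict()
rest = event.pop("rest")

# Stand-in for the kv filter: pick up key=value pairs (allowing spaces
# around "=") and ignore the surrounding words like "delivered to" and "with".
for m in re.finditer(r"(\w+)\s*=\s*(\S+)", rest):
    event[m.group(1)] = m.group(2)

# Like the mutate/gsub + json steps: strip the single quotes around the
# embedded JSON and parse it into a nested field.
event["fieldE"] = json.loads(event.pop("message").strip("'"))
```

After this, `event` holds the same fields as the rubydebug output above (topic, offset, partition, key, plus the nested fieldE object).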

