Pattern Creation for Filter

42 32 45 41 31 44 43 36 39 37 38 43 B2EB1DC6978C

[16/07/19][08:28:14][EzAHC3421][7632]IN [0] <200>
[16/07/19][08:28:15][EzAHC3421][7632]IN [1]
[16/07/19][08:28:15][EzAHC3421][7632]Value [000000268000]
[16/07/19][08:28:15][EzAHC3421][7632]IN [1] <0000000020000000>
[16/07/19][08:28:15][EzAHC3421][7632]IN [2] <16>
[16/07/19][08:28:15][EzAHC3421][7632]IN [2] <4092630000021310>
[16/07/19][08:28:15][EzAHC3421][7632]IN [3] <100000>
[16/07/19][08:28:15][EzAHC3421][7632]IN [4] <000000000000>
[16/07/19][08:28:15][EzAHC3421][7632]IN [5] <56289>
[16/07/19][08:28:15][EzAHC3421][7632]IN [6]
[16/07/19][08:28:15][EzAHC3421][7632]TempAmount[000000000000]
[16/07/19][08:28:15][EzAHC3421][7632]Error in Reading...
[16/07/19][08:28:15][EzAHC3421][7632]The Transaction amount before sending [ 2680.00 ] RefNum[ 2993242251568 ]
[16/07/19][08:28:15][EzAHC3421][7632]Message Sucessfully Written to Channel AChannel
[16/07/19][08:28:15][EzAHC3421][7632]CBMachine Waiting for Request to be Serviced
[17/07/19][00:30:33][EzAHC3421][7632]Message of size[ 12928 ] Received from Channel[ EzAHC3421 ]
[17/07/19][00:30:33][EzAHC3421][7632]MsgType = [210]

How can I produce output like the one below from the above log?

If [3] = <100000>, then take the data from the 6 lines before it and the 3 lines after it.

timestamp: 16/07/19 08:28:14
Field 1: 200
Field 2: B038440000E04000
Field 3: 2680.00
Field 4: 0000000020000000
Field 5: 16
Field 6: 4092630000021310
Field 7: 880000
Field 8: 000000000000
Field 9: 56289
Field 10: NewYork

Look at example 5 in the aggregate filter documentation. Note that some of the fields you say you want do not appear in the data you posted.

Is it possible to provide the filter for the above log? I am actually trying to learn filter patterns, and a real-world example would help me understand better.

This should get you started:

    dissect { mapping => { "message" => "[%{[@metadata][timestamp]}][%{+[@metadata][timestamp]}][%{correlationId}][%{someNumber}]%{[@metadata][restOfLine]}" } }
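    # The rest of the line is either an "IN [n] <value>" pair or the transaction amount line.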
    grok {
        match => {
            "[@metadata][restOfLine]" => [
                "IN \[%{NUMBER:key}\] <%{DATA:value}>",
                "The Transaction amount before sending \[ %{NUMBER:amount:float} \] RefNum"
            ]
        }
    }

    date { match => [ "[@metadata][timestamp]", "dd/MM/YY']['HH:mm:ss" ] }

    aggregate {
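        # Note: the aggregate filter only works reliably with a single pipeline
        # worker (pipeline.workers: 1 / -w 1); otherwise events sharing a
        # task_id may be processed out of order.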
        task_id => "%{correlationId}"
        timeout_task_id_field => "eventId"
        inactivity_timeout => 10
        push_map_as_event_on_timeout => true
        code => '
            unless map["@timestamp"]
                map["@timestamp"] = event.get("@timestamp")
            end

            a = event.get("amount")
            if a
                map["field3"] = a
            end

            k = event.get("key").to_i; v = event.get("value")
            if k and v
                case k
                when 0
                    map["field1"] = v
                when 1
                    map["field4"] = v
                when 2
                    map["field6"] = v
                end
            end

            # Drop the individual lines and just push the aggregate
            event.cancel
        '
    }
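
One gap to be aware of: the sample log has two IN [2] lines (16 and 4092630000021310) and your desired output keeps both of them (Field 5 and Field 6), but the starter code maps key 2 to a single field, so the second value overwrites the first. A minimal sketch of one way to handle that, assuming the first IN [2] line for a given correlationId is always Field 5 and the second is always Field 6, would be to replace the case statement with something like:

    case k
    when 0
        map["field1"] = v
    when 1
        map["field4"] = v
    when 2
        # Assumption: the first IN [2] line per correlationId is Field 5 and
        # the second is Field 6, matching the order in the sample log.
        if map["field5"]
            map["field6"] = v
        else
            map["field5"] = v
        end
    end

Fields such as Field 7 (880000) and Field 10 (NewYork) never appear in the log lines you posted, so you would need additional grok patterns for whichever lines actually carry them.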

Thanks a lot, Badger. I will start with the filter you provided and will post again if I need any further help.
