Logstash: POS Transaction Logs (TLogs) to JSON

Hi everyone,
Our IBM 4690 POS systems generate Transaction Logs (TLogs), and I am receiving raw log data like the sample below:

"::p:�:�:�:�:U:!a"
":p::D::�::�::�::!a:U: P:�:�:#:�:�:",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":�!8qg4:�:�P:�::�",":�!8qg4:�:�P:�:�F:�",":�!8qg4:�:�:�::d","::",":0::",":",":�:�:�:�:�",":::::",":0:a:eaeG^         TAX INVOICE ABN 79 005 950 548         
e!e ",":0:a:/
e!e ",":0:a:eG    MANAGER:ABC XYZ PH:(dd) dddd dddd
e!e ",":0:a:ea01/01/18  10:49 101 SALES       dddd dddd ddd
e!e ",":0:a:ea
e!e ",":0:a: %FEAR W/DEAD S1                               
e!e ",":0:a:          2 @ 1.00    9321337167343 P1   2.00 P
e!e ",":0:a: %FEAR W/DEAD S1      9321337167343 P3   1.50 P
e!e ",":0:a:                                            
e!e ",":0:a:eGeaeWehTOTAL ITEMS = 3

e!e ",":0:a:eGeaeWeh01/01/18 10:49
e!e ",":0:a:eGeaeWeh
e!e ",":0:a:eGeWeh                         TOTAL           3.50  
e!e ",":0:a:            CASH TENDER                  3.50  
e!e ",":0:a: %TAXABLE ITEMS - GST AMOUNT              .32
e!e ",":0:a:ea^
e!e ",":0:a:^ea
e!e ",":0:a:Hfh@eaw",":0:a:ka023456789345107191130101000350ea",":0:a:                                      




e!e ",":0:a:"

I would like to produce clean JSON output for each transaction.
I am pretty new to Logstash. Can anyone please suggest the best approach?

As a first step, I have tried the following config, which uses a regex to strip out the special characters, after which I plan to use the remaining keywords to construct the JSON output. Is this the right approach, or is there a better way to read these encoded TLog files? I appreciate all your help and support in advance.

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    # Match the leading run of non-keyboard characters, then capture the rest
    # of the line. Note that grok's match option takes a hash of
    # field => pattern, and the double quote inside the character class
    # must be escaped.
    match => { "message" => "[^a-z0-9`~!@#$%^&*()\-_=+\[\]{}\\|;:'\"<>,./?\r\n]+%{DATA:message}$" }
    # Without overwrite, grok appends the capture to the existing
    # message field as an array instead of replacing it.
    overwrite => ["message"]
  }
}

output {
  file {
    path => "/tmp/outlog.dat"
    codec => line { format => "%{message}" }
  }
}
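If the goal is simply to delete the control/binary bytes rather than match around them, a mutate filter with gsub may be simpler than grok. This is only a sketch of that idea, not a tested pipeline for the TLog format; it assumes the raw line arrives in the default message field:

```
filter {
  mutate {
    # Delete every byte outside the printable ASCII range (0x20-0x7E),
    # which removes the binary field separators in the TLog sample.
    gsub => [ "message", "[^\x20-\x7e]", "" ]
    # Trim the leading/trailing whitespace left behind.
    strip => [ "message" ]
  }
}
```

Note that this throws away the structural information the binary delimiters carry, so anything that depends on field boundaries (item codes, prices, tender lines) would still need a second parsing step, such as dissect or further grok patterns on the cleaned text.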

Please edit your post, select the TLog text and click on </> in the toolbar so that it is formatted like your configuration.

Are you consuming the whole of that TLog as a single event?

Also, do you happen to know if they are in ACE or SA format?

Also, are you really constrained to using Logstash for this? I would think you would be better off using Apache Daffodil and a little bit of Java code. TLogs are highly structured data, and with Logstash you will end up throwing a lot of that structure away. A basic DFDL schema for the TLog format is on GitHub.

Thanks. I have updated the TLog format above. We use the SA (GSA) format. We are not constrained to Logstash; we are evaluating options for the best approach. Thanks for sharing Apache Daffodil — I will go through the DFDL schema.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.