Several questions about logstash and filebeat


(ja) #1

1. I am dealing with a database export file like this:
Nokia^A20180101^A1999.99 ^A RM-356 ^A1 ^A/UCWEB8.9.0.253/50/999^A20171209

where ^A is the delimiter. I checked https://github.com/kkos/oniguruma/blob/master/doc/RE#L400 to find how to match ^A, without result.
Would you mind giving a hint on how to write the grok pattern? I tried \u0001 but it failed:

grok {
     match => { "message" => "%{DATA:product}\u0001%{DATA:startdate}\u0001%{DATA:amount}\u0001%{DATA:style}\u0001%{DATA:order}\u0001%{DATA:browers}\u0001%{DATA:enddate}" }
 }
2. Under Filebeat's directory, I have different .txt files to harvest; for example, the file named JGXX.txt matches format 1, YGXX.txt matches format 2, and so on. Would you mind telling me how to deal with this?

3. I have the CREATE TABLE SQL file, and I'd like to know whether there is a tool that can convert SQL to a mapping or template, so I can load it into Elasticsearch.

Thank you.


(Magnus Bäck) #2

Under Filebeat's directory, I have different .txt files to harvest; for example, the file named JGXX.txt matches format 1, YGXX.txt matches format 2, and so on. Would you mind telling me how to deal with this?

I can think of a couple of options.

  • On the Filebeat side, use different prospectors and set a field or a tag to indicate what kind of file it is, then use Logstash conditionals to process them differently.
  • If the filename itself indicates the file type, you can use Logstash conditionals that inspect the field containing the path of the source file.
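
Both options could look roughly like this; note that the field name `log_format` and the path names are placeholders, and the exact field holding the source path (`source` in older Filebeat/Logstash versions) depends on your versions. On the Filebeat side, tag each prospector:

filebeat.prospectors:
  - input_type: log
    paths:
      - /data/export/JGXX.txt
    fields:
      log_format: format1
  - input_type: log
    paths:
      - /data/export/YGXX.txt
    fields:
      log_format: format2

Then branch on that field (or on the filename) in Logstash:

filter {
  if [fields][log_format] == "format1" {
    # filters for format 1 here
  } else if [fields][log_format] == "format2" {
    # filters for format 2 here
  }
  # Alternatively, inspect the source path directly:
  # if [source] =~ /JGXX\.txt$/ { ... }
}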

(ja) #3

Thank you for your reply. I will try both methods.
I am building a system to analyze transaction data (database export files); is there any example I can follow? I am brand new to this architecture (Filebeat, Logstash, Elasticsearch, Kibana).
Thanks again for your kind reply.


(ja) #4

For the first question, I handled it with the mutate filter.

^A is the delimiter used by Hive, and it can be typed in vim with Ctrl+V, Ctrl+A, so:

split => ["message", "^A"]
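
A fuller version of that filter might look like the sketch below. The field names are taken from the grok attempt in #1, and the delimiter between the quotes in `split` must be the raw Ctrl-A character (entered in vim with Ctrl+V, Ctrl+A), not a caret followed by the letter A. Within a single mutate filter, `add_field` is applied after `split` succeeds, so the array elements are available:

filter {
  mutate {
    # Split the ^A-delimited record into an array (the "^A" below is the literal \u0001 byte).
    split => ["message", "^A"]
    add_field => {
      "product"   => "%{[message][0]}"
      "startdate" => "%{[message][1]}"
      "amount"    => "%{[message][2]}"
      "style"     => "%{[message][3]}"
      "order"     => "%{[message][4]}"
      "browers"   => "%{[message][5]}"
      "enddate"   => "%{[message][6]}"
    }
  }
  mutate {
    # The sample record has stray spaces around some fields; strip them.
    strip => ["amount", "style", "order"]
  }
}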


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.