Create a JSON object in Logstash and pass it to Elasticsearch

My log format is:
02.11.2017,11:33:13,DDIC,6,192.168.2.110,PFCG,SAPMSYST SAPMSYST1,Logon Successful (Type=A)

My filter is:

filter {
  grok {
    match => {
      "message" => "%{NOTSPACE:date},%{NOTSPACE:time},%{WORD:user},%{NUMBER:riskindex},%{IP:terminal},%{NOTSPACE:tcode},%{GREEDYDATA:program},%{GREEDYDATA:messagetext}"
    }
    add_field => {
      "eventName" => "grok"
    }
  }
}

and the output I am getting in Elasticsearch is as below:

{
"date" => "02.11.2017",
"terminal" => "192.168.2.110",
"program" => "SAPMSYST SAPMSYST1",
"message" => "02.11.2017,11:33:13,DDIC,6,192.168.2.110,PFCG,SAPMSYST SAPMSYST1,Logon Successful (Type=A)\r",
"type" => "logs",
"riskindex" => "6",
"tcode" => "PFCG",
"path" => "D:\logfile\test.log",
"@timestamp" => 2017-11-07T09:48:41.835Z,
"messagetext" => "Logon Successful (Type=A)\r",
"@version" => "1",
"host" => "GLT-D103",
"eventName" => "grok",
"time" => "11:33:13",
"user" => "DDIC"
}

My expected output is:
{
"date" => "02.11.2017",
"terminal" => "192.168.2.110",
"program" => {
    "grp1" => "SAPMSYST",
    "grp2" => "SAPMSYST1"
},
"message" => "02.11.2017,11:33:13,DDIC,6,192.168.2.110,PFCG,SAPMSYST SAPMSYST1,Logon Successful (Type=A)\r",
"type" => "logs",
"riskindex" => "6",
"tcode" => "PFCG",
"path" => "D:\logfile\test.log",
"@timestamp" => 2017-11-07T09:48:41.835Z,
"messagetext" => "Logon Successful (Type=A)\r",
"@version" => "1",
"host" => "GLT-D103",
"eventName" => "grok",
"time" => "11:33:13",
"user" => "DDIC"
}

If anyone could point out my oversight or redirect my efforts, I would greatly appreciate it.

Thanks

Replace

%{GREEDYDATA:program}

with

%{WORD:[program][grp1]} %{WORD:[program][grp2]}
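In context, the whole filter would then look something like this (a sketch based on your original configuration, just with the `program` capture replaced):

```
filter {
  grok {
    match => {
      "message" => "%{NOTSPACE:date},%{NOTSPACE:time},%{WORD:user},%{NUMBER:riskindex},%{IP:terminal},%{NOTSPACE:tcode},%{WORD:[program][grp1]} %{WORD:[program][grp2]},%{GREEDYDATA:messagetext}"
    }
    add_field => {
      "eventName" => "grok"
    }
  }
}
```

The `[program][grp1]` field-reference syntax tells grok to store the captures as sub-fields of a `program` object, which Elasticsearch will index as a nested JSON object.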

It works, thanks! :slight_smile:

Just one more question here (on the previous content):
If my log contains n fields that are generated automatically (for example, 02.11.2017,11:33:13,DDIC,6,192.168.2.110,PFCG,SAPMSYST SAPMSYST1 SAPMSYST2 SAPMSYST3 .... SAPMSYSTn,Logon Successful (Type=A)), how do I deal with this kind of situation?

I hope you understand my question.

No, I don't quite get it. Is it the "SAPMSYST ..." column that's dynamic?

Yes ...!!!

You'll have to write a piece of Ruby code in a ruby filter. Capture the whole column into a field, split the field, and add/update fields for each item in the list.

Can you please share a code snippet for this (if you have one)? I am very new to Logstash and Ruby, so your help would make it easier. Meanwhile, I'll also try it at my end. Thanks :slight_smile:

Assuming you extract the "SAPMSYST SAPMSYST1 SAPMSYST2 SAPMSYST3" string into a field named programs, something like

# Split the whitespace-separated column into an array
prog_list = event.get("programs").split
# Create one numbered sub-field per program
prog_list.each_index { |i|
  event.set("[program][prg#{i}]", prog_list[i])
}

might work. But why not just let the program field be an array? Why do you need separate fields?
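To see what that ruby filter logic does, here is the same code runnable in plain Ruby, using a minimal stand-in for the Logstash event (the `Event` class below is an assumption for illustration only, not the Logstash API; in particular it treats `"[program][prg0]"` as a flat key rather than a real field-reference path):

```ruby
# Minimal stand-in for the Logstash event API, for illustration only.
class Event
  def initialize(fields)
    @fields = fields
  end

  def get(key)
    @fields[key]
  end

  def set(key, value)
    @fields[key] = value
  end
end

event = Event.new("programs" => "SAPMSYST SAPMSYST1 SAPMSYST2")

# Same logic as in the ruby filter: split the dynamic column
# and create one numbered field per item.
prog_list = event.get("programs").split
prog_list.each_index { |i|
  event.set("[program][prg#{i}]", prog_list[i])
}

puts event.get("[program][prg0]")  # SAPMSYST
```

This handles any number of programs in the column, since the loop runs once per whitespace-separated token.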


OK, I will make the program field an array and try it. Thanks for your reply :slight_smile:

One more question: for hashing/encryption I have used the 'fingerprint' plugin, but when I tried to find a plugin for decryption there was none in Logstash (the cipher plugin has a decrypt function, but it is not available for the latest Logstash version, 5.6.3). Can you please suggest a plugin to decrypt my encrypted field?

I don't think there is a decryption plugin. What problem are you trying to solve?
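For context: the fingerprint filter computes a one-way hash (an HMAC when a key is configured), not an encryption, so there is by design nothing to decrypt. A plain-Ruby sketch of what a SHA-256 fingerprint amounts to (the key and field value below are made-up examples):

```ruby
require 'openssl'

key   = "my_secret_key"  # made-up example key
value = "DDIC"           # made-up example field value

# An HMAC is a one-way function: the original value
# cannot be recovered from the digest.
digest = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new("SHA256"), key, value)
puts digest  # prints a 64-character hex string
```

If you need to get the original value back later, hashing is the wrong tool; you would need reversible encryption (or simply keep the original field alongside the fingerprint).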
