Problem with Ruby Filter

Hello People,

I would like some help with my problem. Currently I have a PostgreSQL database, and I have some queries in Logstash that extract information and send it to Elasticsearch. One query returns data like the example below:

parser_result: "{"sgis_vulnerability": false, "service": "ssh", "timestamp": "2020-05-05 22:41:12", "target_ips": {"172.0.0.0": {"port": "22"}}, "source_ip": "200.133.1.1", "other_logs": "Inicio Ataque:\"2010-05-05 22:41:12\"\nFinal do Ataque:\"2010-05-05 22:41:12\"\nNome e Vers\u00e3o do Cliente:SSH-2.0-libssh-0.6.3\nUsu\u00e1rio:Tom\nPassword:tom", "source_hostname": "-", "source_port": "49608", "subject": "Tentativas de acesso n\u00e3o autorizadas a sistemas por for\u00e7a bruta"}"

I have a Ruby script that extracts all the information I need, but I don't know how to adapt it to work in a Logstash filter.

My Ruby script is:

require 'json'

json_string = "{\"sgis_vulnerability\": false, \"service\": \"ssh\", \"timestamp\": \"2020-05-05 22:41:12\", \"target_ips\": {\"172.0.0.0\": {\"port\": \"22\"}}, \"source_ip\": \"200.133.1.1\", \"other_logs\": \"Inicio Ataque:\\\"2010-05-05 22:41:12\\\"\\nFinal do Ataque:\\\"2010-05-05 22:41:12\\\"\\nNome e Vers\\u00e3o do Cliente:SSH-2.0-libssh-0.6.3\\nUsu\\u00e1rio:Tom\\nPassword:tom\", \"source_hostname\": \"-\", \"source_port\": \"49608\", \"subject\": \"Tentativas de acesso n\\u00e3o autorizadas a sistemas por for\\u00e7a bruta\"}"

json = JSON.parse(json_string)

puts ""
puts "sgis_vulnerability: " + json["sgis_vulnerability"].to_s
puts "service: " + json["service"].to_s
puts ""
puts "timestamp: " + json["timestamp"].to_s
puts ""
...
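As a side note, the nested target_ips object (whose keys are IP addresses) can be iterated the same way; a minimal sketch using the sample data above:

```ruby
require 'json'

# Parse just the nested target_ips portion of the sample event.
json = JSON.parse('{"target_ips": {"172.0.0.0": {"port": "22"}}}')

# Each key of target_ips is an IP address; its value holds the port.
json["target_ips"].each do |ip, details|
  puts "target_ip: " + ip + " port: " + details["port"]
end
```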

Could someone explain how to take this script and use it in a Logstash filter?

Thank you

any reason why you don't pass the data to the json filter?

if you insist on ruby, then pass the field containing the JSON string to the code portion inside the ruby filter block. something like

json_string = event.get("parser_result")

then you can set the output value of your script to a field with

event.set("field_name", returned_value)

I'd go with the json filter though. it's simpler

Hello ptamba. First of all, thank you for your support.

The reason is that I'm new and didn't know about the json filter. I will look for some information on how to do it with the json filter.

If you have any pointers on how to start with the json filter, I would appreciate it.

Best Regards

using your data example i would start with:

filter {
  json {
    source => "parser_result"
  }
}

with ruby it would be something like

filter {
  ruby {
    code => '
      require "json"
      json_string = event.get("parser_result")
      json = JSON.parse(json_string)
      # (-- rest of the code --)
      # return the result
      event.set("sgis_vulnerability", json["sgis_vulnerability"].to_s)
    '
  }
}

Amazing, ptamba. Thank you!

I used the json filter and it almost works.

The problem is the target_ips field: it is a dictionary keyed by IP address, so when I use the json filter, my Elasticsearch index exceeds the 1000-field limit with fields like:
target_ips.172.0.0.0
target_ips.172.0.0.1
target_ips.172.0.0.2
etc.

I tried to remove it with the mutate filter:

mutate {
  remove_field => [ "[parser_result][target_ips]" ]
}

But it doesn't work because the target_ips field is dynamic. Is it possible to help me?

Thank you again

you could just remove the target_ips field if you don't need it. or pass it to a grok filter if you want to extract information from it.
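if you do need the ports, another option (a sketch, assuming the json filter has already parsed parser_result) is to restructure target_ips inside a ruby filter into an array of objects, so the number of mapped field names stays fixed no matter how many IPs appear:

```ruby
require 'json'

# Dynamic keys create one Elasticsearch field per IP (mapping explosion).
target_ips = JSON.parse('{"172.0.0.0": {"port": "22"}, "172.0.0.1": {"port": "23"}}')

# Fixed keys: turn the hash into an array of {ip, port} objects,
# so only the two field names "ip" and "port" are ever mapped.
as_array = target_ips.map { |ip, details| { "ip" => ip, "port" => details["port"] } }

# Inside a logstash ruby filter this would be:
#   event.set("target_ips", as_array)
puts as_array.inspect
```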

Thank you ptamba, I solved the problem.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.