I would like some help with my problem. I currently have a PostgreSQL database, and I have some queries in Logstash that extract information and send it to Elasticsearch. One query returns data like the example below:
parser_result: "{"sgis_vulnerability": false, "service": "ssh", "timestamp": "2020-05-05 22:41:12", "target_ips": {"172.0.0.0": {"port": "22"}}, "source_ip": "200.133.1.1", "other_logs": "Inicio Ataque:\"2010-05-05 22:41:12\"\nFinal do Ataque:\"2010-05-05 22:41:12\"\nNome e Vers\u00e3o do Cliente:SSH-2.0-libssh-0.6.3\nUsu\u00e1rio:Tom\nPassword:tom", "source_hostname": "-", "source_port": "49608", "subject": "Tentativas de acesso n\u00e3o autorizadas a sistemas por for\u00e7a bruta"}"
I have a Ruby script that extracts all the information I need, but I don't know how to adapt it to work as a Logstash filter.
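As background, a standalone Ruby script generally maps onto Logstash's ruby filter by moving its logic into the code option and using event.get / event.set to read and write event fields. A minimal sketch (the field names come from the sample above; the transformation itself is only illustrative):

filter {
  ruby {
    code => '
      # Read a field from the event using Logstash field-reference syntax
      service = event.get("[parser_result][service]")
      # Write a derived field back onto the event
      event.set("attack_service", service) unless service.nil?
    '
  }
}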
The problem is related to the field target_ips: because it is a dictionary keyed by IP address, when I use the json filter each IP becomes its own field, and my Elasticsearch index exceeds the 1000-field mapping limit (index.mapping.total_fields.limit), like this:
target_ips.172.0.0.0
target_ips.172.0.0.1
target_ips.172.0.0.2
etc
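The parse step presumably looks something like the json filter below (a sketch, assuming parser_result is the source field). With a dictionary like target_ips, every distinct IP adds a new key under target_ips, and each new key becomes a new field in the index mapping:

filter {
  json {
    # Parse the JSON string in parser_result and keep the result under the same field
    source => "parser_result"
    target => "parser_result"
  }
}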
I tried to remove it with the mutate filter:

mutate {
  remove_field => [ "[parser_result][target_ips]" ]
}

But it doesn't work, because the target_ips field is dynamic. Is it possible to help me?
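One approach that avoids both the removal problem and the mapping explosion is to restructure target_ips before it reaches Elasticsearch: a ruby filter can turn the IP-keyed dictionary into an array of objects with fixed keys, so the mapping only ever contains two field names. A minimal sketch, assuming the json filter has already parsed parser_result as above (the key names ip and port are my choice):

filter {
  ruby {
    code => '
      ips = event.get("[parser_result][target_ips]")
      if ips.is_a?(Hash)
        # Turn {"172.0.0.0" => {"port" => "22"}, ...} into
        # [{"ip" => "172.0.0.0", "port" => "22"}, ...] so field names stay fixed
        flat = ips.map { |ip, attrs| { "ip" => ip, "port" => attrs["port"] } }
        event.set("[parser_result][target_ips]", flat)
      end
    '
  }
}

If the goal is simply to drop the field, the same event.set call can be replaced with event.remove("[parser_result][target_ips]"), which works no matter which IP keys the dictionary contains.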