Extract a JSON object from a log entry

Hi!! I need to extract a JSON object from a log entry and convert that JSON's fields into queryable fields in Kibana through Elasticsearch (note that those fields may arrive in any order).

This is my current logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => {
      "message" => "{contrato:%{INT:contrato:int}}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash"
  }
  stdout {
    codec => json
  }
}

My log entry looks like this:

DEBUG [2019-01-25 18:18:35,973] db679d09-66d1-4556-90d0-6174dc4b90ce a.c.e.s.j.c.JudicialidadService [dw-21 - GET /siniestros?expedientes=254472] {contrato:12345,tipo_documento:DNI,numero_documento:99999999}

but it can also arrive like this:

DEBUG [2019-01-25 18:18:35,973] db679d09-66d1-4556-90d0-6174dc4b90ce a.c.e.s.j.c.JudicialidadService [dw-21 - GET /siniestros?expedientes=254472] {contrato:12345,numero_documento:99999999,tipo_documento:DNI}

In both cases the JSON fields need to become fields of the document that gets stored in Elasticsearch.

Thanks!!

You might want to move this to Elastic en espanol.

Or not. You can chop up those logs using dissect. Something like

dissect { mapping => { "message" => "%{level} [%{ts} %{+ts}] %{guid} %{class} [%{something}] %{json}" } }
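As a rough illustration of what that dissect mapping does, here is a plain-Ruby sketch that splits one of your sample lines with a regex. This is only an approximation of dissect's delimiter-based parsing (dissect does not use regular expressions), but the field names match the mapping above:

```ruby
# One of the sample log lines from the question
line = 'DEBUG [2019-01-25 18:18:35,973] db679d09-66d1-4556-90d0-6174dc4b90ce ' \
       'a.c.e.s.j.c.JudicialidadService [dw-21 - GET /siniestros?expedientes=254472] ' \
       '{contrato:12345,tipo_documento:DNI,numero_documento:99999999}'

# Regex approximation of the dissect pattern
# %{level} [%{ts} %{+ts}] %{guid} %{class} [%{something}] %{json}
m = line.match(/^(?<level>\S+) \[(?<ts>[^\]]+)\] (?<guid>\S+) (?<class>\S+) \[(?<something>[^\]]+)\] (?<json>.*)$/)

puts m[:level]   # DEBUG
puts m[:ts]      # 2019-01-25 18:18:35,973
puts m[:json]    # {contrato:12345,tipo_documento:DNI,numero_documento:99999999}
```

Note that `%{ts} %{+ts}` in dissect appends the two tokens into a single `ts` field, which the single `[^\]]+` capture mimics here.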

If your JSON is valid (it does not appear that it is) then you can parse it using

filter { json { source => "json" } }

You may be able to fix your JSON using

mutate { gsub => [ "json", "^{", '{"', "json", "}$", '"}', "json", ":", '":"', "json", ",", '","' ] }
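As a sanity check outside Logstash, here is that chain of substitutions in plain Ruby (mutate's gsub option applies regex replacements much like Ruby's `String#gsub` does here), run against the `json` field from your log lines:

```ruby
require 'json'

# The "json" field as dissect would produce it
raw = '{contrato:12345,tipo_documento:DNI,numero_documento:99999999}'

fixed = raw
  .gsub(/^\{/, '{"')  # quote after the opening brace
  .gsub(/\}$/, '"}')  # quote before the closing brace
  .gsub(':', '":"')   # wrap every colon in quotes
  .gsub(',', '","')   # wrap every comma in quotes

puts fixed
# => {"contrato":"12345","tipo_documento":"DNI","numero_documento":"99999999"}

doc = JSON.parse(fixed)
puts doc['tipo_documento']
# => DNI
```

One caveat: every value ends up as a string, so a numeric field like `contrato` would need a further `mutate { convert => ... }` if you want it indexed as an integer.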

Does this work with Logstash 6.5.4?

Yes. Can you show us what one of those events looks like when you add

 output { stdout { codec => rubydebug } }

to the end of your configuration? I want to see if the JSON fields are already quoted.

This is the logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  dissect {
    mapping => {
      "message" => "%{level} [%{ts} %{+ts}] %{guid} %{class} [%{something}] %{json}"
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash"
  }
}

and this is the console output:

{
          "host" => {
        "name" => "4514e89c0134"
    },
         "class" => "a.c.e.s.j.c.JudicialidadService",
    "@timestamp" => 2019-01-29T18:18:01.522Z,
     "something" => "dw-21 - GET /siniestros?expedientes=254472",
        "source" => "/webapp/filebeat/webapp/LOGGER.log",
         "input" => {
        "type" => "log"
    },
         "level" => "DEBUG",
    "prospector" => {
        "type" => "log"
    },
            "ts" => "2019-01-25 18:18:35,973",
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
       "message" => "DEBUG [2019-01-25 18:18:35,973] db679d09-66d1-4556-90d0-6174dc4b90ce a.c.e.s.j.c.JudicialidadService [dw-21 - GET /siniestros?expedientes=254472] {contrato:12345}",
      "@version" => "1",
        "offset" => 1956,
          "json" => "{contrato:12345}",
          "beat" => {
         "version" => "6.5.4",
        "hostname" => "4514e89c0134",
            "name" => "4514e89c0134"
    },
          "guid" => "db679d09-66d1-4556-90d0-6174dc4b90ce"
}

Now I have to convert the fields of that JSON into queryable fields in Kibana through Elasticsearch (these fields can arrive in any order).

Thank you!!

If you change that to

filter {
    dissect { mapping => { "message" => "%{level} [%{ts} %{+ts}] %{guid} %{class} [%{something}] %{json}" } }
    mutate { gsub => [ "json", "^{", '{"', "json", "}$", '"}', "json", ":", '":"', "json", ",", '","' ] }
    json { source => "json" }
}

then I would expect that to work. The mutate will convert {contrato:12345} into {"contrato":"12345"} and the json filter should work just fine on that.


Sorry, how does gsub work? That's magic to me, haha.

There are four gsubs there. The first replaces a { at the start of the value with {". The second replaces a } at the end with "}. The third changes every : to ":". The fourth changes every , to ",".
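Traced in plain Ruby on the single-field value from your console output (mutate's gsub patterns are regular expressions, so ^{ and }$ anchor to the start and end of the field):

```ruby
s = '{contrato:12345}'      # the "json" field dissect produced
s = s.gsub(/^\{/, '{"')     # 1st gsub: {"contrato:12345}
s = s.gsub(/\}$/, '"}')     # 2nd gsub: {"contrato:12345"}
s = s.gsub(':', '":"')      # 3rd gsub: {"contrato":"12345"}
s = s.gsub(',', '","')      # 4th gsub: no comma here, so no change
puts s
# => {"contrato":"12345"}
```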

Thank you very much bro!
