Logstash: parsing fields from json in array of another json

Hi, I would like to parse some fields from the JSON logs produced by AWS WAF. On input, Logstash uses:
s3 {
  bucket => "XXXX"
  access_key_id => "XXXX"
  secret_access_key => "XXXX"
  prefix => "waf/"
  region => "XXX"
  sincedb_path => "/tmp/s3.sincedb"
  add_field => [ "lso_name", "NULL", "lsi_type", "s3", "lsi_name", "waf" ]
  codec => "json"
}

I get messages like:
"httpRequest" => {
  "headers" => [
    {"name"=>"Host", "value"=>"test.example.com"},
    {"name"=>"user-agent", "value"=>"Mozilla/5.0"},
    {"name"=>"accept", "value"=>"/"}
  ]
}

I am trying to parse those headers in a Logstash filter, but I can't get it to work.

The output in Kibana should look like:

"httpRequest.headers.Host" => "test.example.com"
"httpRequest.headers.user-agent" => "Mozilla/5.0"

I tried the json filter and even kv, but with no success.

Thanks for any reply.

As it currently stands, your Elasticsearch index is getting a field httpRequest.headers.name and another called httpRequest.headers.value, which doesn't allow you to map names to values.

The array-of-objects will need to be transposed into a single object with named values.
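To illustrate the idea, that transposition is just a fold over the array into a hash keyed by header name. A minimal sketch in plain Ruby, using the values from the event above:

```ruby
# Array of {"name" => ..., "value" => ...} objects, as AWS WAF logs them.
headers = [
  {"name" => "Host",       "value" => "test.example.com"},
  {"name" => "user-agent", "value" => "Mozilla/5.0"},
  {"name" => "accept",     "value" => "/"}
]

# Transpose into a single object: header name becomes the key.
transposed = headers.each_with_object({}) do |h, acc|
  acc[h["name"]] = h["value"]
end

p transposed
```

After this, `transposed["Host"]` yields `"test.example.com"`, which is the shape Elasticsearch needs to map names to values.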

I made something for you :slight_smile:

With the above-linked transpose.logstash-filter-ruby.rb, you could do the following:

filter {
  ruby {
    path => "/path/to/transpose.logstash-filter-ruby.rb"
    script_params => {
      "source" => "[httpRequest][headers]"
    }
  }
}
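For reference, a script of that shape could be as simple as the sketch below. This is my own minimal version, not necessarily the linked `transpose.logstash-filter-ruby.rb`; `register` and `filter` are the hooks the Logstash ruby filter calls when given a `path`:

```ruby
# Sketch of a transpose script for the Logstash ruby filter.
# register receives script_params from the pipeline config.
def register(params)
  @source = params["source"]   # e.g. "[httpRequest][headers]"
end

# filter is called once per event and must return an array of events.
def filter(event)
  headers = event.get(@source)
  if headers.is_a?(Array)
    transposed = {}
    headers.each { |h| transposed[h["name"]] = h["value"] }
    # Replace the array-of-objects with the single keyed object.
    event.set(@source, transposed)
  end
  [event]
end
```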

Thanks, that's awesome! It works like a charm.

