How to filter a JSON log file in Logstash and send it to Elasticsearch

This is my JSON log file. I'm trying to store the file in Elasticsearch through Logstash.

    { "id": "135569", "title" : "Star Trek Beyond", "year":2016 , "genre": 
    ["Action", "Adventure", "Sci-Fi"] }

After storing the data in Elasticsearch, my result is as follows:

    {
      "_index": "filebeat-6.2.4-2018.11.09",
      "_type": "doc",
      "_id": "n-J39mYB6zb53NvEugMO",
      "_score": 1,
      "_source": {
        "@timestamp": "2018-11-09T03:15:32.262Z",
        "source": "/Users/jinwoopark/Jin/json_files/testJson.log",
        "offset": 106,
        "message": """{ "id": "135569", "title" : "Star Trek Beyond", "year":2016 , "genre":["Action", "Adventure", "Sci-Fi"] }""",
        "id": "%{id}",
        "@version": "1",
        "host": "Jinui-MacBook-Pro.local",
        "tags": [
          "beats_input_codec_plain_applied"
        ],
        "prospector": {
          "type": "log"
        },
        "title": "%{title}",
        "beat": {
          "name": "Jinui-MacBook-Pro.local",
          "hostname": "Jinui-MacBook-Pro.local",
          "version": "6.2.4"
        }
      }
    }

What I'm trying to do is this:

I want to store only the "genre" value in the message field, and store the other values (e.g. id, title) in extra fields (the newly created id and title fields). But the extra fields were stored with unresolved placeholders (%{id}, %{title}) instead of the actual values. It seems I need to modify my Logstash json filter, but this is where I need your help.

My current Logstash configuration is as follows:

    input {
        beats {
            port => 5044
        }
    }

    filter {
        json {
            source => "genre" # want to store only genre (from the JSON log) in the message field
        }
        mutate {
            add_field => {
                "id" => "%{[_source][message][id]}"       # want to create an extra field for the id value from the log file
                "title" => "%{[_source][message][title]}" # want to create an extra field for the title value from the log file
            }
        }
        date {
            match => [ "timestamp", "dd/MM/yyyy:HH:mm:ss Z" ]
        }
    }

    output {
        elasticsearch {
            hosts => ["http://localhost:9200"]
            index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
        }
        stdout {
            codec => rubydebug
        }
    }

You can let Filebeat parse the JSON in the message field for you.

If you instead want to do it in Logstash, you need to change your json filter to work off the message field (which contains the JSON data), not the genre field, which does not exist at that point in the pipeline.

You can then also remove the mutate filter, and possibly the date filter as well, since I do not see any timestamp field in your data.
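A minimal sketch of that filter block (not tested against your exact events; the mutate step is only needed if you really want the message field to end up holding just the genre value, which will be flattened to a string since genre is an array):

    filter {
        json {
            source => "message"   # parse the JSON string that Filebeat shipped in message
        }
        # After this, id, title, year, and genre exist as top-level fields,
        # so the mutate/add_field block from the original config is unnecessary.
        mutate {
            replace => { "message" => "%{genre}" }   # optional: keep only the genre value in message
        }
    }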

Thank you for your kind reply.

As you suggested, I added a decode_json_fields processor to filebeat.yml as follows:

    processors:
      - decode_json_fields:
          fields: ["id", "title", "genre"]
          process_array: false
          max_depth: 3
          target: ""
          overwrite_keys: true

I can't see any difference in the output. I need some extra help.

The field you need to decode is the message field. This is the one containing the JSON data.
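In other words, the processor should point at message rather than the individual JSON keys (a sketch, with the other settings assumed unchanged from your filebeat.yml):

    processors:
      - decode_json_fields:
          fields: ["message"]   # message holds the raw JSON string to decode
          process_array: false
          max_depth: 3
          target: ""            # decode the parsed keys into the root of the event
          overwrite_keys: true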

Thank you, it's solved!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.