Need to Parse a nested JSON message in #Logstash

Hello,

I am trying to send my log files (.txt / JSON files) to Logstash via Filebeat.

My sample log structure is as below:

{"LogDetails":{"transaction-id":"1234","channel-id":"abc","APIName":"testapi","OperationName":"get","Timestamp":"2021-02-22 10:07:42.949352","BackendName":"NA","LogType":"Request","Status":"0","Parameters":{"id":"121"}}}

Flow: Filebeat >> Logstash >> Elasticsearch

In Elasticsearch I need the index to contain the individual fields of my logs; the whole log line should not end up as a single string inside the "message" field in Elasticsearch.

Here is my logstash.conf file:

input {
  beats {
    port => 5044
  }
}
filter {
    grok {
        match => { "message" => "%{GREEDYDATA:LogData}"}
    }
    json {
        source => "LogData"
        target => "LogData"
        skip_on_invalid_json => true
    }
}
output {

    elasticsearch {
        ilm_enabled => true
        index => "gorktest1"
        hosts => [ "http://localhost:9200" ]
    }
    stdout {}
}

In filebeat.yaml:

filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - C:\Users\nm\documents\{folder}\*
  json.keys_under_root: true

output.logstash:
  hosts: ["localhost:5044"]

processors:
  #- add_host_metadata:
     # when.not.contains.tags: forwarded
  #- add_cloud_metadata: ~
  #- add_docker_metadata: ~
  #- add_kubernetes_metadata: ~
  - decode_json_fields:
      fields: ["message"]
      process_array: true
      max_depth: 1
      target: ""
      overwrite_keys: false
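
Side note: `json.keys_under_root` is an option of the older `log` input type; for the `filestream` input used here, JSON decoding is normally configured through the `ndjson` parser instead. A sketch, assuming a recent Filebeat version that supports filestream parsers:

```yaml
filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - C:\Users\nm\documents\{folder}\*
  parsers:
    - ndjson:
        target: ""          # put the decoded keys at the root of the event
        add_error_key: true  # flag events whose JSON fails to decode

output.logstash:
  hosts: ["localhost:5044"]
```

With this in place the `decode_json_fields` processor is not needed for these events.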

I'm not sure where it's going wrong, but with this setup all of my log content ends up under a single field.

Need some help on this!

Hello @pavanKumar2K

Could you try the below filter? This will store the decoded JSON in the "LogData" field.

filter {
  json {
    source => "message"
    target => "LogData"
    skip_on_invalid_json => true
  }
}
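
With this filter, the sample event above should be indexed with the parsed object nested under "LogData", roughly like this (a sketch of the resulting document; Beats metadata fields are omitted):

```json
{
  "message": "{\"LogDetails\":{\"transaction-id\":\"1234\", ... }}",
  "LogData": {
    "LogDetails": {
      "transaction-id": "1234",
      "channel-id": "abc",
      "APIName": "testapi",
      "OperationName": "get",
      "Timestamp": "2021-02-22 10:07:42.949352",
      "BackendName": "NA",
      "LogType": "Request",
      "Status": "0",
      "Parameters": { "id": "121" }
    }
  }
}
```

Each field then becomes queryable individually, e.g. `LogData.LogDetails.APIName`.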

Keep posted! Thanks!

Hi @sudhagar_ramesh, thanks for the reply, it works.

But now I have one more issue: when I process some more complex JSON objects/data, Elasticsearch assigns a few values its default (dynamically detected) datatypes.

Ex - I am passing a date as a string - "date":"12-08-2022" - but it is being considered as a date format and gives me a mapping parser exception.

I am not sure where I need to make changes for this.
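
A mapping parser exception usually means the index has already mapped that field as `date` (either from an earlier document or via dynamic date detection). One possible fix, assuming the "gorktest1" index from the Logstash output above, is to disable dynamic date detection when creating the index - a sketch:

```json
PUT gorktest1
{
  "mappings": {
    "date_detection": false
  }
}
```

Mapping the field explicitly (e.g. `"type": "keyword"` under `properties`) works the same way and is more targeted; either way, the index has to be created (or an index template applied) before the first document arrives, since an existing field mapping cannot be changed in place.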