Hello,

I am trying to send my log files (.txt / .json) to Logstash via Filebeat. A sample log line looks like this:
{"LogDetails":{"transaction-id":"1234","channel-id":"abc","APIName":"testapi","OperationName":"get","Timestamp":"2021-02-22 10:07:42.949352","BackendName":"NA","LogType":"Request","Status":"0","Parameters":{"id":"121"}}}
The flow is: Filebeat >> Logstash >> Elasticsearch.

In Elasticsearch, I need the index to contain the individual fields from my logs; the whole log line should not end up as a single string inside the message field.
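For the sample line above, what I am hoping to see indexed is something like this, with each field individually searchable (e.g. LogDetails.APIName or LogDetails.Status):

{
  "LogDetails": {
    "transaction-id": "1234",
    "channel-id": "abc",
    "APIName": "testapi",
    "OperationName": "get",
    "Timestamp": "2021-02-22 10:07:42.949352",
    "BackendName": "NA",
    "LogType": "Request",
    "Status": "0",
    "Parameters": { "id": "121" }
  }
}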
Here is my logstash.conf:
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{GREEDYDATA:LogData}" }
  }
  json {
    source => "LogData"
    target => "LogData"
    skip_on_invalid_json => true
  }
}

output {
  elasticsearch {
    ilm_enabled => true
    index => "gorktest1"
    hosts => [ "http://localhost:9200" ]
  }
  stdout {}
}
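As far as I understand the grok docs, %{GREEDYDATA:LogData} just copies the whole message into LogData without splitting anything, so I suspect the grok step is redundant and the json filter could read message directly. This is the simpler filter I was thinking of trying; dropping target should put the parsed fields at the root of the event instead of nesting them under LogData (just a sketch, I have not tested it yet):

filter {
  json {
    # parse the raw JSON line shipped by Filebeat;
    # with no "target", the decoded fields land at the event root
    source => "message"
    skip_on_invalid_json => true
  }
}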
And here is my filebeat.yaml:
filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - C:\Users\nm\documents\{folder}\*
  json.keys_under_root: true

output.logstash:
  hosts: ["localhost:5044"]

processors:
  #- add_host_metadata:
  #    when.not.contains.tags: forwarded
  #- add_cloud_metadata: ~
  #- add_docker_metadata: ~
  #- add_kubernetes_metadata: ~
  - decode_json_fields:
      fields: ["message"]
      process_array: true
      max_depth: 1
      target: ""
      overwrite_keys: false
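One thing I noticed while reading the Filebeat docs: the json.* options (like json.keys_under_root) seem to belong to the log input type, while the filestream input I am using is supposed to decode JSON through a parsers section instead, if the Filebeat version supports it. If that is right, I assume the input would need to look more like this; again just a sketch based on my reading of the docs, not tested:

filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - C:\Users\nm\documents\{folder}\*
  parsers:
    - ndjson:
        # an empty target should merge the decoded keys into the event root
        target: ""
        add_error_key: true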
Apart from that, I am not sure where it is going wrong: with the setup above, all of my log content still ends up inside a single message field in Elasticsearch.

Any help would be appreciated!