I have shipped logs to Elasticsearch (ES) using Filebeat and Logstash.
This is my logstash.conf:
# Read input from Filebeat by listening on port 5044, to which Filebeat sends the data
input {
  beats {
    type => "test"
    port => "5044"
  }
}
filter {
  # If Filebeat tagged the event with level "info", parse the log line with grok
  if [fields][level] == "info" {
    grok {
      match => { "message" => ["(?m)%{TIMESTAMP_ISO8601:timestamp} - %{NOTSPACE:orgCode} - %{GREEDYDATA:message}"] }
      #break_on_match => true
      add_tag => ["cms-info"]
      tag_on_failure => ["_grokparsefailure-cms-info"]
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  # Sending properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["https://xxxxxxxx.us-east-1.aws.found.io:443"]
    index => "%{[fields][service-name]}-%{[fields][env]}-%{[fields][application]}-%{[fields][level]}-%{+yyyy.MM.dd}"
    user => "elastic"
    password => "xxxxxx"
  }
}
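For context, the conditional and the index name above rely on custom fields (level, service-name, env, application) that Filebeat attaches to every event. Roughly, my filebeat.yml does this; the path, host, and field values below are placeholders, not the real ones:
filebeat.inputs:
  - type: log
    paths:
      - /path/to/app/*.log        # placeholder path
    fields:
      level: "info"               # placeholder values; the Logstash conditional and index name read these
      service-name: "cms"
      env: "dev"
      application: "myapp"
output.logstash:
  hosts: ["logstash-host:5044"]   # placeholder host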
This is my log message:
"2019-10-29 21:57:16.9884 - yyyyy - Generating [item settings cloudfront item links] for asset : 137909"
But my ES dashboard shows duplicated parts of the message field.
How can I solve this?