Hello, I'm trying to read logs from a file and send them to an Elasticsearch instance on another machine, but every time I try, the Logstash service keeps running without ever finishing, and no new index is created on the ES machine.
This is my Logstash configuration:
input {
  file {
    path => ["/etc/elk/logout/test"]
    type => "file"
    start_position => "beginning"
  }
}

filter {
  if [type] == "file" {
    mutate {
      rename => ["@host", "host"]
    }
    dns {
      reverse => ["host"]
      action => "replace"
      nameserver => "IP_DNS_SERVER"
    }
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => [
        "message", "%{MESSAGE_1}",
        "message", "%{MESSAGE_2}",
        "message", "%{MESSAGE_3}",
        "message", "%{MESSAGE_4}",
        "message", "%{MESSAGE_5}"
      ]
    }
    date {
      match => [ "dater", "YYYY/MM/dd HH:mm:ss.SSS" ]
      target => "@timestamp"
    }
  }
}

output {
  elasticsearch {
    protocol => "http"
    cluster => "logstash"
    host => "IP_B"
    index => "logstash-syslog-%{+YYYY.MM.dd}-fromFile"
  }
}
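To check whether events are being read from the file at all, I could temporarily replace the elasticsearch output with a plain stdout output (just a debugging sketch using the standard rubydebug codec, everything else unchanged):

output {
  # temporary debug output: prints every event Logstash emits to the console
  stdout {
    codec => rubydebug
  }
}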
This is the format of the text file content (it was exported from another ES instance):
{"message":"2016/10/05 15:09:30.146 ...","@version":"1","@timestamp":"2016-10-05T13:09:30.205Z","type":"udp"...."}
{"message...}
...
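Since every line of the file is already a complete JSON event, I'm also wondering whether the file input should decode it directly with a json codec instead of grokking the message field. A minimal sketch of that variant, keeping my path and type unchanged:

input {
  file {
    path => ["/etc/elk/logout/test"]
    type => "file"
    start_position => "beginning"
    # each line is a complete JSON document, so decode it directly
    codec => "json"
  }
}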