Hello everybody,
I have successfully configured ELK on my local machine. When I run the following command, it starts sending data to Elasticsearch for indexing:
logstash -f ..\conf\Onsurance.conf --path.settings=file://C:/ELK/logstash/config
I want it to check every 30 seconds whether the logfiles have new data and, if so, send it to ES for indexing.
How do I do that? According to the documentation it checks every 15 s, but it seems like I have a config problem.
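If I read the file input plugin docs correctly, polling is controlled by `stat_interval` (how often known files are checked for new data) and `discover_interval` (how often the path glob is rescanned for new files, as a multiple of `stat_interval`). This is a sketch of what I understood, with the option values being my own guesses:

```
file {
  path => "D:/logs/main/Api.Ohcp/Onsurance/*ApplicationEntLib*.log"
  stat_interval => 30      # check existing files for new data every 30 s (my assumption)
  discover_interval => 1   # rescan the glob every 1 * stat_interval (my assumption)
}
```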
Today I copied last week's logfiles from the server. I expected Logstash to automatically recognize that it has new logfiles to parse, but it did nothing.
Here is my config:
input {
  stdin { }
  file {
    type => "OnsuranceAppLog"
    path => "D:/logs/main/Api.Ohcp/Onsurance/*ApplicationEntLib*.log"
    start_position => "beginning"
    sincedb_path => "NUL"
    ignore_older => 0
    codec => multiline {
      pattern => "^%{WORD};"
      negate => true
      what => "previous"
    }
  }
}
filter {
  mutate {
    # one gsub array with both substitutions; two separate gsub settings
    # in the same mutate would not both be applied
    gsub => [
      "message", "\n", " ",
      "message", "\t", " "
    ]
  }
  grok {
    match => ["message", "(?m)%{WORD:LOGLEVEL}\;%{WORD:Machine}\;%{GREEDYDATA:Logtimestamp}\;%{WORD:}\=;%{WORD:}\=;%{WORD:}\=;%{WORD:}\;%{WORD:}\=;%{GREEDYDATA:message}"]
    overwrite => [ "message" ]
    add_field => { "ApplicationName" => "Onsurance" }
    remove_field => ["WORD"]
  }
  # Set the event timestamp from the log
  date {
    match => ["Logtimestamp", "dd.MM.yyyy HH:mm:ss,SSS"]
    remove_field => ["Logtimestamp"]
  }
}
# See documentation for different protocols:
output {
  if [type] == "OnsuranceAppLog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "onsurance-%{+YYYY.MM.dd}"
    }
  }
  stdout { codec => rubydebug }
}
I am using sincedb_path => "NUL" because I am currently reindexing all logs from the command line.
What is wrong with my config? Any ideas or solutions?
Thanks and regards