Hi All,
I created a configuration in Logstash and was able to see the logs in Kibana. The log file was huge, around 7 lakh (700,000) lines, so I split the same data into parts of 1 lakh (100,000) lines each, giving 7 log files. Now I am still seeing the old data with the old log path, even though that log file is no longer available.
Nothing changed apart from splitting the huge log file into parts and changing the log path from /etc/logstash/audit.log to /etc/logstash/*.log.
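For context, this is roughly how I split the file (a sketch using GNU coreutils `split`; the `seq` line just generates a stand-in for the real /etc/logstash/audit.log, and the part-file names are illustrative):

```shell
# Stand-in for the original 700,000-line audit log:
seq 700000 > audit.log
# Split into 7 files of 100,000 lines each,
# named audit-part-aa.log, audit-part-ab.log, ...
split -l 100000 --additional-suffix=.log audit.log audit-part-
ls audit-part-*.log
```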
My configuration looks like this:
input {
  file {
    path => "/etc/logstash/*.log"
    codec => multiline {
      pattern => "(^%{TIMESTAMP_ISO8601} )"
      negate => "true"
      what => "previous"
    }
    codec => "json"
    type => "%{type}"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "%{SPACE}%{NOTSPACE}%{SPACE}%{GREEDYDATA}" }
  }
}
output {
  elasticsearch {
    hosts => ["159.122.121.231:9200"]
    index => "%{index}-%{+YYYY.MM.dd}"
    document_id => "%{_id}"
  }
  stdout { codec => rubydebug }
}
Please help me resolve this issue.
Regards
Raja