Strange observation in ELK

Hi All,

I created a configuration in Logstash and was able to see the logs in Kibana; the file was huge, about 7 lakh (700,000) lines. I then split the same data into seven log files of about 1 lakh (100,000) lines each. But I am still seeing the old data, with the old log path, even though that log file no longer exists.
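For reference, splitting a large log file into fixed-size parts can be done with coreutils `split`. This is just a sketch with small sample data; the file name `audit.log`, the `audit_part_` prefix, and the chunk size are illustrative, not the actual 1-lakh split from the post:

```shell
# Create a small sample "audit.log" (stand-in for the real file):
seq 1 10 > audit.log

# Split into chunks of 4 lines each, with numeric suffixes
# (audit_part_00, audit_part_01, ...):
split -l 4 -d audit.log audit_part_

# Give the pieces a .log extension so a *.log glob matches them:
for f in audit_part_??; do mv "$f" "$f.log"; done

ls audit_part_*.log
```

With a real file you would use `-l 100000` to get 1-lakh-line chunks.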

Nothing changed apart from splitting the huge log file into parts and changing the log path from /etc/logstash/audit.log to /etc/logstash/*.log.

My configuration looks like this:

input {
  file {
    path => "/etc/logstash/*.log"
    codec => multiline {
      pattern => "(^%{TIMESTAMP_ISO8601} )"
      negate => "true"
      what => "previous"
    }
    codec => "json"
    type => "%{type}"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "%{SPACE}%{NOTSPACE}%{SPACE}%{GREEDYDATA}" }
  }
}
output {
  elasticsearch {
    hosts => ["159.122.121.231:9200"]
    index => "%{index}-%{+YYYY.MM.dd}"
    document_id => "%{_id}"
  }
  stdout { codec => rubydebug }
}
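A few things in this config look suspect and may explain the stale data. The `file` input declares `codec` twice, but an input takes only one codec. `type => "%{type}"`, `index => "%{index}"`, and `document_id => "%{_id}"` are sprintf references to fields that a fresh file event does not have, so they come through as the literal strings `%{type}`, `%{index}`, and `%{_id}`; in particular, a literal `%{_id}` document ID makes every event overwrite the same single document. And `sincedb_path => "/dev/null"` disables read-position tracking, so files are re-read from the beginning on every restart. A minimal corrected sketch, assuming the multiline codec is the one you want and using "audit" as a placeholder type and index name (not from the original post):

```conf
input {
  file {
    path => "/etc/logstash/*.log"
    # Only one codec per input; keep multiline if events span lines.
    codec => multiline {
      pattern => "(^%{TIMESTAMP_ISO8601} )"
      negate => true
      what => "previous"
    }
    # "audit" is a placeholder; use a literal type, not "%{type}".
    type => "audit"
    # /dev/null disables sincedb, so files are re-read from the start
    # on every restart -- old events can reappear this way.
    sincedb_path => "/dev/null"
  }
}
output {
  elasticsearch {
    hosts => ["159.122.121.231:9200"]
    # A literal index name avoids "%{index}-..." indices when the
    # event has no "index" field. document_id is omitted so
    # Elasticsearch assigns unique IDs instead of overwriting one doc.
    index => "audit-%{+YYYY.MM.dd}"
  }
}
```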

Please help me to resolve this issue.

Regards
Raja

I resolved this issue by removing the old index and creating a new one with the same name.
However, I am still unable to create a new index and assign the new data to it. I have gone through the documentation, but I could not get that far.
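For anyone hitting the same problem, deleting and recreating an index can be done with the Elasticsearch REST API. These commands run against a live cluster, so treat them as a sketch; `audit-2017.01.01` is a placeholder index name, not one from the original post:

```conf
# Delete the old index (this permanently removes its documents):
curl -XDELETE 'http://159.122.121.231:9200/audit-2017.01.01'

# Create a new, empty index with the same name:
curl -XPUT 'http://159.122.121.231:9200/audit-2017.01.01'

# Then restart Logstash (or let new lines arrive) so the data is
# re-read and indexed into the new index.
```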

Thanks again for not replying to my question.

See my answer at

Again, a misunderstanding.

I meant that because I did not get a reply, I ended up solving the problem myself.
If you look again, I asked one more question there. I have patience; that is why I posted what I did to resolve my issue.

If I have any other issues, I will post my questions here, no matter how much time it takes.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.