I am new to ELK. I have created a new index myself, but the data is still feeding into the old index. How do I switch it to the new index?
What are you using to send the data?
Filebeat to Logstash to Elasticsearch, with Kibana on top.
What is your Logstash pipeline configuration?
This is the output section:
output {
  if [fields][log_type] == "networkInfo" {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  } else if [fields][log_type] == "websocket" {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "mdp-huobi-wslog-delay-%{+YYYY.MM.dd}"
    }
  }
  if "_dateparsefailure" in [tags] {
    file {
      path => "/var/log/logstash/dateparsefailure.log"
      codec => json_lines
    }
  }
  file {
    path => "/var/log/logstash/output.log"
    codec => rubydebug
  }
  # stdout { codec => rubydebug }
}
But the data is still feeding into the old index. How do I switch it to the new index?
What is the new index you want to send data to?
The new index I created myself is filebeat-7.6.1-2020.04.26-000001. I want it to roll over day after day, like filebeat-7.6.1-2020.04.27, filebeat-7.6.1-2020.04.28, and filebeat-7.6.1-2020.04.29, but instead it rolls over like filebeat-7.6.1-2020.04.26-000001, filebeat-7.6.1-2020.04.26-000002, filebeat-7.6.1-2020.04.26-000003, and no data feeds into it.
Actually, the old index has the name pattern I want: it rolls over day by day like filebeat-7.6.1-2020.04.27, filebeat-7.6.1-2020.04.28, and filebeat-7.6.1-2020.04.29 successfully, but the index lifecycle policy doesn't work for it, so I created a new index and tried to apply the index lifecycle policy to it.
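If the goal is ILM-managed rollover, the Logstash elasticsearch output has built-in ILM options that write through a rollover alias instead of a hard-coded dated index name. A minimal sketch, assuming an alias and policy you would still need to create (the names `filebeat-7.6.1` and `filebeat-policy` here are assumptions, not taken from your setup):

```
elasticsearch {
  hosts              => ["localhost:9200"]
  manage_template    => false
  ilm_enabled        => true
  ilm_rollover_alias => "filebeat-7.6.1"   # write alias that ILM rolls over (assumed name)
  ilm_pattern        => "{now/d}-000001"   # backing indices get the date plus a counter
  ilm_policy         => "filebeat-policy"  # name of your ILM policy (assumed)
}
```

With this, Logstash indexes into the alias and ILM creates the next backing index on rollover; you stop setting a `%{+YYYY.MM.dd}` date in the `index` option yourself. Note the `-000001` counter suffix is expected with ILM rollover, since rollover increments that counter rather than renaming by day.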
So Filebeat sends data to Logstash, and then Logstash sends it to Elasticsearch?
index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
This is the index name your first output branch writes to, so the data is going to that daily index pattern rather than to the index you created by hand.
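One way to make new data land in the index you created is to bootstrap it with a write alias and point the Logstash `index` option at the alias. A sketch, assuming the alias name `filebeat-7.6.1` (not something already in your cluster):

```
PUT filebeat-7.6.1-2020.04.26-000001
{
  "aliases": {
    "filebeat-7.6.1": { "is_write_index": true }
  }
}
```

Then set `index => "filebeat-7.6.1"` in the elasticsearch output. Because the backing index name ends in `-000001`, the ILM rollover action can increment it to `-000002`, `-000003`, and so on, while readers and writers keep using the alias.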
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.