Hi everyone
Logstash keeps crashing with a 403 error. I know this error is caused by running out of disk space; at the time I had 30 GB free, and I have since attached a 100 GB disk, but it keeps crashing.
I tried removing Logstash and installing other versions, but the same error keeps appearing.
Can someone help this poor blind man?
I have:
4 CPU
16 VCPU
32 GB RAM
130 GB STORAGE
CentOS 7
OK, so you ran out of disk space and Elasticsearch put the indices into read-only mode (when the flood-stage disk watermark is exceeded, it sets the index.blocks.read_only_allow_delete block). Once you have added or freed up disk space you need to tell Elasticsearch to re-enable writes. See this post for how to do that.
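If you don't want to dig through the link, the fix boils down to a single settings update. A minimal sketch, assuming the index is named filebeat-6.6.0-2019.02.07 and Elasticsearch is listening on localhost:9200 (adjust both to match your setup):

# remove the read-only block that the flood-stage watermark added
curl -X PUT "localhost:9200/filebeat-6.6.0-2019.02.07/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'

Setting the value to null removes the block entirely; setting it to false also re-enables writes, but null additionally clears the setting from the index metadata.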
In this context, there is no metadata to do a substitution from. Check your index name; it will be something like filebeat-6.6.0-2019.02.07. Use that name in the curl command.
Oh, and if filebeat-%{[@metadata][version]}-2019.02.07 really is the literal name of your index, then you may need backslashes to escape some of the characters ({ } and/or [ ]), since curl otherwise treats them as URL globbing patterns.
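For instance, following the backslash approach (a sketch with the same assumed host; the index name here is the literal one from your error message):

# backslashes stop curl from interpreting { } and [ ] as globs
curl -X PUT "localhost:9200/filebeat-%\{\[@metadata\]\[version\]\}-2019.02.07/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'

Alternatively, curl's -g/--globoff flag turns off URL globbing so the braces and brackets are taken verbatim without escaping. If Elasticsearch complains about the literal %, URL-encode it as %25.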