Currently, I have a flatline alert that correctly triggers if no logs reach Kibana for a component. The component's logs are sent to Kibana from 4 IPs. The problem is that, since I am using a flatline alert, it does not show which IP the logs have stopped coming from: I have to go to Kibana and manually run the query 4 times to find out which IP has no logs. So I am looking for either 1) a way to make the flatline alert display the IP whose logs have stopped, or 2) a way to replace the flatline alert with a 'frequency' or 'any' type alert.
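For reference, this is the check I currently repeat by hand in the Kibana search bar, once per sender (10.0.0.1 here is just a placeholder for one of the 4 IPs):

@module_tag:rlcm AND ipaddr:"10.0.0.1"

Whichever IP returns no hits for the last 5 minutes is the one that has stopped logging.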
The alert I have in place is:
name: RLCMNoKibanaLogs
index: logstash-*
type: flatline
query_key: ["@module_tag", "ipaddr"]
threshold: 1
timeframe:
  minutes: 5
realert:
  minutes: 0
use_count_query: true
doc_type: fluentd
filter:
- query:
    query_string:
      query: '@module_tag:rlcm'
alert: my_alerts.AlertManager
labels:
  alertsrc: ElasticSearch
  kafka: 'true'
  slack: 'true'
  severity: info
annotations:
  description: No logs reaching kibana for RLCM component.
  summary: No logs available in Kibana from RLCM for the last 5 minutes.
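For completeness, the only fallback I can think of is splitting this into one flatline rule per sender IP, so that the firing rule itself identifies the IP. A rough sketch of one such copy (the rule name and the IP 10.0.0.1 are placeholders; everything else mirrors the rule above):

name: RLCMNoKibanaLogs-10.0.0.1   # hypothetical per-IP rule name
index: logstash-*
type: flatline
threshold: 1
timeframe:
  minutes: 5
realert:
  minutes: 0
use_count_query: true
doc_type: fluentd
filter:
- query:
    query_string:
      # 10.0.0.1 is a placeholder for one of the 4 sender IPs
      query: '@module_tag:rlcm AND ipaddr:"10.0.0.1"'
alert: my_alerts.AlertManager

I would rather not maintain 4 near-identical rule files, which is why I am asking whether option 1) or 2) above can avoid that.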