How to extract a field and match a keyword from a custom log to create an alert

Hi,
I was watching the video on how to create an alert from a system log file (load average etc.).
However, I have a custom log file, and the log below is stored in the message field. What I am looking for is:
when the field [ALARM:NONE] changes to [ALARM:POWER], generate an alert and attach the field
[IMEI:352621109471182] in the email.

Can anyone please tell me how I can do this?
I would really appreciate your help.

message: 14:35:48,811 DEBUG [com.yuma.jca.sockets.vt202] (default-threads - 47) [TIMESTAMP:Tue Apr 30 14:35:46 UTC 2019],[IMEI:352621109471182],[COMMAND:INFO],[GPS STATUS:true],[INFO:true],[SIGNAL:false],[ENGINE:0],[DOOR:0],[LON:90.35394999999998],[LAT:23.773595],[SPEED:1.1],[BATTERY:100],[GSM_SIGNAL:100],[GPS_SATS:7],[FUEL:0.0],[ALARM:NONE]
@timestamp: Apr 30, 2019 @ 14:35:54.897
cloud.instance.id: 135683766
cloud.region: lon1
cloud.provider: digitalocean
log.file.path: /root/wildfly-11.0.0.Final/standalone/log/VT202.log
log.offset: 746247
input.type: log
host.name: PRODWILDIFY
host.hostname: PRODWILDIFY
host.architecture: x86_64
host.os.platform: ubuntu
host.os.version: 18.04.2 LTS (Bionic Beaver)
host.os.family: debian
host.os.name: Ubuntu
host.os.kernel: 4.15.0-45-generic
host.os.codename: bionic
host.id: 982b7b8d1795445e9b8a9a794550123f
host.containerized: false
agent.version: 7.0.0
agent.type: filebeat
agent.ephemeral_id: ac3e1b07-c74c-475a-96af-c162df6601d5
agent.hostname: PRODWILDIFY
agent.id: e47d8400-1077-4fd1-a56b-68e69e1b129e
ecs.version: 1.0.0
_id: PTusbmoBMZCkMLhicKqO
_type: _doc
_index: filebeat-7.0.0-2019.04.24-000001
_score:
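
One way to make those bracketed values directly usable, assuming the documents go through an Elasticsearch ingest pipeline (the pipeline name vt202_alarm below is only an illustration, and Filebeat's elasticsearch output would need its pipeline option pointed at it): extract the IMEI and ALARM values into their own fields with a grok processor, so a watch can query and template them instead of matching against the raw message. A minimal sketch:

PUT _ingest/pipeline/vt202_alarm
{
  "description": "Pull IMEI and ALARM out of the bracketed VT202 log line",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "\\[IMEI:%{DATA:imei}\\].*\\[ALARM:%{DATA:alarm}\\]"
        ],
        "ignore_failure": true
      }
    }
  ]
}

With imei and alarm as separate fields, the watches discussed below could filter on the alarm value directly and put {{_source.imei}} into the email text.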

Hi there, could you please help me understand what you're trying to do by:

  • Linking to the video you watched
  • Mentioning which app in Kibana you're using -- I assume Watcher?
  • Mentioning which version of Kibana you're using
  • Formatting code and JSON data with triple backticks to preserve spacing

Hi, thanks.
Please find the code I have written below.
It's Kibana 7 and I am using Watcher.
The code below executes, but the problem is that it does not "tail" the log, i.e. it does not count only new entries. What I mean is: if it runs at, for example, 10:00 PM, it shows
"logged_text": "There are 60 documents in your index. Threshold is 1."
and when it runs again an hour later at 11:00 PM, it shows the same result.

I want to get an alert for NEW occurrences, but it keeps reporting the same occurrences. If there have been no new occurrences since the last execution, shouldn't it show 0?

Basically, I want an alert for every new occurrence. If the number stays the same (60, which are all past occurrences), there is no need to send an alert; if it becomes 61, send an alert.
How can I do this?

The code:

{
  "trigger": {
    "schedule": {
      "interval": "60m"
    }
  },
  "input": {
    "search": {
      "request": {
        "search_type": "query_then_fetch",
        "indices": [
          "*"
        ],
        "rest_total_hits_as_int": true,
        "body": {
          "size": 0,
          "query": {
            "match": {
              "message": "ALARM:ALARM_POWER_OFF"
            }
          }
        }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total": {
        "gte": 1
      }
    }
  },
  "actions": {
    "my-logging-action": {
      "logging": {
        "level": "info",
        "text": "There are {{ctx.payload.hits.total}} documents in your index. Threshold is 1."
      }
    }
  }
}

Execution Result (Simulation Result)

{
  "watch_id": "_inlined_",
  "node": "fWMM4t9NS2mHJoQRnN1bGQ",
  "state": "executed",
  "user": "elastic",
  "status": {
    "state": {
      "active": true,
      "timestamp": "2019-05-01T06:40:15.721Z"
    },
    "last_checked": "2019-05-01T06:40:15.721Z",
    "last_met_condition": "2019-05-01T06:40:15.721Z",
    "actions": {
      "my-logging-action": {
        "ack": {
          "timestamp": "2019-05-01T06:40:15.721Z",
          "state": "ackable"
        },
        "last_execution": {
          "timestamp": "2019-05-01T06:40:15.721Z",
          "successful": true
        },
        "last_successful_execution": {
          "timestamp": "2019-05-01T06:40:15.721Z",
          "successful": true
        }
      }
    },
    "execution_state": "executed",
    "version": -1
  },
  "trigger_event": {
    "type": "manual",
    "triggered_time": "2019-05-01T06:40:15.721Z",
    "manual": {
      "schedule": {
        "scheduled_time": "2019-05-01T06:40:15.721Z"
      }
    }
  },
  "input": {
    "search": {
      "request": {
        "search_type": "query_then_fetch",
        "indices": [
          "*"
        ],
        "rest_total_hits_as_int": true,
        "body": {
          "size": 0,
          "query": {
            "match": {
              "message": "ALARM:ALARM_POWER_OFF"
            }
          }
        }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total": {
        "gte": 1
      }
    }
  },
  "metadata": {
    "name": "POWER_CUT",
    "xpack": {
      "type": "json"
    }
  },
  "result": {
    "execution_time": "2019-05-01T06:40:15.721Z",
    "execution_duration": 10,
    "input": {
      "type": "search",
      "status": "success",
      "payload": {
        "_shards": {
          "total": 28,
          "failed": 0,
          "successful": 28,
          "skipped": 0
        },
        "hits": {
          "hits": [],
          "total": 60,
          "max_score": null
        },
        "took": 9,
        "timed_out": false
      },
      "search": {
        "request": {
          "search_type": "query_then_fetch",
          "indices": [
            "*"
          ],
          "rest_total_hits_as_int": true,
          "body": {
            "size": 0,
            "query": {
              "match": {
                "message": "ALARM:ALARM_POWER_OFF"
              }
            }
          }
        }
      }
    },
    "condition": {
      "type": "compare",
      "status": "success",
      "met": true,
      "compare": {
        "resolved_values": {
          "ctx.payload.hits.total": 60
        }
      }
    },
    "actions": [
      {
        "id": "my-logging-action",
        "type": "logging",
        "status": "simulated",
        "logging": {
          "logged_text": "There are 60 documents in your index. Threshold is 1."
        }
      }
    ]
  },
  "messages": []
}
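
The watch above counts every matching document in the index on each run, which is why the total never drops back. A minimal sketch of one way to alert only on new occurrences, assuming the documents carry the @timestamp field shown in the sample above, is to limit the search body to the same window as the trigger interval, so each run only sees documents indexed since (roughly) the previous run:

"body": {
  "size": 0,
  "query": {
    "bool": {
      "must": [
        { "match": { "message": "ALARM:ALARM_POWER_OFF" } }
      ],
      "filter": [
        { "range": { "@timestamp": { "gte": "now-60m" } } }
      ]
    }
  }
}

The compare condition can then stay at "gte": 1, since the count now only covers the last 60 minutes. The "now-60m" window is tied to the 60m trigger interval; if one changes, the other should change with it.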

Hi,
Please see my answer above. Is there any way you can help me?
Thanks

Hi there, thank you for explaining your question in more detail and for sharing your formatted code! Have you tried using the Acknowledge API to silence an executing watch until the watch's condition evaluates to false? You can also acknowledge a watch via the UI, by clicking the "Acknowledge" button on the watch's status page:

[screenshot: the Acknowledge button on the watch status page]

Does this help?
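
For reference, acknowledging through the API looks roughly like this (substitute the real watch ID; an action ID such as my-logging-action can optionally be appended to acknowledge just that action):

POST _watcher/watch/<watch_id>/_ack

Once acknowledged, the action is throttled until the watch's condition evaluates to false again, at which point the acknowledgement resets.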

Hi,
I have made some progress.
The main problem is "gte": 72. If I don't change the value to 73, it keeps sending; then as soon as another matching entry appears and the total becomes 73, it keeps sending again until I change it to 74, and so on.

I am trying to make it so that if there is a new log entry for the keyword "ALARM:ALARM_POWER_OFF" in the last 60 minutes, an alert is sent.

{
  "trigger": {
    "schedule": {
      "interval": "50s"
    }
  },
  "input": {
    "search": {
      "request": {
        "search_type": "query_then_fetch",
        "indices": [
          "*"
        ],
        "rest_total_hits_as_int": true,
        "body": {
          "size": 0,
          "query": {
            "match": {
              "message": "ALARM:ALARM_POWER_OFF"
            }
          }
        }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total": {
        "gte": 72
      }
    }
  },
  "actions": {
    "send_email": {
      "email": {
        "profile": "standard",
        "to": [
          "monitoring@yuma-technology.co.uk"
        ],
        "subject": "Watcher Notification",
        "body": {
          "text": "{{ctx.payload.hits.total}} POWER CUT ALARM"
        }
      }
    }
  }
}
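
To come back to the original goal of attaching the IMEI in the email: the search body would need "size" set above 0 (so the matching documents themselves are returned, not just the count) on top of the "now-60m" range filter sketched earlier, and the email action can then loop over the hits with Mustache. A rough sketch of the action, using the raw message field (or {{_source.imei}}, if an ingest pipeline like the one sketched at the top of this thread has been set up):

"actions": {
  "send_email": {
    "email": {
      "profile": "standard",
      "to": [
        "monitoring@yuma-technology.co.uk"
      ],
      "subject": "Watcher Notification: {{ctx.payload.hits.total}} new POWER CUT alarm(s)",
      "body": {
        "text": "New ALARM:ALARM_POWER_OFF entries in the last 60 minutes:\n{{#ctx.payload.hits.hits}}{{_source.message}}\n{{/ctx.payload.hits.hits}}"
      }
    }
  }
}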

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.