Trying to trigger alert when field exceeds limit

Hello,

I am trying to create an advanced watcher using JSON. Below is my watch definition.

{
  "trigger": {
    "schedule": {
      "interval": "1m"
    }
  },
  "input": {
    "search": {
      "request": {
        "body": {
          "size": 0,
          "query": {
            "match_all": {}
          }
        },
        "indices": [
          "logstash-*"
        ]
      }
    }
  },
    "aggs" : {
      "total_code_scan" : { "max" : { "field" : "sonar.code.length"} }
   },
  "condition": {
    "compare": {
      "ctx.payload.total_code_scan": {
        "gte": 24
      }
    }
  },
  "actions": {
    "my-logging-action": {
      "logging": {
        "text": "There are {{ctx.payload.total_code_scan}} documents in your index. Threshold is 24."
      }
    }
  }
}

So, basically, what I am trying to do is:

  1. The watcher searches for the field named sonar.code.length.
  2. If the max value of sonar.code.length is 24 or more, it triggers the logging action.

Currently I am stuck with this error: Watcher: [parse_exception] could not parse watch [inlined]. unexpected field [aggs].

May I know which part I am doing wrong? Thanks!

The aggregation must be inside the body, like this:

"trigger": {
  "schedule": {
    "interval": "1m"
  }
},
"input": {
  "search": {
    "request": {
      "body": {
        "size": 0,
        "query": {
          "match_all": {}
        },
        "aggs": {
          "total_code_scan": {
            "max": {
              "field": "sonar.code.length"
            }
          }
        }
      },
      "indices": [
        "logstash-*"
      ]
    }
  }
},
"condition": {
  "compare": {
    "ctx.payload.aggregations.total_code_scan.value": {
      "gte": 24
    }
  }
},
"actions": {
  "my-logging-action": {
    "logging": {
      "text": "The max sonar.code.length is {{ctx.payload.aggregations.total_code_scan.value}}. Threshold is 24."
    }
  }
}
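A quick way to see why the path in the compare condition matters: in the search response, aggregation results sit under the `aggregations` key, so the condition has to address `ctx.payload.aggregations.total_code_scan.value`, not the aggregation name alone. A minimal Python sketch, using a hypothetical payload shaped like the watch's search response:

```python
# Hypothetical payload mimicking the structure of the watch's search response.
payload = {
    "hits": {"total": 2267449, "hits": []},
    "aggregations": {"total_code_scan": {"value": 42.0}},
}

def condition_met(payload, threshold=24):
    """Mirror the compare condition: gte on the aggregation's value.

    Note the access path: aggregations -> total_code_scan -> value,
    which is what ctx.payload.aggregations.total_code_scan.value resolves to.
    """
    value = payload["aggregations"]["total_code_scan"]["value"]
    return value is not None and value >= threshold

print(condition_met(payload))  # → True
```

The `None` check matters because, as the simulation below shows, the aggregation can come back as `"value": null` when the field cannot be aggregated.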

Hi elastock,

Thank you very much for the example.

I am now using your example, and I got this simulation output.

"result": {
    "execution_time": "2018-06-21T11:02:18.582Z",
    "execution_duration": 6,
    "input": {
      "type": "search",
      "status": "success",
      "payload": {
        "_shards": {
          "total": 25,
          "failures": [
            {
              "node": "yYkc8cDWRgm6sXhgrEnnjQ",
              "reason": {
                "reason": "Fielddata is disabled on text fields by default. Set fielddata=true on [sonar.code.length] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead.",
                "type": "illegal_argument_exception"
              },
              "index": "logstash-2018.06.21",
              "shard": 0
            }
          ],
          "failed": 5,
          "successful": 20,
          "skipped": 0
        },
        "hits": {
          "hits": [],
          "total": 2267449,
          "max_score": 0
        },
        "took": 5,
        "timed_out": false,
        "aggregations": {
          "total_code_scan": {
            "value": null
          }
        }
      },
      "search": {
        "request": {
          "search_type": "query_then_fetch",
          "indices": [
            "logstash-*"
          ],
          "types": [],
          "body": {
            "size": 0,
            "query": {
              "match_all": {}
            },
            "aggs": {
              "total_code_scan": {
                "max": {
                  "field": "sonar.code.length"
                }
              }
            }
          }
        }
      }
    },
    "condition": {
      "type": "always",
      "status": "success",
      "met": true
    },
    "actions": [
      {
        "id": "my-logging-action",
        "type": "logging",
        "status": "simulated",
        "logging": {
          "logged_text": "There are {} documents in your index. Max is 24."
        }
      }
    ]
  },

I don't quite understand the reason: "Fielddata is disabled on text fields by default. Set fielddata=true on [sonar.code.length] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."

How can I fix this error?


Hi - you need to check the mapping for the field sonar.code.length.

It is probably mapped as text, and therefore a max aggregation on it isn't possible. Re-map that field as a number, such as long or double.
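For reference, the current mapping can be checked with GET logstash-*/_mapping/field/sonar.code.length. A numeric mapping in an index template would look roughly like the fragment below. This is a sketch, not the exact solution from the thread: the "doc" type name is an assumption based on Logstash 6 defaults, and since a field's type cannot be changed in place, it only takes effect for newly created indices (or after a reindex).

```json
{
  "mappings": {
    "doc": {
      "properties": {
        "sonar.code.length": {
          "type": "long"
        }
      }
    }
  }
}
```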

Thank you for replying, richcollier.

Following your advice, I added a convert filter, shown below.

I needed to add two filter blocks: one to add the fields and another to convert.

filter {
  json {
    source => "message"
  }

  mutate {
    add_field => { "metric" => "%{[component][measures][0][metric]}" }
    add_field => { "sonar.code.length" => "%{[component][measures][0][value]}" }
    add_field => { "sonar.project.name" => "%{[component][name]}" }
  }
}

filter {
  mutate {
    convert => { "sonar.code.length" => "integer" }
  }
}
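What the convert step accomplishes can be sketched in Python (field names taken from the thread; the event dict is a simplified stand-in for a Logstash event): the value copied out of the JSON document by add_field is a string, and convert turns it into a number so Elasticsearch can map it as a numeric type and run max aggregations on it.

```python
# Simplified stand-in for a Logstash event after the add_field mutate:
# the sprintf-copied value is a string at this point.
event = {"sonar.code.length": "1234"}

# Equivalent of: convert => { "sonar.code.length" => "integer" }
event["sonar.code.length"] = int(event["sonar.code.length"])

print(type(event["sonar.code.length"]).__name__)  # → int
print(event["sonar.code.length"] >= 24)           # → True
```

Without this conversion the field is indexed as text, which is exactly what produced the fielddata error in the simulation above.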

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.