There are no specific features built into the Site Search application itself for this, but it should be possible to block requests initiated by bots, or even by other search agents (like Googlebot), by adding some degree of input validation on the front end of your site.
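As one illustration of that kind of front-end validation, here is a minimal sketch that rejects search requests whose User-Agent looks like a crawler. The pattern list and the `handleSearch` handler are hypothetical examples, not part of the Site Search product:

```javascript
// Illustrative sketch: reject search requests from likely crawlers
// before they reach the search backend. The pattern list below is
// an example, not an exhaustive registry of bot user agents.
const BOT_PATTERN = /bot|crawler|spider|googlebot|bingbot/i;

function isLikelyBot(userAgent) {
  // Treat a missing User-Agent header as suspicious as well.
  if (!userAgent) return true;
  return BOT_PATTERN.test(userAgent);
}

// Hypothetical search handler gated on the check above.
function handleSearch(req) {
  if (isLikelyBot(req.headers['user-agent'])) {
    return { status: 403, body: 'Automated search requests are blocked.' };
  }
  return { status: 200, body: 'Searching for: ' + req.query };
}
```

In practice you would run a check like this in whatever middleware layer sits in front of your search endpoint, keeping in mind that User-Agent strings are trivially spoofed, so this only filters out honest or careless bots.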
You'll want to trace the session or point of origin of the searches, using either your host logs or a dedicated analytics platform like Google Analytics (GA). I've seen cases where this behavior comes from an errant crawler that reaches a site's search page and then attempts to submit data into any available input. For the non-bad actors, you can typically add a directive to the site's robots.txt file, or to individual pages via robots meta tags.
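For well-behaved crawlers, the robots.txt approach might look like the following. The `/search` path is a hypothetical example; use whatever path your search page actually lives at:

```
# robots.txt — ask compliant crawlers to stay off the search page
User-agent: *
Disallow: /search
```

The per-page equivalent is a robots meta tag in the page's `<head>`, e.g. `<meta name="robots" content="noindex, nofollow">`. Note that both mechanisms are advisory: compliant crawlers like Googlebot will honor them, but misbehaving bots will not, which is why the front-end validation mentioned above is still worth considering.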