Most of the time we end up pushing logs to Azure even when they are not required. Since our stack comprises a large number of microservices, it is hard for us to stop/resume sending logs into Logstash. We would like to know if there is any way to enable/disable outputs so that logs are sent only when enabled, and are otherwise discarded inside the pipeline/storage itself.
I can explain/share more detailed information if required.
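One possible sketch for toggling log shipping without redeploying each microservice: gate the pipeline on an environment variable (here a hypothetical `SEND_LOGS` variable, not something from your setup) copied into `[@metadata]`, and drop every event when it is not enabled. This assumes a standard Logstash pipeline with the bundled mutate and drop filter plugins:

```
filter {
  # Copy the (hypothetical) SEND_LOGS environment variable into a
  # metadata field, defaulting to "false" when it is unset.
  mutate {
    add_field => { "[@metadata][send_logs]" => "${SEND_LOGS:false}" }
  }

  # Discard the event entirely unless log shipping is enabled;
  # dropped events never reach any output.
  if [@metadata][send_logs] != "true" {
    drop { }
  }
}
```

With this in place, flipping `SEND_LOGS` and restarting (or pipeline-reloading) Logstash turns shipping on or off for the whole pipeline.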
Hi @Badger, thanks for responding. When using drop {} I am getting the error below. It states that the drop plugin is not recognized by the Logstash Helm chart I am using.
ERROR logstash.plugins.registry - Unable to load plugin. {:type=>"output", :name=>"drop"}
ERROR logstash.agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main2, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (PluginLoadingError) Couldn't find any output plugin named 'drop'. Are you sure this is correct? Trying to load the drop output plugin resulted in this error: Unable to load the requested plugin named drop of type output. The plugin is not installed."
Yes, I tried using conditional expressions, but I am not sure what to reference inside the expression to drop/delete/eliminate the logs. Can you please advise if there is a plugin/command/config option that can be used for discarding logs?
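The error above is because drop is a filter plugin, not an output plugin, so it has to go inside the filter block rather than the output block. A minimal sketch, assuming a hypothetical `[service]` field that identifies which microservice an event came from (adjust to whatever field your events actually carry):

```
filter {
  # drop {} must live in the filter block, not the output block.
  # [service] and the value "noisy-service" are placeholder examples;
  # events matching the conditional are discarded before any output runs.
  if [service] == "noisy-service" {
    drop { }
  }
}
```

Since drop ships with the default Logstash distribution as a filter, no extra plugin installation should be needed once it is moved into the filter block.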