Elasticsearch: Failure Sink always running

Logs are always being written to both Elasticsearch and the local log files (which are configured as a failure sink). This should not be the case: the local log files should be created only when the application fails to write logs to Elasticsearch.

Serilog.Debugging.SelfLog.Enable(TextWriter.Synchronized(file));
Log.Logger = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration)
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions
    {
        EmitEventFailure = EmitEventFailureHandling.WriteToSelfLog |
                           EmitEventFailureHandling.WriteToFailureSink |
                           EmitEventFailureHandling.RaiseCallback,
        FailureSink = new RollingFileSink("./failures.txt", new JsonFormatter(), null, 30)
    })
    .CreateLogger();

Error reported on SelfLog:
Caught exception while preforming bulk operation to Elasticsearch: Elasticsearch.Net.ElasticsearchClientException: No connection could be made because the target machine actively refused it. Call: Status code unknown from: POST /_bulk ---> System.Net.Http.HttpRequestException: No connection could be made because the target machine actively refused it ---> System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it

Welcome to our community! :smiley:

Are you asking how to do this for Serilog? If so, you might need to ask the developers of that library; I don't know whether Elastic has expertise in it. Otherwise, someone else here might, and may stop by to help.

How do we configure the FailureSink so that it writes to local files only when Elasticsearch fails to accept the events?
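For what it's worth, here is a minimal sketch of what a failure-sink-only configuration might look like, based on the snippet above. The RollingFileSink constructor arguments are taken from the original post; restricting EmitEventFailure to WriteToFailureSink alone is an assumption on my part, not a verified fix:

    // Sketch: route events to the failure sink only when the bulk
    // request to Elasticsearch fails (assumption: the other sinks
    // come solely from this ElasticsearchSinkOptions instance).
    Log.Logger = new LoggerConfiguration()
        .ReadFrom.Configuration(configuration)
        .WriteTo.Elasticsearch(new ElasticsearchSinkOptions
        {
            EmitEventFailure = EmitEventFailureHandling.WriteToFailureSink,
            FailureSink = new RollingFileSink("./failures.txt", new JsonFormatter(), null, 30)
        })
        .CreateLogger();

One thing worth checking: if the settings loaded by ReadFrom.Configuration(configuration) also declare a file sink under their WriteTo section, events will be written to local files on every log call, independently of the failure sink, which would produce exactly the symptom described.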

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.