Using Winlogbeat for custom logfiles?

Hi,
A newbie question...
I'm evaluating Winlogbeat as an option for log management of an application that has its own custom log file on a Windows server.
Does Winlogbeat only use the Windows Event Log as its input source?

Thanks.

Log management as in log rotation and housekeeping?

Yes, exactly!

These are not tools for log housekeeping, so I don't quite get why it has been suggested for evaluation, unless of course you are also using it to ingest data into an Elasticsearch datastore. For file housekeeping you just need a simple script run by Windows Task Scheduler (a rough sketch is below).
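Purely as an illustration, here is a minimal sketch of that kind of housekeeping script (written in Python here, but a PowerShell or batch equivalent would do the same job; the path and retention period are made-up values):

# housekeep_logs.py - minimal sketch; schedule it with Windows Task Scheduler
import glob
import os
import time

LOG_GLOB = r"C:\Logs\MyApp*.log"   # hypothetical log file location
MAX_AGE_DAYS = 30                  # hypothetical retention period

cutoff = time.time() - MAX_AGE_DAYS * 24 * 60 * 60

for path in glob.glob(LOG_GLOB):
    # Delete any matching log file whose last-modified time is older than the cutoff
    if os.path.getmtime(path) < cutoff:
        os.remove(path)
        print("Deleted", path)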

I realise I was a bit unclear.
Yes, I want to rotate the logs and avoid huge log files on the actual server, but I also want to ship them to our ELK environment to keep the log data for audits and troubleshooting when needed.

My question: is the Windows Event Log the only input option for Winlogbeat? Will I need another Beat to ship the files if we need data from the application log file?

Yeah, Winlogbeat is dedicated to the Windows Event Log. For your application logs you can use Filebeat https://www.elastic.co/products/beats to send your application log data to Elastic.
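A minimal filebeat.yml sketch to give you the idea (the path and the Elasticsearch host are placeholders, and the exact option names differ a little between Filebeat versions - older releases use filebeat.prospectors instead of filebeat.inputs):

filebeat.inputs:
  - type: log
    paths:
      - C:\Logs\MyApp*.log            # hypothetical application log location

output.elasticsearch:
  hosts: ["xxx.xxx.xxx.xxx:9200"]     # your Elasticsearch node

Filebeat just tails the files and ships each new line as an event; heavier parsing and enrichment is where Logstash comes in.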

As an alternative to Filebeat you can use the "file" input plugin (included by default) in Logstash https://www.elastic.co/products/logstash. Logstash is the heavy-duty option in that it has plenty of options to perform a lot of transformation and filtering (it can use grok) on the data before sending it on to Elasticsearch. So depending on your situation, Filebeat may suffice.

Here is an example of using the Logstash file input plugin to add fields to log entries from an application log file (an application called Crystal in this case), using a wildcard because the log filenames contain a date stamp. Also note that because I have used a wildcard, I have to use forward slashes in the Windows path:

# Input from Crystal logs

input {
  file {
    type => "Crystal"
    path => "C:/Logs/Crystal*.log"
  }
}

# Filter section to add extra fields to Crystal log events

filter {
  if [type] == "Crystal" {
    mutate {
      add_field => {
        "env"    => "qa"
        "region" => "UK"
      }
    }
  }
}

# Output to Elasticsearch

output {
  elasticsearch {
    hosts => "xxx.xxx.xxx.xxx"
    index => "logstash-crystal-%{+YYYY.MM.dd}"
  }
}
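If you do need to parse fields out of each log line, a grok filter can be added to the filter section as well. Purely as an illustration (the pattern assumes a hypothetical line layout like "2016-05-01 10:00:00 INFO Report generated"; you would adjust it to the real Crystal format):

filter {
  if [type] == "Crystal" {
    grok {
      # Hypothetical pattern - match it to the actual Crystal log line layout
      match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
    }
  }
}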

There is no way to use any of the Beats or Logstash to housekeep the log files that they process. That might be a useful feature request!
