MS SQL monitoring with Filebeat?


- Using ELK 7.9.2 in Debian 10
- MS SQL Server 2016 SP2 Standard in Windows Server 2012

I want to start monitoring my MS SQL Server with Elasticsearch and visualize the data in Kibana. After some research I found and tried Metricbeat and enabled its mssql module. The pre-loaded dashboards look great. The issue with this beat is that it only offers metrics data, but I am looking for more detailed monitoring: queries, users, logs, etc. I need to be able to see queries like INSERT, UPDATE, and DELETE rather than just their rate.

Then I tried Filebeat. Apparently this is the right shipper to read and ship data from any kind of log or file. I found a post where a user was looking for the same thing I am, but she was told that Filebeat does not have an mssql module. That thread was from more than a year ago. Perhaps things have changed, because I found an mssql module under Filebeat\modules.d. I enabled the module and ran the setup command to create the index in ES and upload the dashboards to Kibana. I found many dashboards, but none for MSSQL.

I would like to know if this is still under development and a full version is coming soon, or if I can somehow set up Filebeat to ship all the data I need from MSSQL, even if I have to build my own dashboards.

Thank you

Is it possible for Filebeat to access MS SQL Server (like Metricbeat does) to read all transaction events (INSERT, UPDATE, DELETE) and then ship all the data to Elasticsearch?

It should be possible to use Extended Events, write the events down to a JSON file, and then read that file with Filebeat. I also think it is possible for MSSQL to write the extended events to an "audit table" which you could then query, but I don't know whether it is wise to have the audit log stored on the database itself.
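A minimal sketch of such an Extended Events session (the session name, captured event, and target path below are placeholders, not from this thread; note the event_file target writes binary .xel, so an export step to JSON or plain text would still be needed before Filebeat can read it):

```sql
-- Hypothetical Extended Events session capturing completed SQL statements
-- to an event file; session name and filename are placeholders.
CREATE EVENT SESSION [capture_dml] ON SERVER
ADD EVENT sqlserver.sql_statement_completed (
    ACTION (sqlserver.username, sqlserver.database_name, sqlserver.sql_text)
)
ADD TARGET package0.event_file (
    SET filename = N'C:\XELogs\capture_dml.xel'
)
WITH (STARTUP_STATE = ON);

ALTER EVENT SESSION [capture_dml] ON SERVER STATE = START;
```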

I don't know whether the Filebeat index template includes audit fields, but that could be solved by enabling the options below in the input config:

json.keys_under_root: true
json.add_error_key: true

The timestamp field may also need some tweaking, e.g. renaming "timestamp" to "@timestamp" with a rename processor, or using the timestamp processor.
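Putting those pieces together, a sketch of what the input plus processors could look like (the export path is hypothetical, and the layout assumes the exported events carry an RFC 3339 "timestamp" field):

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - 'C:\XELogs\*.json'        # hypothetical path of the exported events
    json.keys_under_root: true
    json.add_error_key: true

processors:
  - timestamp:
      field: timestamp            # assumes the events carry this field
      layouts:
        - '2006-01-02T15:04:05Z07:00'
  - drop_fields:
      fields: [timestamp]         # keep only @timestamp
```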

There is a Filebeat MSSQL module to parse your logs (they must contain the queries, of course).

Then you can check your queries in the Logs view in Kibana.

You'll have to build your own dashboards according to the data you want to query, which usually isn't generalized enough for all users to get a built-in dashboard. But it's pretty straightforward.

@hocho's suggestion is a very good trick for many situations, but it's not strictly necessary for MSSQL because there's a built-in module.

If I enable the mssql module and leave the default values, I don't get any data in Kibana Discover other than ERROR logs.

If I set a value for var.paths: - C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\system_health*, then I get this error:

2020-10-19T15:24:27.614-0400 ERROR instance/beat.go:951 Exiting: Failed to start crawler: creating module reloader failed: Error getting config for fileset mssql/log: Error interpreting the template of the input: template: text:3:24: executing "text" at <.paths>: range can't iterate over C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\system_health*
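For reference, that "range can't iterate over" error typically means the module's Go template received a single string where it expects a list: writing `var.paths: - path` on one line makes YAML parse the whole value as one string. Putting the dash on its own indented line should let the template iterate; a sketch using the instance path from this thread:

```yaml
# modules.d/mssql.yml — note the list item on its own line under var.paths
- module: mssql
  log:
    enabled: true
    var.paths:
      - 'C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\system_health*'
```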

I am looking for the transaction logs, at least the basics: INSERT, UPDATE, DELETE, DROP

Can you paste a few example log lines you have on those files, please? Maybe you aren't using the default format

I just installed a new MS SQL Server 2016 Standard, then imported a sample DB, just for testing the beats export. Those log files are generated by MS SQL, but the file extension is .xel, e.g. system_health_0_132472753407060000.xel. This is not a plain-text log, so I can't see what is inside. I am only trying to open these because I read somewhere that the transaction logs could be found there, but I may be wrong.

All I want is to export logs that contain the transactions, at least the basic ones: INSERT, UPDATE, DELETE, DROP. If I leave Filebeat with just the default settings and start the service, it will attempt to read only the ERRORLOG files, but those do not contain the transaction logs, just errors; besides, that's not even the right path.

2020-10-19T15:46:16.703-0400	INFO	[crawler]	beater/crawler.go:71	Loading Inputs: 1
2020-10-19T15:46:16.707-0400	INFO	log/input.go:157	Configured paths: [c:\ProgramFiles\Microsoft SQL Server\MSSQL.1MSSQL\LOG\ERRORLOG*]
2020-10-19T15:46:16.707-0400	INFO	[crawler]	beater/crawler.go:108	Loading and starting Inputs completed. Enabled inputs: 0
2020-10-19T15:46:16.707-0400	INFO	cfgfile/reload.go:164	Config reloader started
2020-10-19T15:46:16.710-0400	INFO	log/input.go:157	Configured paths: [c:\ProgramFiles\Microsoft SQL Server\MSSQL.1MSSQL\LOG\ERRORLOG*]

The path c:\ProgramFiles\Microsoft SQL Server\MSSQL.1MSSQL\LOG\ERRORLOG does not exist on my system. Instead it should be C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\ERRORLOG. Because of that, no data is arriving in Elasticsearch. So I set var.paths: - C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\ERRORLOG* in mssql.yml, but then the Filebeat service fails to start with this error:

2020-10-19T16:33:30.585-0400 ERROR instance/beat.go:951 Exiting: Failed to start crawler: creating module reloader failed: Error getting config for fileset mssql/log: Error interpreting the template of the input: template: text:3:24: executing "text" at <.paths>: range can't iterate over C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\ERRORLOG*

And I haven't done anything other than provide the right path so the default config can work.

This is why I disabled the mssql module and enabled filebeat.inputs in the filebeat.yml config file. The same path that previously failed in the mssql module, C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\ERRORLOG*, works without errors when set in filebeat.yml.
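For reference, a plain input along the lines described would look something like this in filebeat.yml (a sketch based on the path above; this bypasses the module's ingest pipeline, so the lines arrive unparsed):

```yaml
# filebeat.yml — read the same files directly, without the mssql module
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - 'C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\ERRORLOG*'
```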

If you could tell me how I can achieve my goal with basic or advanced settings, please show me how.

Thank you in advance

Filebeat doesn't support reading *.xel files. You need to configure SQL Server to output the transaction logs in plain text.
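One possible route, sketched under the assumption that a scheduled job handles the export: .xel files can be read from T-SQL with sys.fn_xe_file_target_read_file, and the resulting XML can then be shredded and written out as plain text or JSON for Filebeat to tail. For example, against the system_health files from this thread:

```sql
-- Read Extended Events .xel files from T-SQL; a scheduled job could shred
-- the XML and append it to a plain-text/JSON file that Filebeat reads.
SELECT CAST(event_data AS XML) AS event_xml
FROM sys.fn_xe_file_target_read_file(
    N'C:\Program Files\Microsoft SQL Server\MSSQL13.TSSMSSQL\MSSQL\Log\system_health*.xel',
    NULL, NULL, NULL);
```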

Hi @Mario_Castro,

Thank you for your help on this. Let's suppose for a moment that I just installed MS SQL Server 2016 Standard, plus Filebeat, just to be able to extract the transaction logs (INSERT, UPDATE, DELETE, DROP) and send them to ELK to build custom dashboards that can show that data. It's a clean installation.

  • What would be your advice/steps to configure Filebeat? Please note that just enabling the mssql module won't work because:
    • It will read from the wrong path by default
    • It will read just the ERRORLOG files by default
    • If I set var.paths: to the right path in the mssql.yml file, the Filebeat process won't start, due to the error above. But the same process starts without issues if I set the same path in the filebeat.yml file and disable the mssql module.

Of course, I will look into exporting SQL logs in plain text, if possible

Most of the solutions I have found to export MSSQL logs are designed for the SQL Server error logs and the SQL Agent error logs. But those are already saved in plain text, and it's easy to point Filebeat at them. Besides, none of these logs includes the transactions.

Other solutions to export other logs or data involve adding and configuring an additional DB, plus the use of scripts. I confess that I am not an expert, so it may be difficult for me to implement.

Can you provide an easier solution that only involves configuring Filebeat, or can you help me get SQL Server to export the transaction logs in plain text so Filebeat can read them?

Thank you in advance

@Mario_Castro so far I have been unable to get SQL Server to write the transaction logs in plain text so Filebeat can read them and get the data I need. If you can, please help me with this configuration, or with configuring Filebeat to (somehow) get this data from SQL Server.

I am also open to trying other beats as well; my goal is the same. I am giving the sql module from Metricbeat a try, but got stuck with the config. Here is the thread. If this is something you can help me with, I will appreciate your support.

Thank you

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.