I'm aware of X-Pack and the audit logs as configured in elasticsearch.yml ( https://www.elastic.co/guide/en/x-pack/5.6/auditing.html ). But are there any other logs that track user activity within Kibana (e.g. which dashboards users are visiting)?
We opened a support case with Elastic, and the official answer is that they don't have a way to do this and it is not on their roadmap. They also cautioned that this code has the potential to change over time. You can get an idea of which code to modify by turning on debugging in your browser.
So we modified a snippet of code to do this ourselves, triggered when users log on and when they click a dashboard. The snippet writes to a file; Filebeat scrapes that file and ships it to Logstash for transformation, which then sends it on to Elasticsearch. We then built graphs in Kibana showing who is using the system, logins over time, and which dashboards are used. We also had Logstash enrich the data with a lookup to find each user's senior manager, and created charts around which management organizations use our dashboards and how much. Finally, a weekly job reads all dashboards, compares them against the usage index, and writes entries to another index showing which dashboards are and are not used. For instance, you can answer the question: which dashboards have never been used?
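At its core, the weekly job is a set difference between all known dashboard IDs and the IDs that appear in the usage index. Here is a minimal sketch of that comparison step only; the function name and the example IDs are placeholders, and in practice both lists would come from Elasticsearch queries (the .kibana index and the usage index), which are omitted here:

```javascript
// Sketch of the weekly "unused dashboards" comparison.
// allDashboards and usedDashboards are plain arrays of dashboard IDs;
// fetching them from Elasticsearch is left out of this sketch.
function findUnusedDashboards(allDashboards, usedDashboards) {
  const used = new Set(usedDashboards);
  // Keep every dashboard that never appears in the usage index
  return allDashboards.filter(function (id) {
    return !used.has(id);
  });
}

// Example: two dashboards have never been opened
const unused = findUnusedDashboards(
  ['sales', 'ops', 'security', 'legacy'],
  ['sales', 'ops']
);
console.log(unused); // [ 'security', 'legacy' ]
```

The result of a run like this is what gets written into the second index, so the "never used" question becomes a simple Kibana visualization.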
You can turn on Kibana and Elasticsearch logging... but it will generate a ton of data.
Also, if Kibana sits behind an F5 load balancer, the log will show the load balancer's IP address, not the user ID. Those two reasons are why we wrote a custom snippet. We did this for Kibana 5.6.1 and Kibana 6.2.4.
To log user IDs, modify: plugins\x-pack\plugins\security\server\routes\api\v1\authenticate.js
Replace the async handler section with the code below.
This was written for 6.2.4. Use at your own risk.
This will get you the user IDs even when a load balancer such as an F5 is in front of Kibana, since Kibana's own logging will only show the load balancer's IP address. Capturing dashboard views is a separate snippet in a different source file.
async handler(request, reply) {
  const { username, password } = request.payload;

  // Pull the username from the request for usage logging
  const fs = require('fs');
  const userJson = {
    username: username
  };
  const jsonString = JSON.stringify(userJson) + '\n';

  // Append one JSON line to a log file that Filebeat monitors
  fs.appendFile('add_your_log_directory_here/logs/userlogs.log', jsonString, function (err) {
    if (err) {
      return console.log(err);
    }
    console.log('The file was saved!');
  });

  // ... the rest of the original authenticate handler continues unchanged ...
The code you modified: did you add it to the source code, or is this a file that already exists in the source that you simply edited to add the logging? I'd really be interested to learn more about how you accomplished this. We also noticed what you pointed out about the load balancer's IP address appearing in the log instead of the user ID.