Automatically preprocessing data for a plugin

Working on a plugin here. I've got some records that come into Elasticsearch, and I'd like to preprocess and update them before the plugin gets ahold of them.

It's easy enough to add a field to track whether they have been processed, but I need somewhere to park a bit of code that periodically checks for new records and preprocesses any it finds. Preferably without having to add any new machines or requiring anyone to be connected to Kibana or my plugin.

Is this something that can be done on the server side of a Kibana plugin, or do I need to do it elsewhere? The records can be placed directly into the index without going through Logstash, so putting the code there won't catch all the updates.

Well, processing the records at ingest time really is the right way to do this. If you're already using Logstash to index data, you should consider preventing data from making it into the index without going through Logstash in the first place.

That said, there's nothing to stop you from doing this in your plugin's server-side code. Since it runs on the server, it wouldn't require users to interact with the instance, and you could use a node.js-based cron helper, or even just setTimeout, to run the code periodically.
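A minimal sketch of the setTimeout approach — a self-rescheduling loop, so a slow batch never overlaps the next run. The `processBatch` callback and the interval are placeholders for your own preprocessing logic:

```javascript
// Self-rescheduling poll loop: the timer is re-armed only after the
// previous run finishes, so runs never overlap even if processing
// takes longer than the interval.
const POLL_INTERVAL_MS = 60 * 1000;

function startPolling(processBatch, intervalMs = POLL_INTERVAL_MS) {
  let stopped = false;

  async function tick() {
    if (stopped) return;
    try {
      await processBatch(); // your preprocessing work goes here
    } catch (err) {
      // Log and keep going; the next tick will retry.
      console.error('preprocessing failed, will retry', err);
    }
    if (!stopped) setTimeout(tick, intervalMs);
  }

  setTimeout(tick, intervalMs);
  return () => { stopped = true; }; // call this to stop the loop
}
```

Using a recursive setTimeout rather than setInterval is a deliberate choice here: setInterval would fire on schedule regardless of whether the previous batch had finished.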

The key would be the init function in the plugin definition. As an example, the Kibana plugin template uses this hook to configure routes, but you can do anything you want in this function, such as kicking off a polling task that queries Elasticsearch for documents that don't have the right value in the tracking field and runs your preprocessing on them.
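A sketch of what that init hook could look like, written against the legacy Kibana server plugin API (`getCluster` / `callWithInternalUser`). The index name (`my-records`), the tracking field (`preprocessed`), and the `preprocess` helper are all assumptions for illustration, not real names from your setup:

```javascript
// Hypothetical init-hook body for a legacy-style Kibana plugin.
// Periodically finds documents not yet marked as preprocessed,
// transforms them, and flags them so they aren't picked up again.
function initPreprocessor(server, intervalMs = 60 * 1000) {
  const { callWithInternalUser } =
    server.plugins.elasticsearch.getCluster('data');

  // Stand-in for your real preprocessing logic.
  function preprocess(source) {
    return source;
  }

  async function preprocessNewRecords() {
    // Fetch a batch of records that don't have the tracking flag set.
    const resp = await callWithInternalUser('search', {
      index: 'my-records',
      body: {
        size: 100,
        query: { bool: { must_not: { term: { preprocessed: true } } } }
      }
    });

    for (const hit of resp.hits.hits) {
      // Apply the preprocessing, then mark the record as done.
      await callWithInternalUser('update', {
        index: hit._index,
        id: hit._id,
        body: { doc: { ...preprocess(hit._source), preprocessed: true } }
      });
    }
    return resp.hits.hits.length; // number of records handled this pass
  }

  const timer = setInterval(() => {
    preprocessNewRecords().catch(err =>
      server.log(['error', 'preprocessor'], err.message));
  }, intervalMs);

  return { preprocessNewRecords, stop: () => clearInterval(timer) };
}
```

Because it uses the internal user rather than a request-scoped client, this runs without anyone being logged in to Kibana, which matches the requirement in the question.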
