Elapsed Filter

I've got 9 Apache servers feeding data into Elasticsearch via Logstash; the front end is Kibana 4.

One of the items they want me to visualize is Average Visitor Stay Length. It was suggested in another thread that I could use the elapsed filter to calculate the value and store it in Elasticsearch.
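
From what I can tell from the docs, the config would look something like the sketch below. The tags and field names are just placeholders; I'd still need something upstream to tag the start and end of a visit and pick a field (client IP, a session cookie, etc.) to correlate the pair on.

```
filter {
  elapsed {
    start_tag       => "visit_start"   # placeholder: tag on the event that opens a visit
    end_tag         => "visit_end"     # placeholder: tag on the event that closes a visit
    unique_id_field => "clientip"      # placeholder: field shared by the start/end pair
    timeout         => 1800            # give up pairing after 30 minutes
  }
}
```

As I understand it, the filter then adds an elapsed_time field (in seconds) to the matching end event, and that's the number I'd average in Kibana.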

Assuming I can make it work, I can apply it to all data being fed into the system going forward. Is there a way to apply a Logstash filter to the data I've already stored in Elasticsearch? I've got 18 months of log file data in the system now, and I'd like to mangle it in place if possible.

(I can see a crude way to apply it. Given that I have the log file data on disk and the indexes are date-based: use a script to delete the index for each day, then use Logstash to put it back in with the new filter added to the rest. But .. man, that seems tedious. Say .. is there an export function ...)

You will need to reindex the data to get this.

Looks like I'm about to learn something.

Reading this, it appears to be a fairly straightforward process. And that brings up another question:

I've been getting by so far using bash to import data and do some basic operations on the cluster and nodes via the cat API. Would it be better to spend a few hours learning to use one of the clients instead?

Your call :stuck_out_tongue:
I just use Logstash!
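
Something like this per daily index. The hosts and index names are placeholders, and the exact option names vary a bit between Logstash versions, so check the docs for the one you're on:

```
input {
  # Read the existing documents back out of Elasticsearch.
  elasticsearch {
    hosts => ["localhost:9200"]        # placeholder
    index => "logstash-2015.01.01"     # one daily index at a time
  }
}

filter {
  # Apply the new elapsed filter (plus whatever you already run) to the old events here.
}

output {
  # Write the enriched copies to a new index, then swap or delete the old one.
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-2015.01.01-v2"  # placeholder name for the reindexed copy
  }
}
```

One thing to watch: the elapsed filter pairs events as they stream through the pipeline, so the result depends on the order the documents come back in.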