Active tagging of in-progress bugs


(Kofi) #1

I'm trying to figure out a way to hide all logs displayed in Kibana that have already been seen and are being worked on. To put it another way: I don't want logs I'm already dealing with to show up in Kibana, so I can focus on the current issues.

For example, if there is a certain bug that creates an error log every 5 mins but it is being worked on, I don't want to see it appear on my visualizations so that I can focus on the other errors that are being produced.

I was thinking of tagging the logs I want to hide as "TrackingBug" through the ES API and then filtering my visualizations to exclude logs tagged "TrackingBug". I think this method works pretty well for hiding past issues, but it isn't very good for hiding future errors of the same type. This is the code I made to tag the documents:

POST /logstash-*/_update_by_query
{
    "script": {
        "inline": "if (ctx._source.tags == null) { ctx._source.tags = new ArrayList() } if (!ctx._source.tags.contains(params.tag)) { ctx._source.tags.add(params.tag) }",
        "params": {
            "tag": "TrackingBug"
        }
    },
    "query": {
        "match_phrase": {
            "message": "Error 12312. Message alert"
        }
    }
}
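
To hide them on the Kibana side, the visualizations can then be filtered with a query that excludes the tag, for example (assuming the default logstash mapping, where tags is a keyword-type field):

```
GET /logstash-*/_search
{
    "query": {
        "bool": {
            "must_not": {
                "term": { "tags": "TrackingBug" }
            }
        }
    }
}
```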

Can this approach be used to somehow insert this POST command into elasticsearch so that all future logs with this message will also be tagged? I need to do this actively as new errors come in, so hardcoding something into logstash isn't ideal.

Or is there another completely different perspective on solving this?

Thanks for your help!


(Kofi) #2

Can anyone help?


(Christoph) #3

Hi,

this is a somewhat complex ask, so let me break it down a bit:

  • adding this kind of tag based on a query, like you mentioned, works well on documents (logs) that are already indexed

  • for each new document, we would have to execute this query somewhere. This is traditionally something the percolate feature is good at. At the moment, however, you would need some external process (e.g. logstash or your own application) to drive this for each new incoming document, add the tag depending on the results, and then index it

  • conceptually this could probably be something that's suited for the new ingest node in Elasticsearch 5.0, but currently this kind of operation is not supported. It's also not very likely to be supported in the near future. See https://github.com/elastic/elasticsearch/pull/20340 for a somewhat related feature that was discussed but ultimately dismissed for its complexity

So, summing this up: you could register the queries you want to run on new documents using the percolator, but you would need some external process to drive the tagging. I don't know if any such plugins exist for logstash. Running a bunch of queries on each new document also has the potential to slow down your indexing quite a bit.
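
To make the percolator part more concrete, registering such a query in Elasticsearch 5.0 could look roughly like this (untested sketch; the index name, type name, and document id are made up):

```
PUT /tracked-bugs
{
    "mappings": {
        "rule": {
            "properties": {
                "query":   { "type": "percolator" },
                "message": { "type": "text" }
            }
        }
    }
}

PUT /tracked-bugs/rule/bug-12312
{
    "query": {
        "match_phrase": { "message": "Error 12312. Message alert" }
    }
}
```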

Hope this helps a bit to understand the situation.


(Kofi) #4

Ok, thanks for the thorough response. I can't tell you how much I appreciate it because I've been digging and digging to find a solution. I'll take a look at percolate.

Do you think scripting could help this situation at all? For example, somehow making a variable that communicates with logstash or the ES scripts folder through the ES API, or do you think digging in that direction will lead nowhere?


(Christoph) #5

Hi,

Scripting helps you update existing documents, as you already described, but the main problem I see remains: you need some way of "classifying" incoming documents by a dynamically changing set of criteria (you expressed these as queries).

Currently I don't see an easy way of doing this inside ES, but I might be wrong, so I'm happy if anybody else chimes in with other ideas.

When you have a chance to intercept incoming log lines (either in logstash using a plugin or in your own code), you can try to do this matching using the percolator feature.
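
Checking a single incoming log line against the registered queries would then be a percolate search (untested sketch; assumes a "tracked-bugs" index with a percolator-mapped "query" field and a "rule" type):

```
GET /tracked-bugs/_search
{
    "query": {
        "percolate": {
            "field": "query",
            "document_type": "rule",
            "document": {
                "message": "Error 12312. Message alert"
            }
        }
    }
}
```

Any hit means the log line matches a tracked bug, and the external process would add the "TrackingBug" tag before indexing it.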


(Christoph) #6

Good news: I did some googling myself, and there seem to be logstash plugins that can use the percolate API:

If you're using Elasticsearch 5.0, this might not even be needed, since the elasticsearch plugin seems to be able to run queries from a logstash pipeline.

I haven't tried any of this myself, but I thought it was worth pointing out to you.
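
For illustration, wiring this into a pipeline with the elasticsearch filter plugin might look something like the following. This is a hypothetical sketch: the option names are assumptions based on the plugin's documentation, and I haven't verified that the plugin can run a percolate query:

```
filter {
  # Hypothetical: check this event against a "tracked-bugs" percolator
  # index; if the lookup succeeds, tag the event so Kibana
  # visualizations can filter it out.
  elasticsearch {
    hosts          => ["localhost:9200"]
    index          => "tracked-bugs"
    # query_template would point at a JSON file containing a percolate
    # query built from the event's message field
    query_template => "/etc/logstash/percolate-query.json"
    add_tag        => ["TrackingBug"]
  }
}
```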


(Kofi) #7

Yeah, I'm using 5.0, so the ES plugin is what I've been taking a look at. Thanks a lot for the help; for a beginner like me, it's sometimes hard to even know how to search for things using the proper language, so I really appreciate it!


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.