Hello all, we're ingesting Windows event logs via Logstash and to get specific field data from an application log I've got the following filter in Logstash:
```
if [log_name] == "CISAccess" {
  grok {
    match => { "message" => "Usr=>%{DATA:User}#.*\sStn=>%{DATA:Workstation}#.*" }
  }
}
```
Can somebody tell me how to apply this to existing documents? Would I need a Painless script, or should I change the Logstash output to action => update? And if I do the latter, how do I keep ingesting new documents at the same time?
Thanks
@kernelpanic I think you have two options:
- Build an ingest pipeline with the same grok logic inside a grok processor, then run an _update_by_query that passes documents through that pipeline, like:
```
PUT _ingest/pipeline/my-grok-pipeline
{
  "description": "pulls more stuff from the message",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["Usr=>%{DATA:User}#.*\\sStn=>%{DATA:Workstation}#.*"]
      }
    }
  ]
}
```
Note the `\\s` — the backslash has to be escaped inside a JSON string, unlike in the Logstash config.
```
POST my-index/_update_by_query?pipeline=my-grok-pipeline
{
  "query": { "term": { "log_name": "CISAccess" } }
}
```
- Do the _update_by_query with a Painless script instead, which requires you to translate that grok logic into Java-style regexes (Painless uses java.util.regex under the hood).
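For completeness, the Painless route might look roughly like this. Two caveats: regex literals in Painless require `script.painless.regex.enabled: true` in elasticsearch.yml, and the pattern below is my hand translation of your grok expression (capture groups in place of `%{DATA:...}`), so test it on a copy first:

```
POST my-index/_update_by_query
{
  "query": { "term": { "log_name": "CISAccess" } },
  "script": {
    "lang": "painless",
    "source": "def m = /Usr=>(.*?)#.*\\sStn=>(.*?)#/.matcher(ctx._source.message); if (m.find()) { ctx._source.User = m.group(1); ctx._source.Workstation = m.group(2); }"
  }
}
```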
Excellent, thank you for getting back to me, Mike.