Removing extra characters in Grok


I am converting an original Windows event log from JSON to syslog on the Logstash server.

Here is a partial output:

<13>May 31 14:27:55 {"name":'TEST'} LOGSTASH[-]: 2023-05-31T14:27:55.283Z {name=TEST} Permissions on an object were changed.\n\nSubject:\n Security ID: S-0-0-0\n Account Name:

However, I am having a hard time parsing these logs now. The main problem is the "name" being repeated twice, the LOGSTASH in the middle of the first line, the second date suddenly switched formats, and then the new line (\n) entries. How do I remove all of these things while maintaining the overall output? Do I put this filter on the input side or the output side? On the input, I do not see any of these things, so there must be some filtering that has to be done in the output? Is that possible with the syslog plugin being used?

Thank you for your help. 🙂

Can you specify fields in this message?

I assume:

eventid: 13
date: May 31 14:27:55
name: TEST
source: LOGSTASH
time-created: 2023-05-31T14:27:55.283Z
host: TEST
msg: Permissions on an object were changed.
subject: \n
securityid: S-0-0-0 

Can you provide more context on what you are doing? Please share a sample of your json file and the Logstash configuration you are using.

My logs come from my Windows machines via Beats and then get parsed in my Logstash. However, I need to send my logs to another destination that can't read JSON, only syslog. That syslog-only destination is having trouble reading my syslog output (seen above).

Logstash config:
port => xxxx

filter {
  if [tag] == "win" {
    grok {
      match => { "message" => "%{GREEDYDATA:Out}" }
    }
  }
}

output {
  syslog {
    host => "host"
    port => "xxxx"
    protocol => "tcp"
  }
}
I unfortunately can't have any fields in my output. It has to be free of headers.
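If the downstream system can't accept any syslog header, one possible workaround (a sketch, not a confirmed fix, assuming the receiver just wants raw lines over TCP and reusing the `Out` field from the grok above) is to skip the syslog output plugin entirely and send plain lines with a tcp output:

```
output {
  tcp {
    host => "host"                          # placeholder, as in your config
    port => xxxx                            # placeholder port
    codec => line { format => "%{Out}" }    # emit only the field content, no syslog header
  }
}
```

The line codec's format option controls exactly what gets written, so nothing is prepended to each event.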

But which Beats? It is not clear.

Are you using Winlogbeat to extract the events from Windows and send them to Logstash, or are you using something else to extract the Windows events, save them to JSON, and send that JSON to Logstash?

Also, how are you parsing your messages? You have just a grok filter. If you are using Beats to send to Logstash, it will send a JSON message, so you would need a json filter at the beginning of your pipeline to parse it.
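As a minimal sketch (assuming the Beats payload arrives as a JSON string in the `message` field), the json filter would come first in the filter block:

```
filter {
  json {
    source => "message"   # parse the incoming JSON string into event fields
  }
}
```

Once the JSON is parsed, later filters can address individual fields instead of one opaque string.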

Then you could manipulate the event to leave only what you need.
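For example (field names here are hypothetical, since the actual event wasn't shared), a mutate filter can drop unwanted fields and strip the literal \n sequences from the message:

```
filter {
  mutate {
    remove_field => ["name", "host"]   # hypothetical fields to drop
    gsub => ["message", "\\n", " "]    # replace each literal \n with a space
  }
}
```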

You didn't share a sample of your message or the desired output; you need to share those.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.