I am sending syslog data to Logstash in the form of attribute=value pairs separated by spaces. Sometimes some fields might be empty. For example, a syslog message might contain:
JOBNAME=ABCD SOURCE= CLASS=CLASS1 ENTITY=ENTITY2
I am just using a kv filter and forwarding to Elasticsearch. The SOURCE value and others
sometimes have no data in them, so after forwarding to Elasticsearch the KV pairs get mixed up.
I don't want to lose the CLASS KV pair when it gets attached to the SOURCE attribute by mistake.
I have been looking on here and all I have found are examples where the entry is already empty after the colon, not how to get the entry placed correctly in Elasticsearch.
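For reference, the filter really is just a kv block, something like this (simplified and from memory, so the exact option values and the output section may not match what I actually have):

filter {
  kv {
    source => "message"
    field_split => " "
    value_split => "="
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

With the sample line above, SOURCE= has nothing after the equals sign, and that is where the pairs start getting mixed up.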
Please share the logstash configuration you are using.
What is generating those messages? Do you have any control over it to change the format and use double quotes in the values? Unquoted values can mix up the KV filter and you may need to use other filters to parse the message.
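For example, if the producer could emit quoted values, the same event might look like the line below (just an illustration of the suggested format), and an empty SOURCE would no longer be ambiguous for the kv filter:

JOBNAME="ABCD" SOURCE="" CLASS="CLASS1" ENTITY="ENTITY2"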
I have a bunch of messages that are generated by underlying code, so I can't easily change it.
I was trying to get Logstash to handle the case where a field is empty.
It seems everything comes in and gets stored in a field called message.
My filter is simple right now.
I was trying to replace "= " with "=EMPTY ", but that changed the value in the output of the message field. Maybe I need to do the replacement on input, if that is possible?
I am not sure.
I can't see anything in the docs that allows me to replace all occurrences of a string with another. It seems that grok matches work on a field, but the input string is causing my field to not be found.
Is there a way to replace the "= " on input with a default value?
I can't seem to figure this out.
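One idea I have been considering (not sure it is the right approach) is to copy message into a scratch field, do the replacement there with mutate gsub, and point the kv filter at the copy so the original message stays untouched. The field name kv_source and the EMPTY placeholder are just names I made up:

filter {
  mutate {
    # work on a copy so the original message field is not modified
    copy => { "message" => "kv_source" }
  }
  mutate {
    # turn "ATTR= " (an empty value) into "ATTR=EMPTY " before kv parses it
    gsub => [ "kv_source", "= ", "=EMPTY " ]
  }
  kv {
    source => "kv_source"
    field_split => " "
    value_split => "="
    remove_field => [ "kv_source" ]
  }
}

Would something along these lines work, or is there a cleaner way?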