It is not clear to me what you are trying to do, but if you want to set the value of a field on an event, you should use event.set, which is documented here.
I think in old versions you could do this
event['index_value'] = log_pattern[i]
but now you would have to do
event.set('index_value', log_pattern[i])
And in your output, index => index_value should be index => "%{index_value}", as documented under sprintf format here.
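For example, a minimal pipeline sketch along these lines (the host and the literal value are placeholders, untested) sets the field in a ruby filter and references it in the output:

filter {
  ruby {
    # placeholder value: in practice you would compute it, e.g. from your log_pattern lookup
    code => "event.set('index_value', 'my-app-logs')"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # sprintf reference: resolves to this event's index_value field at output time
    index => "%{index_value}"
  }
}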
Thanks for responding. But index_value is not being applied in the output plugin. In the elasticsearch output, with index => "%{index_value}", the actual value is never substituted into the index setting.
Say, for example, this is my sample log:
2018-05-30 11:00:04,355 [INFO] packagename=sample.package.group, date=10-12-2017, status=pending,...etc.
I want to split this message into dynamic key/value pairs like packagename, date, status. Also, the log format is not the same across all the files, so I have to split these logs on certain keywords, similar to how the kv filter works. How can I do that in a ruby filter with dynamic key values?
I would use dissect (or grok) to take the date and log level off, then use a kv filter to parse the rest of the line. Why do you want to do it in ruby?
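Roughly something like this (field names and kv options are just one way to set it up, untested against your real logs):

filter {
  dissect {
    # peel off the timestamp and log level, keep the key=value part in [kvpairs]
    mapping => { "message" => "%{logdate} %{logtime} [%{loglevel}] %{kvpairs}" }
  }
  kv {
    # split the remainder into dynamic fields: packagename, date, status, ...
    source => "kvpairs"
    field_split => ","
    value_split => "="
    trim_key => " "
  }
}

The kv filter already handles the dynamic-keys case, since it creates a field for whatever key it finds, so there is no parsing logic to maintain in ruby.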