Force an event to the output before processing the next one

Hi! Since I face some problems with nested fields in Kibana Lens, I decided to re-engineer the Ruby code for my filter. Here is a snippet of it:

ruby {
    code => '
      require "nokogiri"
      xml_doc = Nokogiri::XML(event.get("message"))
      xml_doc.remove_namespaces!

      rule_results = xml_doc.xpath("//rule-result")
      rule_results.each do |rule_result|
        result_text = rule_result.xpath("result").text
        next if result_text == "notselected"

        # Note: each iteration overwrites the fields set by the previous one,
        # so only the last rule-result survives on the event.
        event.set("Result", result_text)
        event.set("Scanned rule", rule_result.attr("idref"))
        event.set("Role", rule_result.attr("role"))
        event.set("Severity", rule_result.attr("severity"))
        event.set("weight", rule_result.attr("weight"))
        event.set("Ident", rule_result.xpath("ident").text)
        event.set("Time", rule_result.attr("time"))
      end
    '
  }

Since I'm processing multiple XML nodes with the same XPath, I'm forced to use this code.
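Outside of Logstash, the same extraction logic can be sketched as plain Ruby. The sample XML below is hypothetical, and this sketch uses the stdlib REXML parser instead of Nokogiri only so it runs without extra gems; the point is collecting one hash per rule-result rather than overwriting shared fields:

```ruby
require "rexml/document"

# Hypothetical XCCDF-like input with two rule-result elements.
xml = <<~XML
  <results>
    <rule-result idref="rule1" role="admin" severity="high" weight="5" time="2024-05-29T12:34:56Z">
      <result>selected</result>
      <ident>CCE-0001</ident>
    </rule-result>
    <rule-result idref="rule2" role="root" severity="low" weight="1" time="2024-05-29T12:34:56Z">
      <result>notselected</result>
      <ident>CCE-0002</ident>
    </rule-result>
  </results>
XML

doc = REXML::Document.new(xml)

# Build one hash per rule-result instead of overwriting event fields.
docs = []
doc.elements.each("//rule-result") do |rr|
  docs << {
    "Result"       => rr.elements["result"].text,
    "Scanned rule" => rr.attributes["idref"],
    "Role"         => rr.attributes["role"],
    "Severity"     => rr.attributes["severity"],
    "weight"       => rr.attributes["weight"],
    "Time"         => rr.attributes["time"],
    "Ident"        => rr.elements["ident"].text
  }
end

puts docs.length  # => 2, one entry per rule-result
```

Each hash in `docs` corresponds to one of the documents you want in Elasticsearch.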

Now my problem is: can I send this event directly to the output before processing the next rule-result?

The idea is to have multiple documents in Elasticsearch with the same fields.

Example of the desired output:

"Document1": {
  "Result": "selected",
  "Scanned rule": "rule1",
  "Role": "admin",
  "Severity": "high",
  "Weight": "5",
  "Time": "2024-05-29T12:34:56Z"
}

"Document2": {
  "Result": "notselected",
  "Scanned rule": "rule2",
  "Role": "root",
  "Severity": "low",
  "Weight": "1",
  "Time": "2024-05-29T12:34:56Z"
}

How can I generate multiple documents in Elasticsearch with a single pipeline?

Can you provide more context here? It is not clear what your original event looks like.

Events in Logstash are independent of each other, so as soon as an event exits your ruby filter it continues through the pipeline. If the next step is the output block, Logstash will group events into batches of the configured size and send them to your output.
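One common pattern for getting one document per rule-result is to collect them into an array field in the ruby filter and then use the split filter, which emits one event per array element. This is a sketch, not tested against your data (the field name `rule_results` is my own choice):

```
filter {
  ruby {
    code => '
      require "nokogiri"
      xml_doc = Nokogiri::XML(event.get("message"))
      xml_doc.remove_namespaces!

      # One hash per rule-result, stored in an array field.
      results = xml_doc.xpath("//rule-result").map do |rr|
        {
          "Result"       => rr.xpath("result").text,
          "Scanned rule" => rr.attr("idref"),
          "Role"         => rr.attr("role"),
          "Severity"     => rr.attr("severity"),
          "weight"       => rr.attr("weight"),
          "Time"         => rr.attr("time")
        }
      end
      event.set("rule_results", results)
    '
  }

  # Each array element becomes its own event, and therefore its own
  # document in Elasticsearch.
  split {
    field => "rule_results"
  }
}
```

After the split, each event carries one hash under `rule_results`; you can then rename or move those sub-fields to the top level if you want the flat structure from your desired output.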