I am trying to copy fields from one event to another using the aggregate filter.

I have log files from which I extract fields using two different if/grok statements and patterns. The output from the two looks like this:

{
       timestamp" => 2021-06-09T03:08:30.943Z,
            "Loc" => "91340",
       "@version" => "1",
     "@timestamp" => 2021-07-17T04:09:36.438Z,
       "location" => 274.05292,
          "speed" => 2.6279999999999997,
"target_location" => 261.11999999999995,
           "host" => "AUDPRWL00192",
           "path" => "C:/ELK/LOGS/91340____________090621_021536_2653_ATO_B.txt",
}
{
        "ID" => "066",
      "host" => "AUDPRWL00192",
   "MESSAGE" => "0560BFC0BC00C8005023AE00164260BFC0BC6B5DDC5B",
 "timestamp" => 2021-06-09T03:08:27.540Z,
      "path" => "C:/ELK/LOGS/91340____________090621_021536_2653_ATO_B.txt",
       "Loc" => "91340",
  "@version" => "1",
"@timestamp" => 2021-07-17T04:09:36.428Z

My end goal is to aggregate so that the values from the previous event (speed and location) are picked up and added to the next one, so that the output I can send to Elasticsearch is:

{
        "ID" => "066",
      "host" => "AUDPRWL00192",
   "MESSAGE" => "0560BFC0BC00C8005023AE00164260BFC0BC6B5DDC5B",
 "timestamp" => 2021-06-09T03:08:27.540Z,
      "path" => "C:/ELK/LOGS/91340____________090621_021536_2653_ATO_B.txt",
       "Loc" => "91340",
     "speed" => 2.6279999999999997,
  "location" => 274.05292,
  "@version" => "1",
"@timestamp" => 2021-07-17T04:09:36.428Z
}

The aggregate filter I am trying is:

aggregate {
  task_id => "%{host}%{path}"
  code => "map['location'] = event.get('[location]')"
  map_action => "create"
}
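
For context, a minimal sketch of the direction this could take, assuming the two event types can be told apart by the presence of [speed] and [ID] (field names taken from the sample output above; everything else is illustrative, not a verified config):

filter {
  if [speed] {
    # speed/location event: stash the values in the aggregate map
    aggregate {
      task_id => "%{host}%{path}"
      code => "
        map['speed']    = event.get('speed')
        map['location'] = event.get('location')
      "
      map_action => "create_or_update"
    }
  }
  if [ID] {
    # ID/MESSAGE event: copy the previously stashed values onto this event
    aggregate {
      task_id => "%{host}%{path}"
      code => "
        event.set('speed', map['speed'])
        event.set('location', map['location'])
      "
      map_action => "update"
    }
  }
}

Note that the aggregate filter requires pipeline.workers to be set to 1, otherwise events may be processed out of order and the map lookups can miss.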

You are not even close to providing enough information to propose a solution for this. If you look at the aggregate documentation, which of the 5 examples is closest to your use case?
