Processing events in JSON format and correlating them to find the duration based on correlation ID

I am a Logstash newbie (about 24 hours in), so the problem may well be trivial.
The input is multiple records in JSON format; please find them below.
Below is the Logstash config file. While executing, I am getting _jsonparsefailure.

input {
  stdin {}
}

filter {
  json {
    source => "message"
  }

  aggregate {
    task_id => "%{[correlationID]}"
    code => '
      e = "%{[freeText]}"
      map["time#e"] = "%{[timestamp]}"
      event.cancel
    '
    push_map_as_event_on_timeout => true
    timeout => 120
    timeout_code => '
      event.set("duration", map("timeProcess Completed").to_f - map("timeProcess Started").to_f)
    '
  }
}

output {
  file {
    path => "C:/ELK/inputs/output.log"
  }
}
Input:
{"auditID":"168975.STO.CDT.16107247327378975","auditType":"INFO","correlationID":"STO.CDT.16107247327378975","jobID":"168975","message":"BusinessServices/SW/SWW/GetCustomerDetails/V001/Logic/getCustomerDetails.process STARTED","freeText":"Process Started","country":"SW","moduleName":"STO","serviceName":"STO","serviceVersion":"5.x","operationName":"getCustomerDetails","operationID":"HYBRIS-getCustomerDetails","operationVersion":"1.0","processName":"BusinessServices/SW/SWW/GetCustomerDetails/V001/Logic/getCustomerDetails.process","status":"IMPLEMENTATION_START","statusCode":20,"sourceSystem":"HYBRIS","targetSystem":"ERP","category":"WSSync","subCategory":"SW","hostName":"localhost","timestamp":"2021-01-15T16:32:12.737+01:00","appNode":"BW5","appSpace":"BW5","domain":"EU_UAT_514"}
{"auditID":"168975.STO.CDT.16107247327378975","auditType":"INFO","correlationID":"STO.CDT.16107247327378975","jobID":"168975","message":"BusinessServices/Common/Services/V001/Logic/getCustomerDetails.process STARTED","freeText":"Request Received from Hybris","country":"SW","moduleName":"STO","serviceName":"STO","serviceVersion":"5.x","operationName":"getCustomerDetails","operationID":"HYBRIS-getCustomerDetails","operationVersion":"1.0","processName":"BusinessServices/Common/Services/V001/Logic/getCustomerDetails.process","status":"INTERFACE_START","statusCode":15,"sourceIP":"10.155.42.20","sourceSystem":"HYBRIS","targetSystem":"ERP","category":"WSSync","subCategory":"SW","hostName":"localhost","timestamp":"2021-01-15T16:32:12.737+01:00","appNode":"BW5","appSpace":"BW5","domain":"EU_UAT_514"}
{"auditID":"168975.STO.CDT.16107247327378975","auditType":"INFO","correlationID":"STO.CDT.16107247327378975","jobID":"168975","message":"BusinessServices/SW/SWW/GetCustomerDetails/V001/Logic/getCustomerDetails.process COMPLETED","freeText":"Process Completed","country":"SW","moduleName":"STO","serviceName":"STO","serviceVersion":"5.x","operationName":"getCustomerDetails","operationID":"HYBRIS-getCustomerDetails","operationVersion":"1.0","processName":"BusinessServices/SW/SWW/GetCustomerDetails/V001/Logic/getCustomerDetails.process","status":"INTERFACE_END","statusCode":40,"sourceSystem":"HYBRIS","targetSystem":"ERP","category":"WSSync","subCategory":"SW","hostName":"localhost","timestamp":"2021-01-15T16:32:12.75+01:00","appNode":"BW5","appSpace":"BW5","domain":"EU_UAT_514"}
{"auditID":"168975.STO.CDT.16107247327378975","auditType":"INFO","correlationID":"STO.CDT.16107247327378975","jobID":"168975","message":"BusinessServices/Common/Services/V001/Logic/getCustomerDetails.process COMPLETED","freeText":"Response Sent to Hybris","country":"SW","moduleName":"STO","serviceName":"STO","serviceVersion":"5.x","operationName":"getCustomerDetails","operationID":"HYBRIS-getCustomerDetails","operationVersion":"1.0","processName":"BusinessServices/Common/Services/V001/Logic/getCustomerDetails.process","status":"INTERFACE_END","statusCode":40,"sourceSystem":"HYBRIS","targetSystem":"ERP","category":"WSSync","subCategory":"SW","hostName":"localhost","timestamp":"2021-01-15T16:32:12.751+01:00","appNode":"BW5","appSpace":"BW5","domain":"EU_UAT_514"}

What error message does logstash log?

[2021-01-17T11:12:22,286][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.10.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.131-b11 on 1.8.0_131-b11 +indy +jit [mswin32-x86_64]"}
[2021-01-17T11:12:22,620][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-01-17T11:12:25,659][INFO ][org.reflections.Reflections] Reflections took 85 ms to scan 1 urls, producing 23 keys and 47 values 
[2021-01-17T11:12:26,684][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Sandeep/Softwares/ELK/logstash-7.10.1-windows-x86_64/logstash-7.10.1/config/logstash.conf"], :thread=>"#<Thread:0x1c69bc7 run>"}
[2021-01-17T11:12:28,210][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.49}
[2021-01-17T11:12:28,340][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-01-17T11:12:28,425][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-01-17T11:12:29,094][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-01-17T11:13:04,642][WARN ][logstash.filters.json    ][main][e3dce1345f5e986b72e34b1d52cfc4153c396c87955a9d0cd38b8c4718b04fef] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
[2021-01-17T11:13:04,882][INFO ][logstash.outputs.file    ][main][323a040a12f06c68b3f4bfa4cd7939bfbf6775e29c4eeec8f627ebff8daa8b5a] Opening file {:path=>"C:/ELK/inputs/output.log"}
[2021-01-17T11:13:23,411][INFO ][logstash.outputs.file    ][main][323a040a12f06c68b3f4bfa4cd7939bfbf6775e29c4eeec8f627ebff8daa8b5a] Closing file C:/ELK/inputs/output.log
[2021-01-17T11:13:41,904][WARN ][logstash.filters.json    ][main][e3dce1345f5e986b72e34b1d52cfc4153c396c87955a9d0cd38b8c4718b04fef] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
[2021-01-17T11:13:41,943][INFO ][logstash.outputs.file    ][main][323a040a12f06c68b3f4bfa4cd7939bfbf6775e29c4eeec8f627ebff8daa8b5a] Opening file {:path=>"C:/ELK/inputs/output.log"}
[2021-01-17T11:13:58,416][INFO ][logstash.outputs.file    ][main][323a040a12f06c68b3f4bfa4cd7939bfbf6775e29c4eeec8f627ebff8daa8b5a] Closing file C:/ELK/inputs/output.log
[2021-01-17T11:15:33,464][INFO ][logstash.outputs.file    ][main][323a040a12f06c68b3f4bfa4cd7939bfbf6775e29c4eeec8f627ebff8daa8b5a] Opening file {:path=>"C:/ELK/inputs/output.log"}
[2021-01-17T11:15:58,460][INFO ][logstash.outputs.file    ][main][323a040a12f06c68b3f4bfa4cd7939bfbf6775e29c4eeec8f627ebff8daa8b5a] Closing file C:/ELK/inputs/output.log
[2021-01-17T11:17:27,645][WARN ][logstash.runner          ] SIGINT received. Shutting down.
[2021-01-17T11:17:28,783][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2021-01-17T11:17:28,994][INFO ][logstash.runner          ] Logstash shut down.
[2021-01-17T11:18:10,406][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.10.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.131-b11 on 1.8.0_131-b11 +indy +jit [mswin32-x86_64]"}
[2021-01-17T11:18:10,691][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-01-17T11:18:12,947][INFO ][org.reflections.Reflections] Reflections took 94 ms to scan 1 urls, producing 23 keys and 47 values 
[2021-01-17T11:18:13,838][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Sandeep/Softwares/ELK/logstash-7.10.1-windows-x86_64/logstash-7.10.1/config/logstash.conf"], :thread=>"#<Thread:0xc95af38 run>"}
[2021-01-17T11:18:15,008][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.15}
[2021-01-17T11:18:15,119][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-01-17T11:18:15,193][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-01-17T11:18:15,607][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-01-17T11:18:25,333][INFO ][logstash.outputs.file    ][main][fa168b58bbd43a518dc402a8d22e6589d608812f36dc1003dfb9ff8cdb3d536b] Opening file {:path=>"C:/ELK/inputs/output.log"}
[2021-01-17T11:18:40,150][INFO ][logstash.outputs.file    ][main][fa168b58bbd43a518dc402a8d22e6589d608812f36dc1003dfb9ff8cdb3d536b] Closing file C:/ELK/inputs/output.log
[2021-01-17T11:18:42,239][WARN ][logstash.filters.json    ][main][2bf21c89bf7819dbd8a5c052eb85546682a09a1cd54280614cd9903ccba93c58] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
[2021-01-17T11:18:42,282][INFO ][logstash.outputs.file    ][main][fa168b58bbd43a518dc402a8d22e6589d608812f36dc1003dfb9ff8cdb3d536b] Opening file {:path=>"C:/ELK/inputs/output.log"}
[2021-01-17T11:19:05,141][INFO ][logstash.outputs.file    ][main][fa168b58bbd43a518dc402a8d22e6589d608812f36dc1003dfb9ff8cdb3d536b] Closing file C:/ELK/inputs/output.log
[2021-01-17T11:21:12,011][WARN ][logstash.filters.json    ][main][2bf21c89bf7819dbd8a5c052eb85546682a09a1cd54280614cd9903ccba93c58] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}
[2021-01-17T11:21:12,018][INFO ][logstash.outputs.file    ][main][fa168b58bbd43a518dc402a8d22e6589d608812f36dc1003dfb9ff8cdb3d536b] Opening file {:path=>"C:/ELK/inputs/output.log"}
[2021-01-17T11:21:25,142][INFO ][logstash.outputs.file    ][main][fa168b58bbd43a518dc402a8d22e6589d608812f36dc1003dfb9ff8cdb3d536b] Closing file C:/ELK/inputs/output.log

There is no error in the logs, only one warning: "Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"\r"}". I think that is OK, since it comes from the empty line. Is the problem with the logic? With the aggregate data?
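That warning does indeed come from the blank (carriage-return-only) line that stdin passes along. If it should be silenced, one option is to drop whitespace-only events before the json filter; a minimal sketch:

filter {
  # stdin hands over a lone "\r" for blank lines on Windows;
  # drop such whitespace-only events before the json filter sees them
  if [message] =~ /^\s*$/ {
    drop {}
  }

  json {
    source => "message"
  }
}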

Badger, can you please check and provide input?

Can anyone please provide a solution?
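For what it's worth, the aggregate logic does look like the likely problem: inside the code and timeout_code options, %{[field]} sprintf references are not expanded (that Ruby block only sees the literal string, so the map ends up holding "%{[timestamp]}" rather than real timestamps), "time#e" never interpolates e, and map("...") would need to be map["..."] because map is a hash. Below is a rough sketch of how the filter section might look instead, reading fields with event.get and using a date filter so the stored values are numeric epoch seconds. It assumes the ISO8601 timestamp field parses cleanly and that the pipeline runs with a single worker (-w 1), as the aggregate plugin requires to keep its map consistent; the logs above show pipeline.workers=>8.

filter {
  json {
    source => "message"
  }

  # parse the event's own ISO8601 "timestamp" field into @timestamp
  date {
    match => ["timestamp", "ISO8601"]
  }

  aggregate {
    task_id => "%{[correlationID]}"
    code => '
      # sprintf references are not evaluated here; read fields via event.get
      e = event.get("freeText")
      # store epoch seconds under keys like "timeProcess Started" / "timeProcess Completed"
      map["time#{e}"] = event.get("@timestamp").to_f
      event.cancel
    '
    push_map_as_event_on_timeout => true
    timeout => 120
    # keep the correlation ID on the event generated at timeout
    timeout_task_id_field => "correlationID"
    timeout_code => '
      # the timeout event is pre-populated with the map entries as fields
      started = event.get("timeProcess Started")
      completed = event.get("timeProcess Completed")
      event.set("duration", completed - started) if started && completed
    '
  }
}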

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.