Logstash nested JSON parsing: getting every nested JSON as a separate field

Hello,

After trying several times I'm unable to parse the JSON string data below, which comes from an Oracle column called package_data. Kindly assist with how to get the data into Elastic so it shows like the example below.

{"status":"READY_FOR_PROCESSING","errorData":null,"creationDate":1678096719969,"lastModificationDate":1678096824967,"manyId":"1900","form":"ABC","systemId":"TIS_FULL","userId":"?","XmlFileId":"","size":0,"Kind":"FULL","Datas":[{"serviceRequestId":"29F91450A3","Counter":2,"size":122628,"fileId":"CC2B5","packageId":"1900","lPart":true},{"sRequestId":"190FBB14E1864A75B6B8ECC054FEAD3A","pCounter":1,"size":122628,"fileId":"F6395CF76A2E","packageId":"1900","lastPart":false}],"Histories":[{"status":"START","errorData":null,"creationDate":1678096719980,"lastModificationDate":1678096719980},{"status":"PROCESSING","errorData":null,"creationDate":1678096824974,"lastModificationDate":1678096824974}]}

I'd also like to know how you managed to show the data as in the shared images, i.e. all values in a single array.

Required:

Getting:
(screenshot)

The creation date and last modification date also need to be converted from epoch to UTC for packageStateHistories:

Failed attempt:

filter {
  json { source => "package_data" }
  split { field => "package_data" }
  mutate { remove_field => [ "package_data" ] }
}
filter {
  if [type] == "tcs-package-details" {
    json { source => "package_data" }
    split { field => "package_data" }
    split { field => "[package_data][packagePartDatas][packageStateHistories]" }
    mutate {
      convert => {
        "package_data.packageStateHistories.lastModificationDate" => "integer"
        "package_data.packageStateHistories.creationDate" => "integer"
      }
    }
    date {
      match => ["package_data.packageStateHistories.lastModificationDate", "package_data.packageStateHistories.creationDate"]
      target => ["package_data.packageStateHistories.lastModificationDate", "package_data.packageStateHistories.creationDate"]
      timezone => "UTC"
    }
    mutate { remove_field => [ "package_data" ] }
  }
}

Thanx

What is your full Logstash pipeline? The sample data you shared does not have the fields your pipeline is using.

This is your sample data:

{
  "status": "READY_FOR_PROCESSING",
  "errorData": null,
  "creationDate": 1678096719969,
  "lastModificationDate": 1678096824967,
  "manyId": "1900",
  "form": "ABC",
  "systemId": "TIS_FULL",
  "userId": "?",
  "XmlFileId": "",
  "size": 0,
  "Kind": "FULL",
  "Datas": [
    {
      "serviceRequestId": "29F91450A3",
      "Counter": 2,
      "size": 122628,
      "fileId": "CC2B5",
      "packageId": "1900",
      "lPart": true
    },
    {
      "sRequestId": "190FBB14E1864A75B6B8ECC054FEAD3A",
      "pCounter": 1,
      "size": 122628,
      "fileId": "F6395CF76A2E",
      "packageId": "1900",
      "lastPart": false
    }
  ],
  "Histories": [
    {
      "status": "START",
      "errorData": null,
      "creationDate": 1678096719980,
      "lastModificationDate": 1678096719980
    },
    {
      "status": "PROCESSING",
      "errorData": null,
      "creationDate": 1678096824974,
      "lastModificationDate": 1678096824974
    }
  ]
}

You have a field named Datas and another one named Histories; where does this package_data come from?

My bad for sharing the wrong data.

{"status":"READY_FOR_PROCESSING","errorData":null,"creationDate":1678096719969,"lastModificationDate":1678096824967,"packageId":"1900","platform":"ABC","systemId":"TIS_FULL","userId":"?","connectXmlFileId":"","size":0,"packageKind":"FULL","packagePartDatas":[{"serviceRequestId":"3801FA31FCF64CB1A1401529F91450A3","partCounter":2,"size":122628,"fileId":"F63956AD63B170CCE053020011ACC2B5","packageId":"1900","lastPart":true},{"serviceRequestId":"190FBB14E1864A75B6B8ECC054FEAD3A","partCounter":1,"size":122628,"fileId":"F6395CF76A2E7124E053020011ACE8A3","packageId":"1900","lastPart":false}],"packageStateHistories":[{"status":"INTRANSIT","errorData":null,"creationDate":1678096719980,"lastModificationDate":1678096719980},{"status":"READY_FOR_PROCESSING","errorData":null,"creationDate":1678096824974,"lastModificationDate":1678096824974}]}

This is how the data is stored in the Oracle column named package_data.

(screenshot)

So I require all fields to come into the index separately; I'm unable to parse the nested JSON arrays:
packagePartDatas.serviceRequestId
packagePartDatas.partCounter
packagePartDatas.size
packagePartDatas.fileid

packageStateHistories.status
packageStateHistories.errorData
packageStateHistories.creationDate
packageStateHistories.lastModDate

input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@abc:1821/mis"
    jdbc_user => "ABC"
    jdbc_password => "ABC"
    jdbc_driver_library => "../lib/ojdbc8-11.2.0.1.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_paging_enabled => true
    last_run_metadata_path => "../config/lastrun-mis-ckage-details.yml"
    schedule => "*/25 * * * * *"
    connection_retry_attempts => 5
    connection_retry_attempts_wait_time => 10
    statement => "select PACKAGE_ID, PACKAGE_DATA, from_tz(CAST (CREATION_DATE AS TIMESTAMP), 'UTC') as CREATION_DATE, from_tz(CAST (LAST_MOD_DATE AS TIMESTAMP), 'UTC') as LAST_MOD_DATE from kdata where LAST_MOD_DATE >= SYS_EXTRACT_UTC(:sql_last_value)"
    type => "mis-package-details"
  }
}

filter {
  json { source => "package_data" }
  split { field => "package_data" }
  mutate { remove_field => [ "package_data" ] }
}


output {
  # stdout { codec => json_lines }
  if [type] == "mis-package-details" {
    elasticsearch {
      hosts => "http://abc:9200"
      ilm_pattern => "{now/d}-000001"
      doc_as_upsert => true
      ilm_rollover_alias => "mis-package-details"
      ilm_policy => "mis-transfer-common-policy"
      document_id => "%{package_id}"
    }
  }
}



What is the name of the field? The sample you shared does not have a field named package_data.

From what you shared, you need to do two splits: one on the packagePartDatas field and after that one on the packageStateHistories field.

Something like this:

    split {
        field => "packagePartDatas"
    }
    split {
        field => "packageStateHistories"
    }

Also, in Logstash the correct way to reference nested fields is using square brackets, not dots.

This will not work:

    convert => {
      "package_data.packageStateHistories.lastModificationDate" => "integer"
      "package_data.packageStateHistories.creationDate" => "integer"
    }

You need to use [package_data][packageStateHistories][lastModificationDate] for example.
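
For example, the convert block from the earlier attempt rewritten with bracket notation (a sketch, assuming the parsed fields really are nested under package_data) would be:

    mutate {
        convert => {
            "[package_data][packageStateHistories][lastModificationDate]" => "integer"
            "[package_data][packageStateHistories][creationDate]" => "integer"
        }
    }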

Hello @leandrojmp ,

It doesn't have a field called package_data; that is the name of the Oracle column, and it contains the entire JSON string. So I want every field in it as a separate field that I can use in the Elasticsearch index. I'm still unable to get the fields below; I'd like to present them in a table and later add a filter on package id, so that for a selected package id I get the details below in a table:

{"status":"READY_FOR_PROCESSING","errorData":null,"creationDate":1678096719969,"lastModificationDate":1678096824967,"packageId":"1900","platform":"ABC","systemId":"TIS_FULL","userId":"?","connectXmlFileId":"","size":0,"packageKind":"FULL","packagePartDatas":[{"serviceRequestId":"3801FA31FCF64CB1A1401529F91450A3","partCounter":2,"size":122628,"fileId":"F63956AD63B170CCE053020011ACC2B5","packageId":"1900","lastPart":true},{"serviceRequestId":"190FBB14E1864A75B6B8ECC054FEAD3A","partCounter":1,"size":122628,"fileId":"F6395CF76A2E7124E053020011ACE8A3","packageId":"1900","lastPart":false}],"packageStateHistories":[{"status":"INTRANSIT","errorData":null,"creationDate":1678096719980,"lastModificationDate":1678096719980},{"status":"READY_FOR_PROCESSING","errorData":null,"creationDate":1678096824974,"lastModificationDate":1678096824974}]}

Expected:
packagePartDatas.serviceRequestId
packagePartDatas.partCounter
packagePartDatas.size
packagePartDatas.fileid

packageStateHistories.status
packageStateHistories.errorData
packageStateHistories.creationDate
packageStateHistories.lastModDate

I used this now, but I don't get any output:

filter {
  if [type] == "cis-package-details" {
    json { source => "package_data" }
    split { field => "package_data" }
    split { field => "packagePartDatas" }
    split { field => "packageStateHistories" }
    mutate { remove_field => [ "package_data" ] }
  }
}

What does the document you have in Logstash look like? You didn't share it.

Please share the document you have in Logstash, not Oracle.
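
If it helps, a temporary stdout output with the rubydebug codec prints each event in a readable form (a debugging sketch, not part of the final pipeline):

    output {
        stdout { codec => rubydebug }
    }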

I do not use the jdbc input, but assuming that the PACKAGE_DATA column in your database creates a PACKAGE_DATA field in your Logstash document, you just need to use the correct names of the fields.

Logstash is case sensitive, package_data is different from PACKAGE_DATA, so you would need a split filter on the field [PACKAGE_DATA][packagePartDatas] and another one on the field [PACKAGE_DATA][packageStateHistories].
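
Something like this (a sketch, assuming the jdbc input really does emit the field as uppercase PACKAGE_DATA):

    split {
        field => "[PACKAGE_DATA][packagePartDatas]"
    }
    split {
        field => "[PACKAGE_DATA][packageStateHistories]"
    }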

Hello,

This is what the data looks like after Logstash execution:

{"systemId":"MIS_FULL","packagePartDatas":[{"packageId":"1700","fileId":"F74F869EFFEF7FBDE053020011AC63DD","serviceRequestId":"ADFDF0645EDB486496E41B520DCF4482","size":126624,"lastPart":true,"partCounter":1}],"packageId":"1700","platform":"P36","package_id":"1700","@timestamp":"2023-03-21T02:53:31.845Z","connectXmlFileId":"","errorData":{"errorCode":"BATCH_FRAMEWORK_1041","errorMessage":"The input parameter \"nodeClassName\" must not be null.","stackTrace":"[xyz.connect.integration.jobs.inform.service.klkl.transform(klkl.java:106), xyz.connect.integration.jobs.inform.service.connectNow.doProcess(connectNow.java:47), xyz.connect.integration.jobs.inform.processor.yesstarted.process(yesstarted.java:54), xyz.connect.integration.jobs.inform.processor.yesstarted.process(yesstarted.java:1), sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method), sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62), sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43), java.lang.reflect.Method.invoke(Method.java:498), org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344), org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198), org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163), org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:137), org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:124), org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186), org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:215), com.sun.proxy.$Proxy161.process(Unknown Source), org.springframework.batch.core.step.item.SimpleChunkProcessor.doProcess(SimpleChunkProcessor.java:134), org.springframework.batch.core.step.item.SimpleChunkProcessor.transform(SimpleChunkProcessor.java:319), org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:210), org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:77), org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:407), org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:331), org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140), org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:273), org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:82), org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:375), org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215), org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:145), org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:258), org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:208), org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:152), org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:68), 
org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:68), org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:169), org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144), org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:137), org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:320), org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:149), org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50), org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:140), xyz.connect.integration.batch.framework.AbstractConnectIntegrationBatchJobRunner.launchJob(AbstractConnectIntegrationBatchJobRunner.java:219), xyz.connect.integration.jobs.inform.informJobApplication.main(informJobApplication.java:47)]"},"type":"mis-package-details","packageStateHistories":[{"status":"READY_FOR_PROCESSING","errorData":null,"lastModificationDate":1679291172626,"creationDate":1679291172626},{"status":"MERGED","errorData":null,"lastModificationDate":1679293118971,"creationDate":1679293118971},{"status":"ERROR","errorData":{"errorCode":"BATCH_FRAMEWORK_1041","errorMessage":"The input parameter \"nodeClassName\" must not be null.","stackTrace":"[xyz.connect.integration.jobs.inform.service.klkl.transform(klkl.java:106), xyz.connect.integration.jobs.inform.service.connectNow.doProcess(connectNow.java:47), xyz.connect.integration.jobs.inform.processor.yesstarted.process(yesstarted.java:54), xyz.connect.integration.jobs.inform.processor.yesstarted.process(yesstarted.java:1), sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method), sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62), sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43), java.lang.reflect.Method.invoke(Method.java:498), org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344), org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198), org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163), org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:137), org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:124), org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186), org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:215), com.sun.proxy.$Proxy161.process(Unknown Source), org.springframework.batch.core.step.item.SimpleChunkProcessor.doProcess(SimpleChunkProcessor.java:134), org.springframework.batch.core.step.item.SimpleChunkProcessor.transform(SimpleChunkProcessor.java:319), org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:210), org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:77), org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:407), org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:331), org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140), 
org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:273), org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:82), org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:375), org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215), org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:145), org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:258), org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:208), org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:152), org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:68), org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:68), org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:169), org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144), org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:137), org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:320), org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:149), org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50), org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:140), xyz.connect.integration.batch.framework.AbstractConnectIntegrationBatchJobRunner.launchJob(AbstractConnectIntegrationBatchJobRunner.java:219), xyz.connect.integration.jobs.inform.informJobApplication.main(informJobApplication.java:47)]"},"lastModificationDate":1679293124721,"creationDate":1679293124721}],"creationDate":1679291172616,"userId":"?","creation_date":"2023-03-20T05:46:12.621Z","last_mod_date":"2023-03-20T06:18:44.721Z","@version":"1","status":"ERROR","lastModificationDate":1679293124721,"size":0,"packageKind":"FULL"}

output {
  file {
    path => "/l/new/mis/logstash/logstash-7.9.1/logs/myfile"
    codec => json_lines
  }
}


I'm still unsure how I can get data like below:

packagePartDatas.serviceRequestId
packagePartDatas.partCounter
packagePartDatas.size
packagePartDatas.fileid

packageStateHistories.status
packageStateHistories.errorData
packageStateHistories.creationDate
packageStateHistories.lastModDate

If this is the document you have, then you have two array fields, packagePartDatas and packageStateHistories.

You just need to add the split filters as mentioned in a previous answer.

filter {
    split {
        field => "packagePartDatas"
    }
    split {
        field => "packageStateHistories"
    }
}

I'm not sure what I'm doing wrong. I now have the config below; I've removed the filter, and Logstash gives this output.
The above filter doesn't help to get the desired output. Kindly suggest.

{"package_id":"1700","package_data":"{\"status\":\"ERROR\",\"errorData\":{\"errorCode\":\"BATCH_FRAMEWORK_1041\",\"errorMessage\":\"The input parameter \\\"nodeClassName\\\" must not be null.\",\"stackTrace\":\"[Trace]\"},\"creationDate\":1679291172616,\"lastModificationDate\":1679293124721,\"packageId\":\"1700\",\"platform\":\"P36\",\"systemId\":\"TIS_FULL\",\"userId\":\"?\",\"connectXmlFileId\":\"\",\"size\":0,\"packageKind\":\"FULL\",\"packagePartDatas\":[{\"serviceRequestId\":\"ADFDF0645EDB486496E41B520DCF4482\",\"partCounter\":1,\"size\":126624,\"fileId\":\"F74F869EFFEF7FBDE053020011AC63DD\",\"packageId\":\"1700\",\"lastPart\":true}],\"packageStateHistories\":[{\"status\":\"READY_FOR_PROCESSING\",\"errorData\":null,\"creationDate\":1679291172626,\"lastModificationDate\":1679291172626},{\"status\":\"MERGED\",\"errorData\":null,\"creationDate\":1679293118971,\"lastModificationDate\":1679293118971},{\"status\":\"ERROR\",\"errorData\":{\"errorCode\":\"BATCH_FRAMEWORK_1041\",\"errorMessage\":\"The input parameter \\\"nodeClassName\\\" must not be null.\",\"stackTrace\":\"[ST]\"},\"creationDate\":1679293124721,\"lastModificationDate\":1679293124721}]}","@timestamp":"2023-03-21T04:04:41.469Z","last_mod_date":"2023-03-20T06:18:44.721Z","creation_date":"2023-03-20T05:46:12.621Z","@version":"1","type":"mis-package-details"}

input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@abc:1521/mis"
    jdbc_user => "k"
    jdbc_password => "k"
    jdbc_driver_library => "../lib/ojdbc8-11.1.0.1.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_paging_enabled => true
    last_run_metadata_path => "../config/lastrun-mis-package-details.yml"
    schedule => "*/5 * * * * *"
    connection_retry_attempts => 5
    connection_retry_attempts_wait_time => 10
    statement => "select PACKAGE_ID, PACKAGE_DATA, from_tz(CAST (CREATION_DATE AS TIMESTAMP), 'UTC') as CREATION_DATE, from_tz(CAST (LAST_MOD_DATE AS TIMESTAMP), 'UTC') as LAST_MOD_DATE from pdata where LAST_MOD_DATE >= SYS_EXTRACT_UTC(:sql_last_value)"
    type => "mis-package-details"
  }
}

output {
  file {
    path => "/l/app/mis/logstash/logstash-7.9.1/logs/myfile"
    codec => json_lines
  }
}

So, if this is the message that the jdbc input is giving you, then you have the following fields in your Logstash pipeline after the input:

  • package_data
  • package_id
  • last_mod_date
  • creation_date

The data you want is in the field package_data, so you will need to parse it with the json filter.

filter {
    json {
        source => "package_data"
        remove_field => [ "package_data" ]
    }
}

The above filter will parse your package_data field and remove it from the document if the filter works.

After that you will have all the fields from the package_data in the root of your document, since you want to split some of them, you just need to add the following filters:

filter {
    split {
        field => "packageStateHistories"
    }
    split {
        field => "packagePartDatas"
    }
}

This will split both array fields.
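
Put together, the whole parsing step could look like this (a combined sketch of the two filters above; the json filter must run before the splits so the array fields exist at the root of the document):

    filter {
        json {
            source => "package_data"
            remove_field => [ "package_data" ]
        }
        split {
            field => "packageStateHistories"
        }
        split {
            field => "packagePartDatas"
        }
    }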

I've just tested it and this is an example of one of the documents generated after the splits.

{
              "packageKind" => "FULL",
                "errorData" => {
           "errorCode" => "BATCH_FRAMEWORK_1041",
          "stackTrace" => "[Trace]",
        "errorMessage" => "The input parameter \"nodeClassName\" must not be null."
    },
                     "size" => 0,
                 "@version" => "1",
                     "type" => "mis-package-details",
         "packagePartDatas" => {
                "lastPart" => true,
             "partCounter" => 1,
                  "fileId" => "F74F869EFFEF7FBDE053020011AC63DD",
        "serviceRequestId" => "ADFDF0645EDB486496E41B520DCF4482",
                    "size" => 126624,
               "packageId" => "1700"
    },
            "creation_date" => "2023-03-20T05:46:12.621Z",
                 "systemId" => "TIS_FULL",
         "connectXmlFileId" => "",
               "@timestamp" => 2023-03-21T04:04:41.469Z,
               "package_id" => "1700",
                   "status" => "ERROR",
     "lastModificationDate" => 1679293124721,
                   "userId" => "?",
             "creationDate" => 1679291172616,
            "last_mod_date" => "2023-03-20T06:18:44.721Z",
                     "host" => "logstash-lab",
    "packageStateHistories" => {
                "creationDate" => 1679293124721,
                   "errorData" => {
               "errorCode" => "BATCH_FRAMEWORK_1041",
              "stackTrace" => "[ST]",
            "errorMessage" => "The input parameter \"nodeClassName\" must not be null."
        },
                      "status" => "ERROR",
        "lastModificationDate" => 1679293124721
    },
                 "platform" => "P36",
                "packageId" => "1700"
}

Hello @leandrojmp ,

Thanx for your time looking into this and for your additional efforts to get the data in the required shape;
I really appreciate your timely help and the explanation of the order in which the filters should be used.
Now I've got the data as required :grinning:.

filter {
  json {
    source => "package_data"
    remove_field => [ "package_data" ]
  }
}
filter {
  split {
    field => "packageStateHistories"
  }
  split {
    field => "packagePartDatas"
  }
}

This worked!!

Many Thanx

Hello @leandrojmp

Can you please suggest one last thing: how do I convert the epoch time to UTC? Either the field should show the time in UTC, or a new field should be added that shows it in UTC.
The filter below is not working.

filter {
  split {
    field => "packageStateHistories"
  }
  split {
    field => "packagePartDatas"
  }
  date {
    match => [ "packageStateHistories.creationDate", "UNIX_MS" ]
    target => "creationDateUTC"
    add_field => {
      "packageStateHistories.creationDateUTC" => "%{creationDateUTC}"
    }
  }
  date {
    match => [ "packageStateHistories.lastModificationDate", "UNIX_MS" ]
    target => "lastModificationDateUTC"
    add_field => {
      "packageStateHistories.lastModificationDateUTC" => "%{lastModificationDateUTC}"
    }
  }
}

Thanx

You are referencing the field in the wrong way: you do not use the field.nested format in Logstash, you need to use [field][nested].

Also, you do not need add_field; you can just change the target to put the parsed date in a nested field.

It would be something like this:

date {
    match => [ "[packageStateHistories][creationDate]", "UNIX_MS", "UNIX" ]
    target => "[packageStateHistories][creationDateUTC]"
}

I added the UNIX pattern just to make sure it will parse epochs both with and without milliseconds.
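
The same pattern applies to the second timestamp; extending the example above to lastModificationDate would look like this:

    date {
        match => [ "[packageStateHistories][lastModificationDate]", "UNIX_MS", "UNIX" ]
        target => "[packageStateHistories][lastModificationDateUTC]"
    }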
