Logstash nested array JSON parsing

I have a JSON log from my application that contains elements in a nested array. Here is a sample of it:

{
    "msgs": [
        {
            "ts": "2017-09-04T07:07:46.37098Z",
            "tid": 25,
            "lvl": "Information",
            "cat": "NCR.CP.GatewayService.TransactionProcessingEngine",
            "msg": {
                "cnt": "Resolution performed."
            },
            "data": {
                "Response": {
                    "Content": {
                        "Company": {
                            "Number": 10031,
                            "Name": "Sprouts"
                        },
                        "NodeId": 0,
                        "StatusCode": 200
                    },
                    "ElapsedSeconds": 0.7458587
                }
            }
        },
        {
            "ts": "2017-09-04T07:07:49.8951815Z",
            "tid": 25,
            "lvl": "Information",
            "cat": "NCR.CP.GatewayService.TransactionProcessingEngine",
            "msg": {
                "cnt": "RSP"
            },
            "data": {
                "ResponseTransaction": {
                    "Version": null,
                    "Server": {
                        "LogicalDatacenterId": 0,
                        "PhysicalDatacenterId": 0,
                        "TransactionId": 10909,
                        "UniqueId": "G00008D4F35EDADCD368",
                        "UniversalTimestamp": "2017-09-04T07:07:45.6249373Z"
                    },
                    "State": null,
                    "Tenant": {
                        "CompanyNumber": 10031,
                        "StoreNumber": 1,
                        "HostType": {
                            "Code": 7,
                            "Name": "ProdConcordHC"
                        }
                    },
                    "Terminal": {
                        "EMVCapabilities": "Contact, ContactlessDisabled",
                        "Type": null,
                        "SerialNumber": null,
                        "EMVKernelVersion": null,
                        "EMVIdentifierCAPK": null,
                        "PINCapabilities": null
                    },
                    "PointOfSale": {
                        "CashierNumber": "1",
                        "ReferenceNumber": null
                    },
                    "LocalTimestamp": "2017-06-16T14:58:57",
                    "LocalTimeZoneOffset": null
                },
                "Request": {
                    "MessageType": null,
                    "EntryMode": null,
                    "CashbackAmount": 0.00,
                    "AllowPartialAuthorization": true,
                    "TenderType": null,
                    "TransactionType": null,
                    "ReversalType": null,
                    "ReferenceId": null,
                    "AuditId": null,
                    "AccountNumberFirstSix": null,
                    "AccountNumberLastFour": null,
                    "AccountNumberLength": null,
                    "CardType": null,
                    "PreviousTransaction": null,
                    "Card": null,
                    "Check": null,
                    "Identification": null,
                    "Currency": null,
                    "Amount": null,
                    "TipAmount": null,
                    "Host": null
                },
                "Response": {
                    "ErrorCode": 96,
                    "ErrorMessage": "E2G",
                    "IsApproved": false,
                    "ResponseCode": "96",
                    "HostResponseCode": "E2G",
                    "HostResponseMessage": "EDIT ER:OPT DATA",
                    "ApprovedAmount": 0.00,
                    "ApprovedCashbackAmount": 0.0,
                    "IsApprovedLocally": false,
                    "AuthorizationCode": null,
                    "ResponseMessage": null,
                    "BalanceAmount": null,
                    "Check": null,
                    "Currency": null,
                    "Card": null,
                    "Host": null
                },
                "Queue": null,
                "SensitiveData": null,
                "PreviousTransaction": null,
                "Trace": null
            }
        }
    ],
    "RequestId": "G00008D4F35EDADCD368",
    "RequestPath": "/Processor",
    "Action": "http://servereps.mtxeps.com/TransactionService/SendTransaction",
    "Contract": "NCR.CP.GatewayService.IProcessorTransactionContract",
    "OperationName": "SendTransaction",
    "MethodName": "SendTransaction"
}

Here the "msgs" element contains a nested array of objects, so to create a field for every element inside the array I tried both the mutate and the split approaches.

Mutate filter:

mutate {
  add_field => {
    "ts"   => "%{[doc][msgs][0][ts]}"
    "tid1" => "%{[doc][msgs][0][tid]}"
    "eid1" => "%{[doc][msgs][0][eid]}"
    "lvl1" => "%{[doc][msgs][0][lvl]}"
    "cat1" => "%{[doc][msgs][0][cat]}"
    "msg1" => "%{[doc][msgs][0][msg]}"
    "data" => "%{[doc][msgs][1][data]}"
    "actual-message" => "%{[doc][msgs][3][data][Trace]}"
    "error"          => "%{[doc][msgs][5][ex][exs][0][ec]}"
    "error-type"     => "%{[doc][msgs][5][ex][exs][0][typ]}"
  }
}

This approach is static, but my logs occur dynamically: the position of the error is not fixed, and it may appear as the 6th array element, or the 7th, or the 8th. So I abandoned this approach and tried the split filter instead.

Split filter in my Logstash config:

input {
  file {
    path => "C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.slog"
  }
}

filter {
  split {
    field => "msg"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "g_index"
  }
}

Now it shows this error:
[2017-09-19T17:40:07,557][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:msg is of type = NilClass

So how can I turn every element of this JSON into a field, so that I can visualize it in Kibana by applying metrics in a dashboard?

In your add_field approach you specified [doc][msgs] as your path to the array but in the split approach you specify "msg". If [doc][msgs] worked for add_field (but the order is not dependable) then, surely, you should specify the same path in the split filter, no?
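That is, something like:

split {
  field => "[doc][msgs]"
}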

My Logstash configuration:
input {
  beats {
    port => 5043
  }
}

filter {
  json {
    source => "message"
    target => "doc"
  }

  split {
    field => "[doc][msgs]"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "l_index"
  }
}

Output in the Logstash logs:
[2017-09-20T14:53:24,980][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:[doc][msgs] is of type = NilClass

What is your upstream beats configuration? How is it getting the original content?

Here is the filebeat.yml config; document_type: gatewaylogs is what I am using right now.
I know document_type is deprecated, but this .yml is working.

filebeat.prospectors:
  -
    paths:
      - E:\ELK-STACK\logstash-tutorial-dataset.log
    input_type: log
    document_type: apachelogs

  -
    paths:
      - C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.slog
    input_type: log
    document_type: gatewaylogs

  -
    paths:
      - C:\ECLIPSE WORKSPACE\authenticlogsparser\mylogs.txt
    input_type: log
    document_type: authenticlogs

  -
    paths:
      - E:\ELK-STACK\mutatelogs.log
    input_type: log
    document_type: mutatelogs

output.logstash:
  hosts: ["localhost:5043"]

OK.

You should take a step-by-step approach.

I presume that you want to eventually use if conditions to apply different filter stages to the various types of input.

Initially, let's focus on the gatewaylogs.

Using the file input, as you did with the mutate add_field approach, try:

input {
  file {
    path => "C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.test.slog"
    start_position => "beginning"
    ignore_older => 0
    sincedb_path => "C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.test.sincedb"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

Put one or two lines of the original C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.slog file into the test file C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.test.slog.
You will need to delete the sincedb file C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.test.sincedb after each Logstash run.

Post one of the events from the console output here. Use triple backticks on a separate line on each side of the text you copy into the Discuss post.

{
          "path" => "E:\\ELK-STACK\\logstash-5.5.1\\logstash-5.5.1\\bin\\GatewayService-Processor.Transactions-20170830.test.slog",
    "@timestamp" => 2017-09-20T10:31:34.305Z,
      "@version" => "1",
          "host" => "WINAS251170-RZG",
       "message" => "{\"msgs\":[{\"ts\":\"2017-08-30T06:21:11.6956712Z\",\"tid\":17,\"eid\":1,\"lvl\":\"Information\",\"cat\":\"Microsoft.AspNetCore.Hosting.Internal.WebHost\",\"msg\":{\"cnt\":\"Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 837\",\"Protocol\":\"HTTP/1.1\",\"Method\":\"POST\",\"ContentType\":\"text/xml; charset=utf-8\",\"ContentLength\":837,\"Scheme\":\"http\",\"Host\":\"localhost:20001\",\"PathBase\":\"\",\"Path\":\"/Processor\",\"QueryString\":\"\"}}}],\"RequestId\":\"G00008D4EF6F3BDFF5FC\",\"RequestPath\":\"/Processor\",\"Action\":\"http://servereps.mtxeps.com/TransactionService/SendTransaction\",\"Contract\":\"NCR.CP.GatewayService.IProcessorTransactionContract\",\"OperationName\":\"SendTransaction\",\"MethodName\":\"SendTransaction\"}\r"
}
Here is the console output.

Great. So the message field is JSON-encoded text. I am presuming that exactly the same content will be read by Filebeat and sent to Logstash.

Now you can add the JSON filter.

input {
  file {
    path => "C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.slog"
    start_position => "beginning"
    ignore_older => 0
    sincedb_path => "C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170830.sincedb"
  }
}

filter {
  json {
    source => "message"
    target => "doc"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

Post one entry from the console output again.

[2017-09-20T16:28:19,919][WARN ][logstash.filters.json    ] Error parsing json {:source=>"message", :raw=>"gs\":[{\"ts\":\"2017-08-30T06:21:11.6956712Z\",\"tid\":17,\"eid\":1,\"lvl\":\"Information\",\"cat\":\"Microsoft.AspNetCore.Hosting.Internal.WebHost\",\"msg\":{\"cnt\":\"Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 837\",\"Protocol\":\"HTTP/1.1\",\"Method\":\"POST\",\"ContentType\":\"text/xml; charset=utf-8\",\"ContentLength\":837,\"Scheme\":\"http\",\"Host\":\"localhost:20001\",\"PathBase\":\"\",\"Path\":\"/Processor\",\"QueryString\":\"\"}},{\"ts\":\"2017-08-30T06:21:11.6966713Z\",\"tid\":17,\"{\"ReferenceNumber\":\"VT00000\",\"Cashier]NCR.CP.SDK.TenantResolutionClient\",\"msg\":{\"cnt\":\"SEND\"},\"data\":{\"RequestUri\":\"http://winas251170-rzg:20002/TenantConfigurationService/v1/Resolution\",\"Method\":\"POST\",\"Headers\":{\"Date\":\"Wed, 30 Aug 2017 06:21:11 GMT\",\"User-Agent\":\"ConnectedPayments/1.0\",\"x-ms-request-root-id\":\"f75c3886-4e6a3cad0c25ce83\",\"x-ms-request-id\":\"|f75c3886-4e6a3cad0c25ce83.1.\",\"Request-Id\":\"|f75c3886-4e6a3cad0c25ce83.1.\",\"Correlation-Context\":\"UniqueId=G00008D4EF6F3BDFF5FC\",\"Content-Type\":\"application/json; charset=utf-8\",\"Content-Length\":\"182\"}}},{\"ts\":\"2017-08-30T06:21:17.7362752Z\",\"tid\":5,\"lvl\":\"Information\",\"cat\":\"NCR.CP.GatewayService.TransactionProcessingEngine\",\"msg\":{\"cnt\":\"Resolution call failed.\"},\"ex\":{\"exs\":[{\"typ\":\"HttpRequestException\",\"ec\":\"0x80072EFD\",\"msg\":\"An error occurred while sending the request.\"},{\"typ\":\"WinHttpException\",\"ec\":\"0x80072EFD\",\"msg\":\"A connection with the server could not be established\"}],\"st\":[\"at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\",\"at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\",\"at System.Net.Http.HttpClient.<FinishSendAsync>d__58.MoveNext()\",\"--- End of stack trace from previous location where exception was thrown ---\",\"at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\",\"at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\",\"at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()\",\"at NCR.CP.SDK.JsonServiceHttpClient.<ExecuteRequest>d__25.MoveNext() in E:\\\\CPAUTHCODE\\\\authenticintegration\\\\DotNetCore\\\\ClassLibraries\\\\NCR.CP.SDK\\\\Http\\\\JsonServiceHttpClient.cs:line 225\",\"--- End of stack trace from previous location where exception was thrown ---\",\"at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\",\"at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\",\"at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()\",\"at NCR.CP.SDK.TenantResolutionClient.<ResolveTenantTransactionProcessingConfiguration>d__1.MoveNext() in E:\\\\CPAUTHCODE\\\\authenticintegration\\\\DotNetCore\\\\ClassLibraries\\\\NCR.CP.SDK\\\\Services\\\\TenantConfigurationService\\\\TenantResolutionClient.cs:line 35\",\"--- End of stack trace from previous location where exception was thrown ---\",\"at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\",\"at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\",\"at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()\",\"at 
NCR.CP.GatewayService.TransactionProcessingEngine.<ResolveTenantProcessingConfiguration>d__6.MoveNext() in E:\\\\CPAUTHCODE\\\\authenticintegration\\\\DotNetCore\\\\Services\\\\GatewayService\\\\Code\\\\TransactionProcessingEngine.cs:line 222\",\"--- End of stack trace from previous location where exception was thrown ---\",\"at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\",\"at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\",\"at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1.ConfiguredTaskAwaiter.GetResult()\",\"at NCR.CP.GatewayService.TransactionProcessingEngine.<Execute>d__4.MoveNext() in E:\\\\CPAUTHCODE\\\\authenticintegration\\\\DotNetCore\\\\Services\\\\GatewayService\\\\Code\\\\TransactionProcessingEngine.cs:line 101\"]}},{\"ts\":\"2017-08-30T06:21:17.814283Z\",\"tid\":5,\"lvl\":\"Information\",\"cat\":\"NCR.CP.GatewayService.TransactionProcessingEngine\",\"msg\":{\"cnt\":\"RSP\"},\"data\":{\"ResponseTransaction\":{\"Version\":\"1.0\",\"Server\":{\"LogicalDatacenterId\":0,\"PhysicalDatacenterId\":0,\"UniqueId\":\"G00008D4EF6F3BDFF5FC\",\"UniversalTimestamp\":\"2017-08-30T06:21:11.6976714Z\"},\"State\":{\"Status\":\"Resolved\"},\"Tenant\":null,\"PointOfInteraction\":null,\"Request\":null,\"Response\":{\"ErrorCode\":600,\"ErrorMessage\":\"InternalError\",\"IsApproved\":null},\"Trace\":{\"ElapsedSeconds\":6.1164949,\"Path\":[{\"NodeId\":0,\"ElapsedSeconds\":6.1164949,\"Type\":\"Gateway\",\"Value\":\"GAT\"}]}}}},{\"ts\":\"2017-08-30T06:21:17.8152831Z\",\"tid\":5,\"lvl\":\"Information\",\"cat\":\"NCR.CP.GatewayService.ProcessorTransactionService\",\"msg\":{\"cnt\":\"RSP\"},\"data\":{\"Trace\":\"Aa547<1C>Ab100<1C>Ac000201<1C>Ae70001<1C>GkGAT<1C>Jb900<1C>Mg0<1C>Yb0<1C>YcG00008D4EF6F3BDFF5FC\"}},{\"ts\":\"2017-08-30T06:21:17.834285Z\",\"tid\":17,\"lvl\":\"Information\",\"cat\":\"NCR.CP.Service.ServiceHostMiddleware\",\"msg\":{\"cnt\":\"RSP\"},\"data\":{\"Headers\":{\"Date\":\"Wed, 30 Aug 2017 06:21:17 GMT\",\"Transfer-Encoding\":\"chunked\",\"Content-Type\":\"text/xml; charset=utf-8\",\"X-Unique-Id\":\"G00008D4EF6F3BDFF5FC\",\"X-Node-Id\":\"0\"}}},{\"ts\":\"2017-08-30T06:21:17.834285Z\08D4EF6F3BDFF5FC\",\"RequestPath\":\"/Processor\",\"Action\":\"http://servereps.mtxeps.com/TransactionService/SendTransaction\",\"Contract\":\"NCR.CP.GatewayService.IProcessorTransactionContract\",\"OperationName\":\"SendTransaction\",\"MethodName\":\"SendTransaction\"}\r", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'gs': was expecting ('true', 'false' or 'null')
 at [Source: [B@2e42d86; line: 1, column: 4]>}
{
          "path" => "C:\\Logs\\GatewayService\\GatewayService-Processor.Transactions-20170830.slog",
    "@timestamp" => 2017-09-20T10:58:19.906Z,
      "@version" => "1",
          "host" => "WINAS251170-RZG",
       "message" => "gs\":[{\"ts\":\"2017-08-30T06:21:11.6956712Z\",\"tid\":17,\"eid\":1,\"lvl\":\"Information\",\"cat\":\
          "tags" => [
        [0] "_jsonparsefailure"
    ]
}
{
          "path" => "C:\\Logs\\GatewayService\\GatewayService-Processor.Transactions-20170830.slog",
    "@timestamp" => 2017-09-20T10:58:19.908Z,
      "@version" => "1",
          "host" => "WINAS251170-RZG",
           "doc" => nil,
       "message" => "\r"
}

Here is the console output.

In the above output you have a _jsonparsefailure because the message value is not valid JSON.

Further, in the previous console output the JSON-encoded text is malformed too. I have pretty-formatted it in a code editor; you can see that there is one too many closing } in the JSON.

{
  "msgs":[
    {
      "ts":"2017-08-30T06:21:11.6956712Z",
      "tid":17,
      "eid":1,
      "lvl":"Information",
      "cat":"Microsoft.AspNetCore.Hosting.Internal.WebHost",
      "msg":{
        "cnt":"Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 837",
        "Protocol":"HTTP/1.1",
        "Method":"POST",
        "ContentType":"text/xml; charset=utf-8",
        "ContentLength":837,
        "Scheme":"http",
        "Host":"localhost:20001",
        "PathBase":"",
        "Path":"/Processor",
        "QueryString":""
      }
    }
  } <------ MALFORMED JSON - NO CORRESPONDING "{"
  ],
  "RequestId":"G00008D4EF6F3BDFF5FC",
  "RequestPath":"/Processor",
  "Action":"http://servereps.mtxeps.com/TransactionService/SendTransaction",
  "Contract":"NCR.CP.GatewayService.IProcessorTransactionContract",
  "OperationName":"SendTransaction",
  "MethodName":"SendTransaction"
}

I would question how the msgs field is built because the rest of the field names at the root level of the JSON object are PascalCase formatted but "msgs" is not.

You will not be able to use the add_field or split if the json filter cannot parse the message field.
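As an aside, once the json filter is in play you may also want to keep unparseable events out of Elasticsearch. A sketch of a common guard pattern (the json filter adds the _jsonparsefailure tag itself; dropping is just one option, you could route those events to a dead-letter index instead):

filter {
  json {
    source => "message"
    target => "doc"
  }
  # skip any event whose message was not valid JSON
  if "_jsonparsefailure" in [tags] {
    drop { }
  }
}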

         {
                "@timestamp" => 2017-09-21T04:59:42.169Z,
                    "offset" => 75204,
                  "@version" => "1",
                "input_type" => "log",
                      "beat" => {
                    "hostname" => "WINAS251170-RZG",
                        "name" => "WINAS251170-RZG",
                     "version" => "5.5.1"
                },
                      "host" => "WINAS251170-RZG",
                       "doc" => {
                      "RequestPath" => "/Processor",
                             "msgs" => [
                        [0] {
                            "msg" => {
                                         "Path" => "/Processor",
                                       "Scheme" => "http",
                                     "PathBase" => "",
                                  "ContentType" => "text/xml; charset=utf-8",
                                  "QueryString" => "",
                                          "cnt" => "Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 837",
                                         "Host" => "localhost:20001",
                                       "Method" => "POST",
                                "ContentLength" => 837,
                                     "Protocol" => "HTTP/1.1"
                            },
                            "eid" => 1,
                            "lvl" => "Information",
                            "cat" => "Microsoft.AspNetCore.Hosting.Internal.WebHost",
                            "tid" => 17,
                             "ts" => "2017-08-30T06:21:11.6956712Z"
                        },

I successfully got the console output in a JSON tree structure after applying the json filter as you suggested, but I used beats as the input instead of the file input, because the file input was showing a JSON parse failure.

After this I applied the split filter on "msgs", and it shows this error:

[2017-09-21T10:29:48,182][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:msgs is of type = NilClass

OK, nearly there but now we are back to the [doc][msgs] vs msgs for the split field config option problem we had before.

NOTE: Looking at the split filter source code, it is not going to get you what you want.

By default it will create a new event per array entry but replace the field value with each array element. This will spread your array entries across multiple events. Not desired.
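To illustrate (a sketch of the default behaviour, not output from your data):

# one incoming event:
#   { "doc" => { "msgs" => [ {entry0}, {entry1}, {entry2} ] } }
#
# after: split { field => "[doc][msgs]" }
#
# ...you get three separate events, each with one object where the array was:
#   { "doc" => { "msgs" => {entry0} } }
#   { "doc" => { "msgs" => {entry1} } }
#   { "doc" => { "msgs" => {entry2} } }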

If I read your question correctly, various array entries contain important relevant values (in variable array positions) that you need to extract into fields at the root of the event.

At the moment, I think you have two options:

  1. The ruby filter - write some code. Not easy (a rough sketch follows below).
  2. Use a long list of if conditionals to test for the existence of a value at a certain location and then use add_field because you know the ordinal position of the object in the array.
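For #1, here is a rough sketch of a ruby filter, assuming the json filter has already parsed the document into [doc], and reusing the actual-message, error and error-type field names from your mutate attempt. Treat it as a starting point, not a finished implementation:

ruby {
  code => "
    msgs = event.get('[doc][msgs]')
    if msgs.is_a?(Array)
      msgs.each do |m|
        next unless m.is_a?(Hash)
        # the RSP entry carries the trace, wherever it sits in the array
        if m['msg'] && m['msg']['cnt'] == 'RSP' && m['data'] && m['data']['Trace']
          event.set('actual-message', m['data']['Trace'])
        end
        # an entry with an ex.exs array carries the error code and type
        if m['ex'] && m['ex']['exs'].is_a?(Array) && m['ex']['exs'][0]
          event.set('error', m['ex']['exs'][0]['ec'])
          event.set('error-type', m['ex']['exs'][0]['typ'])
        end
      end
    end
  "
}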

For #2 with this data...

{
  "msgs": [
    {
      "ts": "2017-09-04T07:07:46.37098Z",
      "tid": 25,
      "lvl": "Information",
      "cat": "NCR.CP.GatewayService.TransactionProcessingEngine",
      "msg": {
        "cnt": "Resolution performed."
      },
      "data": {
        "Response": {
          "Content": {
            "Company": {
              "Number": 10031,
              "Name": "Sprouts"
            },
            "NodeId": 0,
            "StatusCode": 200
          },
          "ElapsedSeconds": 0.7458587
        }
      }
    },
    {
      "ts": "2017-09-04T07:07:49.8951815Z",
      "tid": 25,
      "lvl": "Information",
      "cat": "NCR.CP.GatewayService.TransactionProcessingEngine",
      "msg": {
        "cnt": "RSP"
      },
      "data": {
        "ResponseTransaction": {
          "Version": null,
          "Server": {
            "LogicalDatacenterId": 0,
            "PhysicalDatacenterId": 0,
            "TransactionId": 10909,
            "UniqueId": "G00008D4F35EDADCD368",
            "UniversalTimestamp": "2017-09-04T07:07:45.6249373Z"
          },
          "State": null,
          "Tenant": {
            "CompanyNumber": 10031,
            "StoreNumber": 1,
            "HostType": {
              "Code": 7,
              "Name": "ProdConcordHC"
            }
          },
          "Terminal": {
            "EMVCapabilities": "Contact, ContactlessDisabled",
            "Type": null,
            "SerialNumber": null,
            "EMVKernelVersion": null,
            "EMVIdentifierCAPK": null,
            "PINCapabilities": null
          },
          "PointOfSale": {
            "CashierNumber": "1",
            "ReferenceNumber": null
          },
          "LocalTimestamp": "2017-06-16T14:58:57",
          "LocalTimeZoneOffset": null
        },
        "Request": {
          "MessageType": null,
          "EntryMode": null,
          "CashbackAmount": 0.00,
          "AllowPartialAuthorization": true,
          "TenderType": null,
          "TransactionType": null,
          "ReversalType": null,
          "ReferenceId": null,
          "AuditId": null,
          "AccountNumberFirstSix": null,
          "AccountNumberLastFour": null,
          "AccountNumberLength": null,
          "CardType": null,
          "PreviousTransaction": null,
          "Card": null,
          "Check": null,
          "Identification": null,
          "Currency": null,
          "Amount": null,
          "TipAmount": null,
          "Host": null
        },
        "Response": {
          "ErrorCode": 96,
          "ErrorMessage": "E2G",
          "IsApproved": false,
          "ResponseCode": "96",
          "HostResponseCode": "E2G",
          "HostResponseMessage": "EDIT ER:OPT DATA",
          "ApprovedAmount": 0.00,
          "ApprovedCashbackAmount": 0.0,
          "IsApprovedLocally": false,
          "AuthorizationCode": null,
          "ResponseMessage": null,
          "BalanceAmount": null,
          "Check": null,
          "Currency": null,
          "Card": null,
          "Host": null
        },
        "Queue": null,
        "SensitiveData": null,
        "PreviousTransaction": null,
        "Trace": null
      }
    }
  ],
  "RequestId": "G00008D4F35EDADCD368",
  "RequestPath": "/Processor",
  "Action": "http://servereps.mtxeps.com/TransactionService/SendTransaction",
  "Contract": "NCR.CP.GatewayService.IProcessorTransactionContract",
  "OperationName": "SendTransaction",
  "MethodName": "SendTransaction"
}

You test like this.

if [doc][msgs][0][msg][cnt] == "RSP" {
  # we know element 0 is the RSP type record
  mutate {
    add_field => {
      "actual-message" => "%{[doc][msgs][0][data][Trace]}"
    }
  }
} else if [doc][msgs][1][msg][cnt] == "RSP" {
  # same mutate/add_field, but with index 1
} else if [doc][msgs][2][msg][cnt] == "RSP" {
  # same again, with index 2
} else if [doc][msgs][3][msg][cnt] == "RSP" {
  # ...and so on, for as many positions as can occur
} else {
  # no RSP entry found at the tested positions
}
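Note that only the first branch that matches is applied, so this assumes the RSP entry appears at most once per event.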

I have used:

split {
  field => "[doc][msgs]"
}

and this worked: it created an event for each element in the array, so whether or not an element exists, a field is created in Kibana, with a null value if the element doesn't exist and a real value if it does.

This is what I want.
Why do you disagree with the use of split?
And one more question:
1) I am stuck creating a LINE GRAPH for ElapsedSeconds.

"Trace": {
  "ElapsedSeconds": 0.3787265,
  "Path": [{
    "NodeId": 0,
    "ElapsedSeconds": 0.3787265,
    "Type": "Gateway",
    "Value": "GAT"
  }]
}

I have extracted this field, [doc][msgs][responsetransaction][response][trace][elapsedseconds], and am creating a line graph that should show the real-time rise and fall of elapsed seconds every thirty minutes.

For this, on the Y-axis I used the TOP HIT metric aggregation with terms on [doc][msgs][responsetransaction][response][trace][elapsedseconds] (this field), and on the X-axis I used a date histogram with a custom interval of 30 minutes, but it didn't work:

the Y-axis is not displaying the values. If you understand what I said, please suggest an opinion on it too.
THANKS :slight_smile:


Because I thought you wanted one event with some fields from each array object cherry-picked into the same event, and then to throw the array away. It seemed that way from your attempt to use add_field and your description of how the element order was a problem.

Marking as solved.

Please open a discussion on the Kibana channel for this.


OK, thanks @guyboertje
