Nest a JSON object from CSV

I have a CSV with a field called cart. Although the file imports correctly, the cart field creates new fields for each of the nested products inside the cart rather than a single cart field.

431|2017-01-26 10:08:57|example@example.com|firstName|lastName|GBP|GB|2015-06-14 07:16:39|43.88|2017-01-24 13:45:28|f|{"18272":{"19208":"1","19210":"1"},"18218":{"19099":"1"},"18783":{"19249":"2"}}|5|GBP

The cart field looks like this: the top-level number is the product ID, the second-level number is the variant ID, and the value is the quantity in the cart.

{
"18272":
{
"19208":"1",
"19210":"1"
},
"18218":
{
"19099":"1"
},
"18783":
{
"19249":"2"
}
}
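The transformation I'm after — from this nested hash to a flat list that fits a single field — can be sketched in plain Ruby. `flatten_cart` is just a hypothetical helper name, and the `product`/`variant`/`qty` key names are my own choice:

```ruby
require 'json'

# Flatten the nested cart hash {product => {variant => qty}} into a flat
# array with one entry per variant, which maps to a single "cart" field.
def flatten_cart(cart_json)
  JSON.parse(cart_json).flat_map do |product_id, variants|
    variants.map do |variant_id, qty|
      { "product" => product_id, "variant" => variant_id, "qty" => qty.to_i }
    end
  end
end

items = flatten_cart('{"18272":{"19208":"1","19210":"1"},"18218":{"19099":"1"},"18783":{"19249":"2"}}')
puts items.length  # 4 variant entries across 3 products
```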

Using the config I have, I get fields in Elasticsearch that look like cart.18272.19208, when I need a single field called cart that includes all the products and variants.

This is my config file.

input {
  file {
    path => "/etc/logstash/files/*.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
      columns => ['userId', 'profileUpdated', 'email', 'lastName', 'firstName', 'userCurrency', 'billingCountry', 'registeredDate', 'cartValue', 'cartUpdated', 'kitInCart', 'cart', 'itemsInCart' ,'cartCurrency']
      separator => "|"
      quote_char => "'"
  }
  date {
    match => [ "registeredDate", "YYYY-MM-dd HH:mm:ss" ]
  }
  json {
    source => "cart"
    target => "cart"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
     hosts => "elasticsearch:9200"
     index => "users"
  }
}

There's no out-of-the-box filter for this; you'll have to use a ruby filter and write some Ruby code.
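A minimal sketch of what such a ruby filter might look like, placed after the json filter in the config above. This assumes a Logstash version with the `event.get`/`event.set` API (5.x and later), and the `product`/`variant`/`qty` field names are illustrative choices, not anything Logstash mandates:

```
filter {
  ruby {
    # Flatten the parsed cart hash into an array of objects so
    # Elasticsearch stores one "cart" field instead of one field
    # per product/variant combination.
    code => '
      cart = event.get("cart")
      if cart.is_a?(Hash)
        items = []
        cart.each do |product_id, variants|
          variants.each do |variant_id, qty|
            items << { "product" => product_id, "variant" => variant_id, "qty" => qty.to_i }
          end
        end
        event.set("cart", items)
      end
    '
  }
}
```

With this in place, the rubydebug output should show cart as an array of small objects rather than nested cart.18272.19208-style fields.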
