Adding JSON objects with Logstash but I get strings

Hello,

I'm trying to add objects to an array named "lineas". This is my initial document:

[quote="Alberto_Diaz, post:1, topic:105440, full:true"]
Hello,

I'm trying to add objets to an Array named "lineas". This is my initial document:
{
  "lineas": [],
  "ticket": 4527,
  "num_typett": 1,
  "importe": 76.64,
  "fecha_ticket": "2017-10-23T22:00:00.000Z",
  "importe_servicios": 0,
  "importe_sin_servicios": 76.64,
  "fecha": "2017-10-23T22:00:00.000Z",
  "hora_ticket": 203440,
  "tienda": 4,
  "@timestamp": "2017-10-27T10:30:26.206Z",
  "num_bu": 2,
  "@version": "1",
  "mes": "10",
  "id": "445271024102017",
  "caja": 10,
  "anio": "2017"
}

Then I try to update the document with this kind of information from an SQL query:
     id                  linea
1    445271024102017     {"numlig":[{"articulo":"13909154","cantidad":1.0000,"importe":5.1900}]}
2    445271024102017     {"numlig":[{"articulo":"13823740","cantidad":1.0000,"importe":5.4500}]}

I want to add each new "linea" to "lineas"

I tried this but it doesn't work:

filter {
  json {
    source => "linea"
  }
  json {
    source => "lineas"
  }
}

output {
  # stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://deslmelk01:9200"
    action => "update"
    index => "tickets-%{anio}.%{mes}"
    document_id => "%{id}"
    script_lang => "painless"
    script_type => "inline"
    script => 'ctx._source.lineas.add("%{[numlig]}");'
  }
}

but I get an array of strings in "lineas" instead of an array of objects:

{
  "lineas": [
    "{importe_linea=5.1900, articulo=13909154, cantidad=1.0000}",
    "{importe_linea=5.4500, articulo=13823740, cantidad=1.0000}"
  ],
  "ticket": 4527,
  "num_typett": 1,
  "importe": 76.64,
  "fecha_ticket": "2017-10-23T22:00:00.000Z",
  "importe_servicios": 0,
  "importe_sin_servicios": 76.64,
  "fecha": "2017-10-23T22:00:00.000Z",
  "hora_ticket": 203440,
  "tienda": 4,
  "@timestamp": "2017-10-27T10:30:26.206Z",
  "num_bu": 2,
  "@version": "1",
  "mes": "10",
  "id": "445271024102017",
  "caja": 10,
  "anio": "2017",
  "numlig": [
    {
      "importe_linea": 4.59,
      "articulo": "13607965",
      "cantidad": 1
    }
  ]
}

What am I doing wrong?

Thanks!

This looks more like a Logstash question, so I moved the topic over.

You might want to take a look at the convert option of the Logstash mutate filter.

hope this helps!

--Alex

Thanks spinscale, but the mutate convert option does not seem to support converting to JSON:

"Valid conversion targets are: integer, float, string, and boolean."

Your JDBC lookup returns a JSON-encoded string in the linea field. After that, your first json filter runs, but because you have not specified a target field, the filter decodes the JSON string into the root of the event. Note the numlig field.

Try using the target option and only one JSON filter.
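
For example, a single json filter with a target could look like this (the target field name is only an illustration):

filter {
  json {
    source => "linea"
    # decode into a dedicated field instead of the event root
    target => "linea_obj"
  }
}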

Reformatted event reposted for reference.

{
  "lineas": [
    "{importe_linea=5.1900, articulo=13909154, cantidad=1.0000}"
    "{importe_linea=5.4500, articulo=13823740, cantidad=1.0000}",
  ],
  ...
  ...
  "anio": "2017",
  "numlig": [
    {
      "importe_linea": 4.59,
      "articulo": "13607965",
      "cantidad": 1
    }
  ]
}

Hi guyboertje,

thanks for your help, but just adding a target field doesn't solve it: each new "numlig" overwrites "lineas" instead of being appended. I want to add every line returned by my SQL query.

Is there any way to do it?

Thanks

Not really. I read the code of the json filter, and I don't think this will work when a JDBC filter fetches multiple records where one field is JSON, unless you use nested field notation to point at each JSON string individually.

The JDBC filter will create an array of objects with id and linea fields. If this array has only 2 elements, then you can do this (assume that the JDBC filter puts the array in the lines field):

filter {
  if [lines][0][linea] {
    json {
      source => "[lines][0][linea]"
      target => "[lineas][0]"
    }
  }
  if [lines][1][linea] {
    json {
      source => "[lines][1][linea]"
      target => "[lineas][1]"
    }
  }
}

If that does not work, then you may need to concatenate the results in SQL so that one row contains the JSON combined into a JSON-encoded array,
e.g. '[{"numlig":[{"articulo":"13909154","cantidad":1.0000,"importe":5.1900}]},{"numlig":[{"articulo":"13823740","cantidad":1.0000,"importe":5.4500}]}]'

Hi guyboertje,

this "if [lines][0][linea]" can not be a solution because we can have more than 500 lines!
Concatenatenation in sql could be a solution, but we should use a recursive query in TERADATA and it is not recomended due to the performance.

I tried the following. I changed the SQL query to this:
     id                  num_ligtic    linea
1    445271024102017     1             {"articulo":"13909154","cantidad":1.0000,"importe":5.1900}
2    445271024102017     2             {"articulo":"13823740","cantidad":1.0000,"importe":5.4500}

and the json filter to this:

json {
  source => "linea"
  target => "[lineas][%{num_ligtic}]"
}

but %{num_ligtic} is not recognized in the json filter's target option, and I get:

"lineas": {
            "%{num_ligtic}": {
              "importe_linea": 5.45,
              "articulo": "13823740",
              "cantidad": 1
            }
          }

Indeed, even if it worked, the result would be:

"lineas": {
            "0": {
             "importe_linea": 5.45,
              "articulo": "13823740",
              "cantidad": 1
            },
         "1": {
             "importe_linea": 5.45,
              "articulo": "13823740",
              "cantidad": 1
            }
          }

but my expected output should be:

  "lineas": {
         {
         "importe_linea": 5.45,
          "articulo": "13823740",
          "cantidad": 1
        },
       {
         "importe_linea": 5.45,
          "articulo": "13823740",
          "cantidad": 1
        }
      }

Finally, I think I will use the Java API to do this, because I'm not able to achieve it with Logstash.

Thank you very much for your help.
