Make logstash filter map field differently based on the value

Hi everyone!
I'm using the ELK 6 stack, and I have a transactions JSON structure like so:

[
{
  "fieldId": "PK_NEWJOURNAL",
  "fieldType": "NUMBER",
  "fieldValue": "1235543",
  "fieldChanged": "Y"
},
{
  "fieldId": "OFFICE",
  "fieldType": "CHAR",
  "fieldValue": "NULL",
  "fieldChanged": "Y"
},
{
  "fieldId": "UPDATE_DATE",
  "fieldType": "DATE",
  "fieldValue": "2017-12-07 05:09:54+03:000",
  "fieldChanged": "Y"
},
{
  "fieldId": "ENDDATE",
  "fieldType": "DATE",
  "fieldValue": "NULL",
  "fieldChanged": "Y"
},
{
  "fieldId": "USERNAME",
  "fieldType": "VARCHAR2",
  "fieldValue": "blah123",
  "fieldChanged": "Y"
},
....
]

Now, I want Logstash/Elasticsearch to map "fieldValue" according to the type given in "fieldType": if the fieldType in the same object is "VARCHAR2", then "fieldValue" should be saved as text, and if it's "NUMBER", it should be saved as a number.

Can this be done?

My problem is that currently everything is saved as text, and I can't visualize anything in Kibana.
Any help will be appreciated...

You'll have to use different fields. An Elasticsearch field needs to be mapped as either a number or a string (or some other data type); a single field can't hold both.
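One way to do that is to split the values into type-specific fields inside a ruby filter, so numbers end up in their own field and get a numeric mapping. A rough sketch (the `fields` target, the `_num`/`_date` suffix convention, and the NULL handling are my own assumptions, not something from your pipeline):

```
filter {
  # Parse the JSON array of field objects into [fields]
  json {
    source => "message"
    target => "fields"
  }
  ruby {
    code => '
      (event.get("fields") || []).each do |f|
        name  = f["fieldId"].downcase
        value = f["fieldValue"]
        next if value.nil? || value == "NULL"   # skip NULL placeholders
        case f["fieldType"]
        when "NUMBER"
          event.set("#{name}_num", value.to_f)  # numeric field, mapped as a number
        when "DATE"
          event.set("#{name}_date", value)      # let a date filter/mapping handle it
        else
          event.set(name, value)                # CHAR/VARCHAR2 stay as text
        end
      end
    '
  }
}
```

With distinct fields per type, Elasticsearch's dynamic mapping will pick up `pk_newjournal_num` as a number, and you can aggregate on it in Kibana. You may still want an explicit index template (or date filter for the `_date` fields, given the `+03:000` offset in your sample) so the mappings are predictable.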

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.