Manipulate data with grok - Logstash

Hi, I have log data like this:

2017-01-19T18:08:35+07:00 payment INFO {"user_id":0,"cart_id":"81746"}

I want to use Logstash to send this data to Elasticsearch,
but I want to manipulate the data itself before inserting it.

I used this tool to help me:
http://grokdebug.herokuapp.com/

Here is my pattern:

%{TIMESTAMP_ISO8601} %{GREEDYDATA:message} %{LOGLEVEL:log-level} %{GREEDYDATA:json}

and here is the data I got after debugging this log:

    {
      "TIMESTAMP_ISO8601": [["2017-01-19T18:08:35+07:00"]],
      "YEAR": [["2017"]],
      "MONTHNUM": [["01"]],
      "MONTHDAY": [["19"]],
      "HOUR": [["18", "07"]],
      "MINUTE": [["08", "00"]],
      "SECOND": [["35"]],
      "ISO8601_TIMEZONE": [["+07:00"]],
      "message": [["payment"]],
      "log": [["INFO"]],
      "json": [["{"user_id":0,"cart_id":"81746"}"]]
    }

What I want to do next is manipulate the data.
Let's say I want to take :message, :log-level, and :json,
combine them, and then make one JSON object with those values.

How can I do that?
Please help me.

Thank you

Given the example log line

2017-01-19T18:08:35+07:00 payment INFO {"user_id":0,"cart_id":"81746"}

what do you want the resulting event to look like? If you want a specific answer you need to be specific about what you want to do.

I want something like this:
    {
      "logtime": "2017-01-19T18:08:35+07:00",
      "message": "payment",
      "log-level": "INFO",
      "json": {
        "user_id": "0",
        "cart_id": "81746"
      }
    }

Use what you already have but add a json filter that parses the json field and stores the result back into the json field.
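
As a minimal sketch of that suggestion (keeping the grok pattern from above; the grok filter captures the trailing JSON string into a field named "json", and the json filter then parses that string and stores the object back into the same field):

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601} %{GREEDYDATA:message} %{LOGLEVEL:log-level} %{GREEDYDATA:json}" }
      }

      # Parse the JSON string captured in the "json" field and store
      # the resulting object back into that same field.
      json {
        source => "json"
        target => "json"
      }
    }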

Hi,

I did this in my Logstash conf:

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601} %{GREEDYDATA:message} %{LOGLEVEL:log-level} %{GREEDYDATA:json}" }
      }

      json { source => "message" }
    }

The data was successfully added to Elasticsearch,
but the format is not what I want.

I got this:

      "path": "/usr/local/Cellar/logstash/5.1.1/payment.json",
      "@timestamp": "2017-01-26T07:45:16.986Z",
      "log-level": "INFO",
      "@version": "1",
      "host": "Gutas-MBP",
      "json": " {\"user_id\":0,\"cart_id\":\"81746\"}",
      "message": [
        "2017-01-19T18:08:35+07:00 payment INFO  {\"user_id\":0,\"cart_id\":\"81746\"}",
        "payment"
      ]

What I want is something like this:

  "path": "/usr/local/Cellar/logstash/5.1.1/payment.json",
  "@timestamp": "2017-01-26T07:45:16.986Z",
  "log-level": "INFO",
  "@version": "1",
  "host": "Gutas-MBP",
  "json": [
        "user_id ": 0,
        "cart_id":17676,
          "TIMESTAMP_ISO8601": "2017-01-19T18:08:35+07:00",
         "message": "payment",
        "log_level": "INFO"
  ]

Use the json filter's target option to control where the parsed JSON values are stored.
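
For example (the target name "parsed_json" here is only an illustration; any field reference works):

    json {
      source => "json"
      target => "parsed_json"
    }

Without target the parsed keys are added at the root of the event; with it they are nested under the named field.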

I tried this:

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601} %{GREEDYDATA:message} %{LOGLEVEL:log-level} %{GREEDYDATA:json}" }
      }

      json {
        source => "message"
        target => "message"
      }
    }

I got something like this:

      "message": [
        "2017-01-19T18:08:35+07:00 payment INFO  {\"user_id\":0,\"cart_id\":\"81746\"}",
        "payment"
      ],

I want something like this:

    "message": {
      "logtime": "2017-01-19T18:08:35+07:00",
      "message": "payment",
      "log_level": "INFO",
      "json": {
        "user_id": "3313",
        "cart_id": "222"
      }
    }

What should I do?

Why are you parsing the message field when your JSON data is in the json field?

Secondly, you need overwrite => ["message"] in your grok filter so that it's allowed to overwrite the current contents of the message field.
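
Something along these lines (a sketch combining both points; the grok pattern is unchanged from your config):

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601} %{GREEDYDATA:message} %{LOGLEVEL:log-level} %{GREEDYDATA:json}" }
        # Let grok replace the original message with the captured value.
        overwrite => ["message"]
      }

      # The JSON payload was captured into the "json" field, so parse that.
      json {
        source => "json"
        target => "json"
      }
    }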

No, I want to combine these four fields into one big JSON object.
I want to take this log line

2017-01-19T18:08:35+07:00 payment INFO {"user_id":0,"cart_id":"81746"}

and convert it all into one JSON object.

Oh. Well, you can use a mutate filter to rename fields, including moving fields to become subfields. I also believe you can reference nested fields in the grok filter to put them in the right place from the start.

%{TIMESTAMP_ISO8601} %{GREEDYDATA:[message][message]} %{LOGLEVEL:[message][log-level]} %{GREEDYDATA:json}

The [field][subfield] notation can be used in the json filter too. See https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html for more on field references.
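
Putting that together, a sketch might look like this (untested; the [message][...] names mirror the pattern above, while the [message][logtime] capture and the remove_field cleanup are additions to match the output described earlier):

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:[message][logtime]} %{GREEDYDATA:[message][message]} %{LOGLEVEL:[message][log-level]} %{GREEDYDATA:json}" }
      }

      # Parse the JSON payload and nest the result under [message][json],
      # then drop the raw string field.
      json {
        source => "json"
        target => "[message][json]"
        remove_field => ["json"]
      }
    }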
