How to parse a log file with different log types containing JSON

Hello, I'm trying to parse a log file where different log types coexist. Hope someone can help :slight_smile: These are all the log types:

[2022-05-18 11:09:41] dev.INFO: Inserting new User...

[2022-05-18 11:09:41] dev.INFO: Insert successful for user {"idmember":0000,"idcard":"000","name":"FRANCISCO","surname_1":"GARCIA","surname_2":"SUAREZ","email":"hello@gmail.com","phone_1":"6666666","phone_2":"99999999","birthdate":"1992-02-26","gender_category_id":3,"idcard_category_id":4,"club_id":4,"status":1,"connector":"0000000a-b"}

[2022-05-18 11:09:41] dev.INFO: Insert successful for user {"idmember":0000,"idcard":"000","name":"FRANCISCO","surname_1":"GARCIA","surname_2":"SUAREZ","email":"hello@gmail.com","phone_1":"6666666","phone_2":"99999999","birthdate":"1992-02-26","gender_category_id":3,"idcard_category_id":4,"club_id":4,"status":1,"connector":"0000000a-b"} Response Status:201

[2022-05-18 11:09:41] dev.INFO: Starting new import... IdMember: '000000' Connector: 'Big Test'

This is my logstash.conf:

input {
  beats {
    port => 5044
  }
}

## Filter: parsing and filtering operations
filter {
  grok {
    match => {
      "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: (?<log>[^{]+)?%{GREEDYDATA:raw-json}"
    }
  }

  json {
    source => "raw-json"
    target => "json"
  }

  mutate {
    rename => { "message" => "raw-message" }
    rename => { "json" => "raw-json" }
  }
}

output {
    if "log1" in [tags] {      # Write log1 events to es
        elasticsearch{
          hosts => ["http://elasticsearch:9200"]
          index => "log1-%{+YYYY.MM.dd}"
        }
        stdout {}
    }
    if "log2" in [tags] {      # Write log2 events to es
        elasticsearch{
          hosts => ["http://elasticsearch:9200"]
          index => "log2-%{+YYYY.MM.dd}"
        }
        stdout {}
    }
}

With which I'm only able to parse this log:

[2022-05-18 11:09:41] dev.INFO: Insert successful for user {"idmember":0000,"idcard":"000","name":"FRANCISCO","surname_1":"GARCIA","surname_2":"SUAREZ","email":"hello@gmail.com","phone_1":"6666666","phone_2":"99999999","birthdate":"1992-02-26","gender_category_id":3,"idcard_category_id":4,"club_id":4,"status":1,"connector":"0000000a-b"}

I only need to parse the JSON when it appears, and get the rest of the message as separate fields. Any suggestions?

Thanks!

Your "JSON" is not valid JSON. Numbers cannot have leading zeroes. (Some parsers will accept leading zeroes if the number is octal, but not logstash.)

Other than that the grok seems to work OK.
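
To see what the leading-zero rule means in practice, here is a quick check using Python's standard `json` module (illustrative only, not part of logstash; the key name is a stand-in):

```python
import json

# Per the JSON spec (RFC 8259), a number may not start with extra zeroes,
# so "idmember":0000 makes the whole object unparseable.
for candidate in ['{"idmember": 123}', '{"idmember": 0000}']:
    try:
        json.loads(candidate)
        print(candidate, "-> valid")
    except json.JSONDecodeError:
        print(candidate, "-> invalid")
```

Quoting the value (`"idmember":"0000"`) or stripping the padding at the source fixes it.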

Thanks for answering. Nice to know leading zeroes are not accepted. However, the real problem is that to capture the JSON I'm using GREEDYDATA, which will grab everything. If text exists after the JSON, it will say the JSON is invalid. A log like this:

[2022-05-18 11:09:41] dev.INFO: Insert successful for user {"idmember":0000,"idcard":"000","name":"FRANCISCO","surname_1":"GARCIA","surname_2":"SUAREZ","email":"hello@gmail.com","phone_1":"6666666","phone_2":"99999999","birthdate":"1992-02-26","gender_category_id":3,"idcard_category_id":4,"club_id":4,"status":1,"connector":"30a-b"} Response Status:201

Hey, I managed to make it work. This is my filebeat.yml:

filebeat.inputs:


# Laravel Logs
- type: log
  enabled: true
  paths:
    - /var/elk/logs/*.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    logType: "laravel"
  tags: ["log1"]

setup.kibana:
  host: "http://elasticsearch:5601"

output.logstash:
  hosts: ["logstash:5044"]
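
The multiline settings are what keep a stack trace or wrapped line attached to its log entry: with `pattern: '^\['`, `negate: true`, and `match: after`, any line that does not start with `[` is appended to the preceding event. A rough Python sketch of that grouping logic (illustrative, not filebeat's actual code):

```python
import re

# Lines matching '^\[' start a new event; everything else is a
# continuation appended to the previous event (negate + match: after).
pattern = re.compile(r'^\[')

def group_events(lines):
    events = []
    for line in lines:
        if pattern.match(line) or not events:
            events.append(line)
        else:
            events[-1] += "\n" + line
    return events

lines = [
    "[2022-05-18 11:09:41] dev.INFO: Starting new import...",
    "continuation of the previous entry",
    "[2022-05-18 11:09:42] dev.INFO: Insert successful",
]
print(group_events(lines))  # two events; the middle line joins the first
```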

And my logstash.conf:

## Input: the beats plugin receives events from filebeat on this port
input {
  beats {
    port => 5044
  }
}

## Filter: parsing and filtering operations
filter {
  grok {
    match => {
      "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: (?<log>[^{]+)?%{GREEDYDATA:raw-json}"
    }
  }

  grok {
    match => {
      "raw-json" => "(?<raw-process-json>\{(.*)\})%{GREEDYDATA:response}"
    }
    tag_on_failure => [ ]
  }

  json {
    source => "raw-process-json"
    target => "json"
  }

  mutate {
    rename => { "message" => "raw-message" }
    rename => { "json" => "raw-process-json" }
  }
}

output {
    if "log1" in [tags] {      # Write log1 events to es
        elasticsearch{
          hosts => ["http://elasticsearch:9200"]
          index => "log1-%{+YYYY.MM.dd}"
        }
        stdout {}
    }
}

The first grok filter captures everything up to the JSON. The second one takes the remaining chunk, captures the JSON object into "raw-process-json", and, if there is text after it, captures that into "response". Finally, the json filter parses the JSON and mutate just renames the fields.
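
The two-stage capture can be sketched with plain Python regexes (the patterns mirror the grok expressions above; the sample line and field names are just illustrations):

```python
import re

line = ('[2022-05-18 11:09:41] dev.INFO: Insert successful for user '
        '{"name":"FRANCISCO","status":1,"connector":"30a-b"} Response Status:201')

# Stage 1: timestamp, env, severity, free text up to the first '{',
# then the rest of the line as raw_json.
first = re.match(
    r'\[(?P<timestamp>[\d\- :]+)\] (?P<env>\w+)\.(?P<severity>\w+): '
    r'(?P<log>[^{]+)?(?P<raw_json>.*)', line)

# Stage 2: greedily capture the outermost {...}, leaving any trailing
# text (e.g. "Response Status:201") in the response group.
second = re.match(r'(?P<raw_process_json>\{.*\})(?P<response>.*)',
                  first.group('raw_json'))

print(second.group('raw_process_json'))  # just the JSON object
print(second.group('response'))          # the trailing text, if any
```

Because the second pattern's `\{.*\}` is greedy, it backtracks to the last `}` in the line, so trailing text after the JSON no longer breaks the json filter.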
