Can't parse Laravel log with json content

Hello!
I'm trying to send this Laravel log to Elasticsearch using Filebeat and Logstash, and visualize it with Kibana:
[2022-05-16 12:03:50] dev.INFO: Update successful for user {"idmember":"37774", "idcard":"0000000H","name":"DAVID"}

Parsing Laravel logs without JSON works fine, but when a line contains JSON I get this error from Logstash:

logstash         | [2022-05-19T10:54:46,908][WARN ][logstash.outputs.elasticsearch][main][75e9e5bd0580746e9fdeeb944a6ea5d4a0230223d26623fb4f638ac127951ac0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"log2-2022.05.19", :routing=>nil}, {"@version"=>"1", "raw-json"=>{"name"=>"DAVID", "idmember"=>"37774", "idcard"=>"0000000H"}, "fields"=>{"logType"=>"laravel"}, "log"=>{"offset"=>11548051, "file"=>{"path"=>"/var/log/dmesg_log/userimports.log"}}, "tags"=>["log2", "beats_input_codec_plain_applied"], "host"=>{"name"=>"8bf39f63d758"}, "agent"=>{"ephemeral_id"=>"cb38ffb0-bdf1-45d5-86a5-d794a507e72d", "name"=>"8bf39f63d758", "version"=>"8.1.3", "id"=>"0c1ef28f-37b3-4274-80a7-1ad3878b498f", "type"=>"filebeat"}, "severity"=>"INFO", "event"=>{"original"=>"[2022-05-16 12:03:50] dev.INFO: Update successful for user {\"idmember\":\"37774\", \"idcard\":\"0000000H\",\"name\":\"DAVID\"}"}, "timestamp"=>"2022-05-16 12:03:50", "ecs"=>{"version"=>"8.0.0"}, "input"=>{"type"=>"log"}, "@timestamp"=>2022-05-19T10:54:40.759Z, "env"=>"dev", "raw-message"=>"[2022-05-16 12:03:50] dev.INFO: Update successful for user {\"idmember\":\"37774\", \"idcard\":\"0000000H\",\"name\":\"DAVID\"}"}], :response=>{"index"=>{"_index"=>"log2-2022.05.19", "_type"=>"_doc", "_id"=>"XE_024ABuiYBz8YErIxp", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [raw-json] of type [text] in document with id 'XE_024ABuiYBz8YErIxp'. Preview of field's value: '{idcard=0000000H, name=DAVID, idmember=37774}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:28"}}}}}
logstash         | {
logstash         |        "@version" => "1",
logstash         |        "raw-json" => {
logstash         |             "name" => "DAVID",
logstash         |         "idmember" => "37774",
logstash         |           "idcard" => "0000000H"
logstash         |     },
logstash         |          "fields" => {
logstash         |         "logType" => "laravel"
logstash         |     },
logstash         |             "log" => {
logstash         |         "offset" => 11548051,
logstash         |           "file" => {
logstash         |             "path" => "/var/log/dmesg_log/userimports.log"
logstash         |         }
logstash         |     },
logstash         |            "tags" => [
logstash         |         [0] "log2",
logstash         |         [1] "beats_input_codec_plain_applied"
logstash         |     ],
logstash         |            "host" => {
logstash         |         "name" => "8bf39f63d758"
logstash         |     },
logstash         |           "agent" => {
logstash         |         "ephemeral_id" => "cb38ffb0-bdf1-45d5-86a5-d794a507e72d",
logstash         |                 "name" => "8bf39f63d758",
logstash         |              "version" => "8.1.3",
logstash         |                   "id" => "0c1ef28f-37b3-4274-80a7-1ad3878b498f",
logstash         |                 "type" => "filebeat"
logstash         |     },
logstash         |        "severity" => "INFO",
logstash         |           "event" => {
logstash         |         "original" => "[2022-05-16 12:03:50] dev.INFO: Update successful for user {\"idmember\":\"37774\", \"idcard\":\"0000000H\",\"name\":\"DAVID\"}"
logstash         |     },
logstash         |       "timestamp" => "2022-05-16 12:03:50",
logstash         |             "ecs" => {
logstash         |         "version" => "8.0.0"
logstash         |     },
logstash         |           "input" => {
logstash         |         "type" => "log"
logstash         |     },
logstash         |      "@timestamp" => 2022-05-19T10:54:40.759Z,
logstash         |             "env" => "dev",
logstash         |     "raw-message" => "[2022-05-16 12:03:50] dev.INFO: Update successful for user {\"idmember\":\"37774\", \"idcard\":\"0000000H\",\"name\":\"DAVID\"}"
logstash         | }

This is my logstash.conf:

## Input: receive events from Filebeat through the beats input plugin on the port configured below
input {
  beats {
    port => 5044
  }
}

## Filter: data filtering operations
filter {
  grok {
    match => {
      "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: (?<log>[^{]+)?%{GREEDYDATA:raw-json}"
    }
  }

  json {
    source => "raw-json"
    target => "json"
  }

  mutate {
    rename => { "message" => "raw-message" }
    rename => { "json" => "raw-json" }
  }
}

## Output: the elasticsearch output address; multiple indices can be configured, and each can be matched in Kibana by creating an index pattern.
## You can also use the Elasticsearch container name from docker-compose.yml, e.g. "http://elasticsearch:9200" (provided both containers are on the same bridge-type Docker network).
output {
    if "log1" in [tags] {      #Write iislog log to es
        elasticsearch{
          hosts => ["http://elasticsearch:9200"]
          index => "log1-%{+YYYY.MM.dd}"
        }
        stdout {}
    }
    if "log2" in [tags] {      #Write iislog log to es
        elasticsearch{
          hosts => ["http://elasticsearch:9200"]
          index => "log2-%{+YYYY.MM.dd}"
        }
        stdout {}
    }
}

And my filebeat.yml:

filebeat.inputs:
# # Docker logs
# - type: container
#   paths:
#     - '/var/lib/docker/containers/*/*.log'
#   tags: ["docker"]

# Laravel Logs
- type: log
  enabled: true
  paths:
    - /var/elk/logs/*.log
  multiline.pattern: '^[0-9]{2}-[a-z]{3}-[0-9]{4}'
  multiline.negate: true
  multiline.match: after
  fields:
    logType: "laravel"
  tags: ["log1"]

- type: log
  enabled: true
  paths:
    - /var/log/dmesg_log/*.log
  multiline.pattern: '^[0-9]{2}-[a-z]{3}-[0-9]{4}'
  multiline.negate: true
  multiline.match: after
  fields:
    logType: "laravel"
  tags: ["log2"]

  # Enrich laravel logs with docker data (not working)
  # processors:
  # - add_docker_metadata:
  #     host: "unix:///var/run/docker.sock"

setup.kibana:
  host: "http://elasticsearch:5601"

output.logstash:
  hosts: ["logstash:5044"]

Thanks for the help

This error comes from Elasticsearch; it could not index the field. Logstash parsed the event without any problem, as you can see in your logs:

logstash         |        "raw-json" => {
logstash         |             "name" => "DAVID",
logstash         |         "idmember" => "37774",
logstash         |           "idcard" => "0000000H"
logstash         |     }

But when it sent the request to index that document in Elasticsearch, it got an error 400 back:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"log2-2022.05.19", :routing=>nil}

The most probable cause is that in your index the field raw-json is mapped as text, so it expects values like raw-json: "plaintext value". If you try to send a JSON object, like the one Logstash tried to send, it will be rejected.
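You can reproduce this kind of mapping conflict with two documents in Kibana Dev Tools (a sketch using a throwaway index name, not your real data):

# The first document makes Elasticsearch dynamically map raw-json as text
PUT mapping-conflict-test/_doc/1
{
  "raw-json": "plaintext value"
}

# A later document with an object value is then rejected with a 400
# mapper_parsing_exception, just like the one in your Logstash output
PUT mapping-conflict-test/_doc/2
{
  "raw-json": { "idmember": "37774" }
}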

What is the mapping of the raw-json field?

Get it using the following request in Kibana Dev Tools (or curl):

GET log2-2022.05.19/_mapping/field/raw-json

Thanks for answering, that's the case.

{
  "log2-2022.05.19" : {
    "mappings" : {
      "raw-json" : {
        "full_name" : "raw-json",
        "mapping" : {
          "raw-json" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          }
        }
      }
    }
  }
}

I tried to do some research, but I don't understand, firstly, why this is happening and, secondly, how I can change the field to a JSON object.

As I understand it, I can't edit the mapping of an existing index and should create a new one, but I can't figure out how to set the mapping at creation time.
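In case it's useful to someone else, this is the kind of thing I understand should work for future daily indices: an index template that maps raw-json as an object. A sketch, untested; the template name is made up and the sub-fields are taken from my logs:

PUT _index_template/log2-laravel
{
  "index_patterns": ["log2-*"],
  "template": {
    "mappings": {
      "properties": {
        "raw-json": {
          "properties": {
            "idmember": { "type": "keyword" },
            "idcard":   { "type": "keyword" },
            "name":     { "type": "keyword" }
          }
        }
      }
    }
  }
}

Existing indices keep their old mapping, so this would only take effect from the next daily index (or after a reindex).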

Solved! The JSON was being parsed incorrectly: the GREEDYDATA capture was matching plain text as well, so the field sometimes reached Elasticsearch as a string. My particular solution is described in this thread: How to parse log file with different log types with json - #4 by Robert_Garcia_Torren
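The general idea, as a minimal sketch (assuming my grok pattern above; this is not necessarily the exact config from the linked thread): only run the json filter when the captured field really looks like JSON, and drop it otherwise so the field never reaches Elasticsearch as a plain string.

filter {
  # Only attempt JSON decoding when the captured field actually starts
  # with an object; otherwise GREEDYDATA may have matched plain text
  if [raw-json] =~ /^\{/ {
    json {
      source => "raw-json"
      target => "raw-json"
      # Tag instead of failing if the payload still isn't valid JSON
      tag_on_failure => ["_rawjson_parse_failure"]
    }
  } else {
    # Remove the field so the index never sees raw-json as a string,
    # which is what caused the text-vs-object mapping conflict
    mutate { remove_field => ["raw-json"] }
  }
}

This way raw-json only ever reaches Elasticsearch as an object, so the mapping stays consistent across documents.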

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.