How to parse log files with different log types containing JSON

Hey, I managed to make it work. This is my filebeat.yml:

filebeat.inputs:


# Laravel Logs
- type: log
  enabled: true
  paths:
    - /var/elk/logs/*.log
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
  fields:
    logType: "laravel"
  tags: ["log1"]

setup.kibana:
  host: "http://kibana:5601"  # 5601 is Kibana's port, so this should point at the Kibana host

output.logstash:
  hosts: ["logstash:5044"]

And my logstash.conf:

## Input: the beats plugin receives events published by Filebeat on port 5044
input {
  beats {
    port => 5044
  }
}

## Filter: split each Laravel log line into fields and parse the embedded JSON
filter {
  # First grok: extract the timestamp, env and severity, capture any free text
  # before the first "{" as "log", and the rest of the line as "raw-json"
  grok {
    match => {
      "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: (?<log>[^{]+)?%{GREEDYDATA:raw-json}"
    }
  }

  # Second grok: capture the JSON object into "raw-process-json" and anything
  # after it into "response"; tag_on_failure is a filter-level option, and an
  # empty list keeps events without JSON from being tagged _grokparsefailure
  grok {
    match => {
      "raw-json" => "(?<raw-process-json>\{(.*)\})%{GREEDYDATA:response}"
    }
    tag_on_failure => [ ]
  }

  # Parse the extracted JSON string into a structured field
  json {
    source => "raw-process-json"
    target => "json"
  }

  # Keep the original line as "raw-message" and replace the raw JSON string
  # with the parsed object
  mutate {
    rename => {
      "message" => "raw-message"
      "json" => "raw-process-json"
    }
  }
}

output {
  if "log1" in [tags] {      # write the Laravel log events to Elasticsearch
    elasticsearch {
      hosts => ["http://elasticsearch:9200"]
      index => "log1-%{+YYYY.MM.dd}"
    }
    stdout {}
  }
}
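
With that output, each day gets its own index. To sanity-check that documents are arriving, you can list the indices (assuming Elasticsearch's port 9200 is published on your host; adjust the host name otherwise):

curl 'http://localhost:9200/_cat/indices/log1-*'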

The first grok filter captures everything up to the point where a JSON object starts. The second one takes that last chunk, puts the JSON object itself into "raw-process-json" and, if there is any text after it, puts that into "response". Finally, the json filter parses the JSON string and the mutate filter just renames the fields.
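
To illustrate with a made-up line such as

[2024-01-15 10:00:00] production.ERROR: User login failed {"userId": 42} please investigate

the pipeline ends up with roughly these fields:

timestamp        => "2024-01-15 10:00:00"
env              => "production"
severity         => "ERROR"
log              => "User login failed "
raw-process-json => { "userId" => 42 }    (the parsed object, after the mutate)
response         => " please investigate"
raw-message      => the original line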