JSON and syslog in Logstash: how to handle them both?

I send two different types of data to Logstash from Filebeat: JSON and syslog. Previously it was only JSON and it worked fine, but now I've added syslog as well.

I want to display both in Kibana, in separate dashboards or however that's done. Here's part of my /etc/logstash/conf.d/my-logstash.conf config:

    output {
      elasticsearch { hosts => ["localhost:9200"] }
      stdout { codec => json } # is this correct?
    }
    
    filter {

      # is this correct?
      json {
        source => "message"
      }
    
      if [type] == "syslog" {
        grok {
          match => { "message" => "some regexp....." }
          add_field => [ "received_at", "%{@timestamp}" ]
          add_field => [ "received_from", "%{host}" ]
        }
        syslog_pri { }
        date {
          match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
      }
    }

I'm not sure this is correct, since I send both JSON and syslog to it. My questions are in the comments in the code. Or perhaps there's something else I've missed?

P.S.

Also, how can I display those two formats separately in Kibana? Is it done via different dashboards?

IIRC, recent versions of Filebeat support JSON decoding natively. If not, you'll want to use a json filter to deal with the JSON data, but only for those events that actually are JSON. I suggest you set a field in your Filebeat prospector to indicate this, e.g. set a codec field to "JSON", and use that in a Logstash conditional:

    filter {
      if [codec] == "JSON" {
        json {
          source => "message"
        }
      }
    }
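On the Filebeat side, that field can be set per prospector. A sketch of what this might look like (assuming a 5.x-style filebeat.yml; the paths are placeholders for your own log locations, and fields_under_root lifts the custom field to the top level so the [codec] conditional above matches):

    filebeat.prospectors:
      - paths:
          - /var/log/myapp/*.json   # hypothetical path to your JSON logs
        fields:
          codec: JSON
        fields_under_root: true
      - paths:
          - /var/log/syslog
        document_type: syslog       # sets [type] == "syslog" for your existing filter branch

With this, the JSON events carry codec: JSON and go through the json filter, while the syslog events keep type: syslog and hit your existing grok/date branch.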

> And also, how can I display those 2 formats separately in Kibana, is it done via different dashboards?

You don't have to use different visualizations, but you could.
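For example (a sketch, assuming the codec field suggested above ends up on the events), you could create one saved search per format in Kibana's Discover tab and build each dashboard's visualizations on top of the matching saved search. The first query below would back the JSON dashboard, the second the syslog one:

    codec:JSON
    type:syslog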