How to parse different format logs from same directory

I have multiple log files in the same directory that I want to run through Logstash, and every file has a different format from the others. The log files can be named arbitrarily by different network providers (I don't have control over how they name them), so I want to parse them based on the formats they contain.

For example, the logs directory contains log files in different formats:
$ /path/to/logs


So far I have been able to parse each log file separately by launching a Logstash instance for that specific format (while parsing the access.log file I use access.conf, which has the matching grok filter for the access.log data format).

Should I run as many instances as I have different types of logs?

With the grok filter you can list multiple expressions and have the filter try them in order until one matches. Another option could be to use conditionals to classify the events based on what they look like, e.g. like this:

filter {
  if [message] =~ /some regexp that matches one type of event/ {
    mutate {
      replace => {
        "type" => "some type"
      }
    }
  }
}
Then use additional conditional blocks based on the type field.
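Putting both ideas together, a minimal sketch might look like this. The grok patterns (`COMBINEDAPACHELOG`, `SYSLOGLINE`) and the type names are placeholders for whatever your actual log formats require:

```
filter {
  # Option 1: give grok a list of patterns; it tries them in
  # order and stops at the first one that matches.
  grok {
    match => {
      "message" => [
        "%{COMBINEDAPACHELOG}",
        "%{SYSLOGLINE}"
      ]
    }
  }

  # Option 2: branch on the type field set earlier and apply
  # a format-specific filter in each branch.
  if [type] == "some type" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}
```

With option 1, keep in mind that each failed pattern attempt costs time, so put the most common format first.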

Magnus, thank you for the response. I'm using Filebeat to ship logs to Logstash. Can we set some type in Filebeat so that we can use conditional filters in Logstash?

Set the prospector's document_type option.
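In filebeat.yml that could look something like this (the paths and type names are illustrative; adjust them to match how your files are actually named):

```
filebeat:
  prospectors:
    - paths:
        - /path/to/logs/access*.log
      document_type: access
    - paths:
        - /path/to/logs/error*.log
      document_type: error
```

The document_type value arrives in Logstash as the event's type field, so on the Logstash side you can branch on it with a conditional such as `if [type] == "access" { ... }`.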