All files are collected under one index only. Is it possible to have multiple indexes for multiple files?

Four of them, as mentioned in the above comment. Should I remove logstash.conf, since I am using pipeline.conf only?

Hi @Badger, I got logstash as the index one time and logstash-date-00001 other times. Are both of these from ILM? Should I disable it to get the indexes I am typing manually?

That is what I would expect.

The default rollover alias is called `logstash`, with a default pattern for the rollover index of `{now/d}-000001`, which will name indices with the date on which the index is rolled over, followed by an incrementing number.

Try

ilm_enabled => false

and see if you get what you want.
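For context, that option goes inside the elasticsearch output block; a minimal sketch (the host and index name here are placeholders, not taken from the thread):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    ilm_enabled => false        # stop ILM from forcing the rollover alias
    index => "my-custom-index"  # placeholder: the index name you actually want
  }
}
```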

Those four files will get concatenated into a single pipeline, and data from all inputs will go to all outputs, unless you set them up as separate pipelines.
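One way to keep the flows isolated is to declare each config file as its own pipeline in pipelines.yml; a sketch with illustrative ids and paths (adjust to your own layout):

```yaml
# Each pipeline gets its own id and its own config file,
# so events from one input never reach another pipeline's outputs.
- pipeline.id: access
  path.config: "/etc/logstash/conf.d/access.conf"
- pipeline.id: errors
  path.config: "/etc/logstash/conf.d/errors.conf"
```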

@Mehak_Bhargava
Remove all the config files other than the one you need to use.

Only one input, filter, output will be supported per pipeline.

Keep only one config file with one input, filter, and output, then test your results; you will get the idea!

@Badger, I made the following change and still only one index shows up, called logstash-2019.12.23-000001:

input {
  file {
    path => "/home/mehak/Documents/filebeat-7.4.0-linux-x86_64/logs/log2.log"
    type => "access"
  }
  file {
    path => "/home/mehak/Documents/filebeat-7.4.0-linux-x86_64/logs/logz.log"
    type => "errors"
  }
  file {
    path => "/home/mehak/Documents/filebeat-7.4.0-linux-x86_64/logs/dispatcher-log.log"
    type => "dispatch"
  }
}

filter {
  if [type] == "access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [type] == "errors" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [type] == "dispatcher" {
    grok {
      match => { "message" => "\A%{TIMESTAMP_ISO8601:timestamp}%{SPACE}\[%{DATA:threadId}]%{SPACE}%{LOGLEVEL:logLevel}%{SPACE}%{JAVACLASS:javaClass}%{SPACE}-%{SPACE}?(\[%{NONNEGINT:incidentId}])%{GREEDYDATA:message}" }
    }
  }
}
 
output {
    
    if [type] == "access" {
        elasticsearch {
            hosts => ["localhost:9200"]
            sniffing => true
            manage_template => false
            ilm_enabled => false
            index => "access-index"

        }
    } else if [type] == "errors" {
        elasticsearch {
            hosts => ["localhost:9200"]
            sniffing => true
            manage_template => false
            ilm_enabled => false
            index => "errors-index"
        }
    }
    
    stdout {
        codec => rubydebug
    }
  
}

@Christian_Dahlqvist,

Currently the file I am executing is called pipeline.conf, and it sits outside the config directory in Logstash. Should I move my pipeline.conf contents into the logstash.conf file inside the config folder, and then have a pipelines.yml inside the config folder with the description of multiple pipelines?

Comment out all the conditionals in the filter for the time being, remove the if conditions in the output as well, and keep only one Elasticsearch output for now.

Like this,
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    ilm_enabled => false
    index => "access-index"
  }
}

Pipeline configurations are in the pipelines.yml file.
You can define multiple pipelines with a configuration directory, but only one input, filter, and output will be supported per pipeline.

@mancharagopan I updated the output and filter in pipeline.conf, and I still have logstash-2019.12.23-000001 as the index:

input {
  
  beats {
    port => 5044
  }
}

filter {
 
}
 
output {
    elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    ilm_enabled => false
    index => "access-index"
  }
  stdout {
    codec => rubydebug
  }
}

Where is the pipeline.conf file located?
Please share your pipelines.yml config.

My pipeline.conf file is in the /home/mehak/Documents/logstash-7.4.0 folder.

This is my pipelines.yml in config folder-

# List of pipelines to be loaded by Logstash
#
# This document must be a list of dictionaries/hashes, where the keys/values are pipeline settings.
# Default values for omitted settings are read from the `logstash.yml` file.
# When declaring multiple pipelines, each MUST have its own `pipeline.id`.
#
# Example of two pipelines:
#
# - pipeline.id: test
#   pipeline.workers: 1
#   pipeline.batch.size: 1
#   config.string: "input { generator {} } filter { sleep { time => 1 } } output { stdout { codec => dots } }"
# - pipeline.id: another_test
#   queue.type: persisted
#   path.config: "/tmp/logstash/*.config"
#
# Available options:
#
#   # name of the pipeline
#   pipeline.id: mylogs
#
#   # The configuration string to be used by this pipeline
#   config.string: "input { generator {} } filter { sleep { time => 1 } } output { stdout { codec => dots } }"
#
#   # The path from where to read the configuration text
#   path.config: "/etc/conf.d/logstash/myconfig.cfg"
#
#   # How many worker threads execute the Filters+Outputs stage of the pipeline
#   pipeline.workers: 1 (actually defaults to number of CPUs)
#
#   # How many events to retrieve from inputs before sending to filters+workers
#   pipeline.batch.size: 125
#
#   # How long to wait in milliseconds while polling for the next event
#   # before dispatching an undersized batch to filters+outputs
#   pipeline.batch.delay: 50
#
#   # Internal queuing model, "memory" for legacy in-memory based queuing and
#   # "persisted" for disk-based acked queueing. Defaults is memory
#   queue.type: memory
#
#   # If using queue.type: persisted, the page data files size. The queue data consists of
#   # append-only data files separated into pages. Default is 64mb
#   queue.page_capacity: 64mb
#
#   # If using queue.type: persisted, the maximum number of unread events in the queue.
#   # Default is 0 (unlimited)
#   queue.max_events: 0
#
#   # If using queue.type: persisted, the total capacity of the queue in number of bytes.
#   # Default is 1024mb or 1gb
#   queue.max_bytes: 1024mb

Is this the full configuration of pipelines.yml?
I don’t see any pipeline configuration; all the lines are commented out.

Is your pipeline.conf file exactly inside
/home/mehak/Documents/logstash-7.4.0 folder?

Can you share the output of
ls /home/mehak/Documents/logstash-7.4.0
command?

Yes, the above is the entire pipelines.yml file. Should I uncomment pipeline.id: test and pipeline.id: another_test, and if so, what should their path.config be? The path to my pipeline.conf?

Yes, that is where the pipeline.conf is stored.

Here is the output of ls:

mehak@mehak-VirtualBox:~$ ls /home/mehak/Documents/logstash-7.4.0
bin           Gemfile       logs                      NOTICE.TXT     x-pack
config        Gemfile.lock  logstash-core             pipeline.conf
CONTRIBUTORS  lib           logstash-core-plugin-api  tools
data          LICENSE.txt   modules                   vendor

@Mehak_Bhargava
Create a directory conf.d inside
/home/mehak/Documents/logstash-7.4.0/

Move your pipeline.conf into conf.d directory.

Uncomment pipeline.id and path.config.
Change path.config to the path of your pipeline.conf.
Like this,
- pipeline.id: test
  path.config: "/home/mehak/Documents/logstash-7.4.0/conf.d/pipeline.conf"

And then restart logstash to take effect.
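Assuming Logstash was started from its install directory, restarting could be as simple as stopping the running process and launching it again without -f (the path below is the install directory mentioned earlier in this thread):

```
cd /home/mehak/Documents/logstash-7.4.0
bin/logstash    # with no -f or -e flag, Logstash reads config/pipelines.yml
```

Note that if Logstash is started with -f pointing at a config file, pipelines.yml is ignored, so any changes made there would have no effect.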

@mancharagopan
Made the changes, and I still have the same logstash-date-0001 index. This is my pipelines.yml file now:

 - pipeline.id: test 
   path.config: "/home/mehak/Documents/logstash-7.4.0/config/pipeline.config"
   pipeline.workers: 1
   pipeline.batch.size: 1

These are the config folder files:

mehak@mehak-VirtualBox:~$ ls /home/mehak/Documents/logstash-7.4.0/config
jvm.options        logstash.yml   pipelines.yml
log4j2.properties  pipeline.conf  startup.options
mehak@mehak-VirtualBox:~$ 

@Mehak_Bhargava
I edited my last comment; please check.

@mancharagopan
Updated pipelines.yml:

 - pipeline.id: test 
   path.config: "/home/mehak/Documents/logstash-7.4.0/conf.d/pipeline.conf"
   pipeline.workers: 1
   pipeline.batch.size: 1

And these are the file locations:

mehak@mehak-VirtualBox:~$ ls /home/mehak/Documents/logstash-7.4.0/config
jvm.options  log4j2.properties  logstash.yml  pipelines.yml  startup.options
mehak@mehak-VirtualBox:~$ ls /home/mehak/Documents/logstash-7.4.0/conf.d
pipeline.conf

Still the same index in Kibana.

There is one error that shows up in Filebeat in between reading the log files; it has always been there:

2019-12-23T12:05:46.099-0800	ERROR	pipeline/output.go:100	Failed to connect to backoff(async(tcp://localhost:5044)): dial tcp 127.0.0.1:5044: connect: connection refused
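"Connection refused" here means nothing was listening on port 5044 at that moment, i.e. Logstash (with its beats input) was not yet up when Filebeat tried to connect. One quick way to check, assuming the common Linux ss utility is available:

```
# List listening TCP sockets and look for the beats port
ss -tln | grep 5044
```

If nothing is printed, the beats input is not running and Filebeat will keep retrying with this error.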

Where is Filebeat installed? On the same machine as Logstash or a different machine?

Hi @mancharagopan, Filebeat is on the same machine.