How to send multiple log files to Kibana through Logstash?

I am new to ELK and I want to send multiple files to Kibana using Logstash, for example Apache access logs from different servers. How can I do that?

What part are you finding hard to understand? A single Logstash instance can have multiple inputs that listen on multiple ports, read multiple files, or whatever else you use Logstash for. Nothing special needs to be done to read multiple files. Without further details about your situation it's impossible to give more specific help.
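For example (the paths and type names here are made up), a single configuration can simply list two file inputs side by side:

```
input {
  file {
    path => "/var/log/app01/access.log"   # example path
    type => "app01_access"
  }
  file {
    path => "/var/log/app02/access.log"   # example path
    type => "app02_access"
  }
}

output {
  stdout { codec => rubydebug }   # print every event to the console
}
```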

I am able to get a single input file into Kibana through Logstash. But when I add another file to the Logstash configuration, it doesn't work: only the logs of a single file show up. Please find my configuration file below.

input {
  file {
    type => "app01_apache_access_log"
    path => "/apps_data/logs/app01/apache2/access.log"
    start_position => "end"
  }
  file {
    type => "app02_apache_access_log"
    path => "/apps_log/logs/app02/apache2/access.log"
  }
}

filter {
  if [type] == "app01_apache_access_log" {
    grok {
      match => [ "message", "(?<session_id>[A-Z0-9]{32}-[a-z0-9]+\.[a-z0-9]+)" ]
      remove_tag => ["_grokparsefailure"]
      named_captures_only => true
    }
    grok {
      match => [ "message", "%{WORD:method} %{URIPATH:request}" ]
      remove_tag => ["_grokparsefailure"]
      named_captures_only => true
    }
    grok {
      match => [ "message", "%{NUMBER:duration}" ]
      remove_tag => ["_grokparsefailure"]
      named_captures_only => true
    }
  }
  if [type] == "app02_apache_access_log" {
    grok {
      match => [ "message", "(?<session_id>[A-Z0-9]{32}-[a-z0-9]+\.[a-z0-9]+)" ]
      remove_tag => ["_grokparsefailure"]
      named_captures_only => true
    }
    grok {
      match => [ "message", "%{WORD:method} %{URIPATH:request}" ]
      remove_tag => ["_grokparsefailure"]
      named_captures_only => true
    }
    grok {
      match => [ "message", "%{NUMBER:duration}" ]
      remove_tag => ["_grokparsefailure"]
      named_captures_only => true
    }
  }
}

output {
  elasticsearch {
    hosts => "search-test-elk-qrxqhiyz6vqwck2xnyu3qhdjsu.us-east-1.es.amazonaws.com:443"
    ssl => true
  }
  stdout {
    codec => rubydebug
  }
}

But when I add another file to the Logstash configuration, it doesn't work: only the logs of a single file show up.

What does "show only logs of only single file" mean, exactly? And what makes you reach that conclusion? Since you've configured Logstash to tail both access.log files, is data actually being appended to both files?

I strongly suggest that you leave out the elasticsearch output for now and use the stdout output you have to debug things.
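For example, the output section can be trimmed to just the stdout output while debugging (this is a sketch, not your full config):

```
output {
  # elasticsearch output removed while debugging; rubydebug prints
  # every event Logstash emits, so you can see exactly what is read.
  stdout { codec => rubydebug }
}
```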

Hi,
I can see logs from the "/apps_data/logs/app01/apache2/access.log" file, which is my first input file, but I see no output from the other file. Also, I am using only the stdout output for this.

Also, I would like to mention that I am using the Elasticsearch Service of AWS, so using multiple ports for Elasticsearch isn't really applicable to me.

Please help me tackle this situation.

Again, since you've configured Logstash to tail both access.log files, is data actually being appended to both files? As currently configured Logstash will not read the files from the beginning.

How can I confirm whether the data is being appended or not? I think it is not being appended, as I see only the first log file's data in Kibana. The Logstash server is an EC2 instance on AWS and we are using the Elasticsearch service of AWS cloud.

How can I confirm whether the data is being appended or not?

I think you're missing the point. We're talking about the file that Logstash is reading from. As you've configured Logstash it's tailing the file, i.e. it's reading data from the end of the file. Unless new data is added to the end of the file, Logstash won't pick up anything. So, is some other application writing data to that file?
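A quick way to see the difference, using a throwaway file rather than your real log (the path below is made up):

```shell
# Demonstrates why tailing from the end skips existing content:
# only lines appended after the tail position is recorded count
# as new data, which is what start_position => "end" means.
LOG=/tmp/tail_demo.log
printf 'old line 1\nold line 2\n' > "$LOG"   # pre-existing content

# Record the current end of the file, as Logstash does when it
# opens the file, then append one new line.
OFFSET=$(wc -c < "$LOG")
echo 'new line appended later' >> "$LOG"

# Reading from the recorded offset yields only the appended line;
# the two old lines are never seen.
tail -c +"$((OFFSET + 1))" "$LOG"
```

If nothing is ever appended after Logstash opens the file, this read returns nothing, which matches what you are seeing.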

No other application is writing data to that file; it is the access log of one of my product's servers, which is being copied to the log file I give as input to Logstash. Also, I set start_position to "end" to avoid reading and processing the whole log file from the beginning. Would you please tell me how I can confirm whether the data is being appended or not? Please help me.

I'm not sure what you expect from Logstash. You're deliberately tailing the file, but you're also not adding any new lines, correct? So why would you expect Logstash to read anything from the file and pass it on to ES?

I am adding new lines to the files using rsync, as the log files are continuously copied from the webservers to the Logstash server. So I thought that tailing would make Logstash pick up only the new changes. But only one file is being read, not the other. As I understand it, the new data should be appended onto the existing file so that Logstash reads it, but I don't think that is happening here.

When rsyncing files I'd assume that the existing file isn't updated in place, but that a new file is created and renamed into place. When that happens, the path gets a new inode number and Logstash will consider it a new file, and with start_position => "end" it'll start tailing it from the current end.
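You can check which of the two behaviors you are getting by comparing inode numbers before and after a sync. A minimal sketch with a throwaway file (the path is made up):

```shell
# Check whether a file is updated in place (same inode) or replaced
# (new inode) -- this decides whether a tailing reader keeps
# following it or treats it as a brand-new file.
LOG=/tmp/inode_demo.log
echo 'line 1' > "$LOG"
BEFORE=$(ls -i "$LOG" | awk '{print $1}')

echo 'line 2' >> "$LOG"                      # in-place append
AFTER=$(ls -i "$LOG" | awk '{print $1}')
echo "append: before=$BEFORE after=$AFTER"   # same inode

mv "$LOG" "$LOG.old"                         # replace-and-rename, as
echo 'line 1' > "$LOG"                       # rsync's default temp-file
AFTER2=$(ls -i "$LOG" | awk '{print $1}')    # strategy effectively does
echo "replace: before=$BEFORE after=$AFTER2" # different inode
```

Run `ls -i` on your real access.log before and after an rsync run to see which case applies.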

Nope. The same file with the same inode number is updated by rsync; I can see that the inode number stays the same. So only the new logs present in the original log file are appended to the file on the Logstash server. The question is: why is the second file, the access log from the other Apache server, not being picked up for processing by Logstash?

Start Logstash with --verbose so that the file input logs more details about what it's doing.

Starting Logstash in verbose mode showed nothing beyond the usual output.

Sure, but it will tell you which files were discovered and which sincedb positions they were at. That information should help here.
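For reference, each line in the sincedb file records the read position for one file, keyed by inode. A small sketch parsing one such entry (the numbers are made up):

```shell
# A sincedb entry has the form:
#   <inode> <major device> <minor device> <byte offset>
# Parse a hypothetical entry to see where Logstash would resume.
LINE="271055 0 51713 1024"
set -- $LINE
SUMMARY="inode=$1 offset=$4"
echo "$SUMMARY"
```

Comparing the inode in the sincedb with `ls -i` on each access.log tells you whether Logstash has discovered the second file at all.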

I couldn't see anything like that on the terminal. Instead, I saw the following:
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 1
Starting pipeline {:id=>"main", :pipeline_workers=>1, :batch_size=>125, :batch_delay=>5, :max_inflight=>125, :level=>:info}
Pipeline main started..
What can I get from these?

If you start Logstash with --verbose and have a file input in your configuration you should get a lot more than that. What if you start with --debug? You should get an avalanche of logs.

I got the same output in --debug mode.
I noticed the :level=>:info there.
How can I change it to debug? I tried from the terminal but no luck.