How can I filter different logs with different indices generated by Logstash in Kibana?

Hi
I take log files from different paths and create a separate config file for each of them.
In the output, all indices are created as specified in the config files, but when I filter by a specific index in Kibana, it shows the logs of all indices.

In Logstash, the input plugin is file and the output plugin is elasticsearch.

Can anyone help me here?

Kibana works with index patterns, so if you want to see only the logs from a specific index, you need to create an index pattern for it. By default, there's a logstash-* index pattern in Kibana to work with logs generated by Logstash. If you have specified a different index name in your Logstash config, all you need to do is create an index pattern that matches those indices.
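For example (a minimal sketch, assuming Kibana is reachable on localhost:5601 and the indices are named like node1_2020.05, as in the config further down; the exact API may differ between Kibana versions), an index pattern can also be created through the Saved Objects API instead of the Management UI:

  # Create an index pattern with id "node1" that matches all node1_* indices
  curl -X POST "localhost:5601/api/saved_objects/index-pattern/node1" \
    -H "kbn-xsrf: true" \
    -H "Content-Type: application/json" \
    -d '{"attributes": {"title": "node1_*"}}'

In Discover you can then switch between the index patterns (node1_*, node2_*, ...) to see only the logs of the corresponding node.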

Hi @ptamba,

I already created the index patterns, then checked the logs in the Discover tab and filtered with a specific index name, but it shows all logs whichever index I select.

You will need to show your Logstash output section and the index pattern you use, plus what happens versus what you expect. Otherwise it's really hard to guess.

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
# Input: file, output: Elasticsearch (viewed in Kibana).

input {
  file {
    path => "/root/Desktop/mount/LOGS/node01/node01-05-*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  elasticsearch {
    hosts => ["xx.xxx.xxx.xxx:9200"]
    manage_template => false
    index => "node1_%{+YYYY.MM}"
  }
}

This is one of my Logstash config files.
I created 5 config files for the different nodes, node01 to node05.

If you put all config files in the same directory and use that folder as path.config, all your input files will be processed by every single config unless you add fields to differentiate them. Per the docs:

“If a directory is given, all files in that directory will be concatenated in lexicographical order and then parsed as a single config file”

Add a tag (or type) to each input and put a conditional on the output, as in the sketch below. That way, each input gets processed by the correct output.
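For instance, a minimal sketch of the tag-based variant, assuming a hypothetical node02.conf laid out like the node01 config above (the path, hosts, and index name are adapted from the thread, not confirmed by it):

input {
  file {
    path => "/root/Desktop/mount/LOGS/node02/node02-05-*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # tags is a common option available on every input plugin
    tags => ["node2"]
  }
}
output {
  # only events tagged "node2" are sent to the node2 index
  if "node2" in [tags] {
    elasticsearch {
      hosts => ["xx.xxx.xxx.xxx:9200"]
      manage_template => false
      index => "node2_%{+YYYY.MM}"
    }
  }
}

The type-based variant shown later in the thread works the same way; the important part is that every input carries an identifier that the output conditional can test.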

This is the path where all my config files are saved: /etc/logstash/conf.d/
node01.conf
node02.conf
node03.conf
node04.conf
node05.conf

I am using the command below to run the multiple config files:
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash/ --path.data /var/log/ -f /etc/logstash/conf.d &

As I mentioned, all those configuration files will be concatenated into one large single config. So unless you differentiate events per config file, every file will be processed by every config. That's why your index contains events from all sources.

For example, in the first config file (conf1) you could do:

input {
  file {
    .... (your existing config)
    type => "node1"
  }
}

Then, in the output section:

output {
  if [type] == "node1" {
    elasticsearch {
      hosts => ["xx.xxx.xxx.xxx:9200"]
      manage_template => false
      index => "node1_%{+YYYY.MM}"
    }
  }
}

If you're doing filter processing, make sure the filter also uses the same type conditional. Use a different identifier for each config file.
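A minimal sketch of what that could look like (the grok pattern and field name are placeholders, not taken from the thread):

filter {
  # only apply node1-specific parsing to events from the node1 input
  if [type] == "node1" {
    grok {
      match => { "message" => "%{GREEDYDATA:node1_message}" }
    }
  }
}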

Now it's working

Thanks a lot @ptamba

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.