Create multiple index patterns

Hello everyone,

I am trying to write a config that creates multiple index patterns with Logstash. I am also using Filebeat.

Essentially I have two types of files, each located in a separate folder.

So here is my filebeat.yml:

output:
  logstash:
    enabled: true
    hosts:
      - 213.X.X.X:5002
    timeout: 15

filebeat:
  prospectors:
    -
      paths:
        - "/home/hakim/cluster2/*.log"
      input_type: log
      document_type: cluster2

    -
      paths:
        - "/home/hakim/clustermon/*.log"
      input_type: log
      document_type: clustermon

and here is my logstash.conf:

input {
  beats {
    port => 5002
  }
}

# Add your filters / logstash plugins configuration here

filter {
  grok {
    match => { "message" => [my filter (omitted here because it is long)] }
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => "en"
  }
  geoip {
    source => "clientip"
  }
}

output {
  if "_grokparsefailure" in [tags] {
    # write events that didn't match to a file
    file { path => "/home/hakim/grok_failures.txt" }
  }
  else {
    if [type] == "cluster2" {
      elasticsearch {
        hosts => ["213.X.X.X:9200"]
        #manage_template => false
        index => "cluster2"
        user => "XXXX"
        password => "XXXX"
      }
    }
    else if [type] == "clustermon" {
      elasticsearch {
        hosts => ["213.X.X.X:9200"]
        #manage_template => false
        index => "clustermon"
        user => "XXXX"
        password => "XXXX"
      }
    }
  }
}

When I run docker-compose up, everything is okay, but when I add log files nothing happens, and I don't know why.

Debug this systematically. Is Logstash getting any logs at all? Have you looked in the Logstash logs for clues about any connectivity problems with Elasticsearch?

Hi, thank you for your response.

I looked in the logs; there are no problems.
I think the condition I wrote, 'if [type] == "type"', is not being matched, so nothing happens.

I need help, please, if you have any other suggestion for how to separate index creation based on file path.

Thank you.

Use a stdout { codec => rubydebug } output (not wrapped in a conditional) to dump the raw contents of the events to the Logstash log. That'll make it very clear if the type field indeed contains one of the two strings you support.
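For reference, such a debugging output could look like this (a minimal sketch; the elasticsearch outputs from the config above stay as they are):

```
output {
  # dump every event verbatim to the Logstash log, outside any conditional
  stdout { codec => rubydebug }
}
```

Since it is not wrapped in a conditional, it prints every event regardless of whether the type checks match, which makes it easy to see what the field actually contains.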

I used it; there is no 'type' field in my Logstash logs. I noticed that there is a 'source' field; can I use it instead?

You can use whatever field you want. What fields you use to categorize your logs is up to you, but be careful about using filenames as they're more likely to change.

I tried with source and it worked, but I have to match the exact source. Now what I need is, for example:

source = "/doc/file1.log", but the /doc folder contains other log files. I tried to use if source == "/doc/*" to match all of them, but it doesn't work, and I have to specify them file by file. Is there a way to match all the log files contained in a folder?

As I said, using filenames isn't a great idea. Instead, configure the inputs (Logstash's file input, Filebeat, or whatever you've got) to add a field (e.g. type, but it could be anything) that indicates the kind of log. Then your filters can key on that field instead.
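A sketch of what that could look like in the filebeat.yml from the first post (the field name log_type is an arbitrary choice, not anything Filebeat requires):

```yaml
filebeat:
  prospectors:
    -
      paths:
        - "/home/hakim/cluster2/*.log"
      input_type: log
      # attach a custom field to every event from this prospector
      fields:
        log_type: cluster2
      # place the field at the top level of the event
      # instead of nesting it under "fields"
      fields_under_root: true
```

With fields_under_root enabled, the Logstash conditional can then simply be if [log_type] == "cluster2".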

Hi, I added a 'type' tag, and now it appears in the Logstash logs. But when I use it like I did with "source", I mean
if [type] == "type", it doesn't work for me?

To get help you need to show your Logstash configuration and an example message (use a stdout { codec => rubydebug } output to dump a raw event).

It shows me in each log:

prospector => { "type" => "nameType" }

so I guess the conditional in logstash.conf must change to something like 'if [prospector] == "type" => "nameTYPE"'

?

Note: in Kibana the created field is displayed as prospector.type.

This is what you need:

if [prospector][type] == "nameType" {
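In Logstash conditionals, nested fields take one bracket per level, so [prospector][type] refers to the type key inside the prospector object. Applied to the output block from the original config, the routing could look like this (a sketch; credentials and the commented-out option omitted):

```
output {
  if [prospector][type] == "cluster2" {
    elasticsearch {
      hosts => ["213.X.X.X:9200"]
      index => "cluster2"
    }
  }
  else if [prospector][type] == "clustermon" {
    elasticsearch {
      hosts => ["213.X.X.X:9200"]
      index => "clustermon"
    }
  }
}
```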

It works, thank you a lot!!! :slight_smile:

Hi, I have just one more question: can I rewrite my config with 'translate' instead of 'if / else if ...', since I have more than 2 indexes to create? Also, as far as I know there is no "switch case" in the Logstash config language.

Sure, you can use a translate filter.
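A minimal sketch of that approach (the dictionary entries and the @metadata field name are assumptions based on the config in this thread, and the option names may differ between translate plugin versions): the translate filter maps the prospector type to a target index name, and a single elasticsearch output interpolates it, replacing the whole if / else if chain.

```
filter {
  translate {
    field       => "[prospector][type]"
    destination => "[@metadata][target_index]"
    dictionary  => {
      "cluster2"   => "cluster2"
      "clustermon" => "clustermon"
    }
    # events with an unmapped type still get routed somewhere
    fallback => "unknown-logs"
  }
}

output {
  elasticsearch {
    hosts => ["213.X.X.X:9200"]
    index => "%{[@metadata][target_index]}"
  }
}
```

Fields under [@metadata] are not indexed into Elasticsearch, so the routing value never pollutes the stored documents.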

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.