Multisource index on Elasticsearch, passing through Logstash

Hey, I'm trying to create multiple source inputs from Filebeat, then inject them into Logstash to apply filters, and finally transfer the sources to Elasticsearch as indices.

The problem I have is that only one index is created in Elasticsearch instead of 2. In which part should I specify my index name, please?

Hello @Abdeljalil_El_Yousso

You need to create an index template and set the number of shards to 1. This will create two shards, meaning one primary and one replica.

Index Templates | Elasticsearch Guide [2.4] | Elastic

NOTE: An index template only takes effect for new indices, not for existing ones.
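Since the thread is on Logstash 8.6, the composable template API would apply (the linked [2.4] guide describes the older legacy `_template` API). A minimal sketch, with an illustrative template name and index pattern:

```
PUT _index_template/messages-template
{
  "index_patterns": ["messages-*"],
  "template": {
    "settings": {
      "index.number_of_shards": 1
    }
  }
}
```

Any new index whose name matches the pattern (e.g. `messages-2023.04.27`) picks up these settings at creation time.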


Hi @Abdeljalil_El_Yousso

Can you share all your Logstash pipeline configs?

How are you defining them in pipelines.yml?

input {
  ##--------- put file path to analyse   
   #file{
      #path => "
      #start_position => "beginning"
      #sincedb_path => "nul"
   #}
    #stdin{} 
    beats{
		type => "filestream"
		port => 5044
}
stdin{}
}
filter{
     grok {   
    match => {"message" =>  "\[%{HTTPDERROR_DATE:timestamp}\] \[%{WORD:module}:%{LOGLEVEL:loglevel}\] \[pid %{POSINT:pid}(:tid %{NUMBER:tid})?\] \[client %{IPORHOST:source_address}(:%{INT:source_port})\] %{DATA:[php][errorLevel]}\:%{GREEDYDATA:message} "}
     }

     #remove unwanted messages
     #grok { 
      #overwrite => "message"
     #}
     geoip { 
       source => "source_address" 
        ecs_compatibility => disabled
        target => "destination.geo"
    }
   
      mutate {
        add_field => { "locationn" => "%{geoip.location.lat},%{geoip.location.lon}"
          #rename => { "geoip.location.lat" => "[location][lon]"
           #          "geoip.location.lon"=> "[location][lat]"
      }
}
}
output{

#elasticsearch { 

       # index => "apacheerror-%{+YYYY.MM.dd}" 
        #index => "apacheapacheaccess-unity-%{+YYYY.MM.dd}"
        #hosts => ["https://localhost:9200"]
      
#}
if [type] == "filestream" 
 {
      elasticsearch {
      hosts => ["https://localhost:9200"]
      user => "********"
      password => "*****"
      index => "messages-%{+YYYY.MM.dd}"
      
    }

  #else 
     elasticsearch {
      hosts => ["https://localhost:9200"]
      user => "******"
      password => "******"
       
    }
}





    stdout{ }

}

This is my configuration file for Logstash, and I'm using Logstash 8.6.

Hi @Abdeljalil_El_Yousso

Your config looks malformed.

Here is a simple config that sends to 2 indices

input {
	beats{
		type => "filestream"
		port => 5044
  }
  stdin{}

}

filter {
}

output {

	if [type] == "filestream" {
		elasticsearch {
			hosts => ["http://localhost:9200"]
			index => "my-index-%{+YYYY.MM.dd}"
		}
	}
	else {
		elasticsearch {
			hosts => ["http://localhost:9200"]
			index => "my-other-index-%{+YYYY.MM.dd}"
		}		
	}
    stdout{ }
}
GET /_cat/indices/my*?v
health status index                     uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   my-index-2023.04.27       b-zdx_d4Q-O0gvd2vAjYug   1   1      28998            0     11.3mb         11.3mb
yellow open   my-other-index-2023.04.27 s7AZzmezS7qZnEJTmbmVyA   1   1          3            0     13.8kb         13.8kb

What if I have 2 filestreams in my configuration?

Not sure I understand the question....

Do you mean file streams coming from filebeat or file streams coming from logstash file input?..

In the end you just need to tag them and then do if/else on the outputs.

You can tag inputs in Filebeat, then use those tags in Logstash.

It's programming....

Set fields, then use those fields in conditions.

Thanks for your quick reply. What I want to say is: if you have 2 filestream inputs in Filebeat, and your output is Logstash, then in your Logstash configuration file you specify the output to Elasticsearch.
In this case, how can you create 2 indices? Hope my question is clear, and thanks again.

Roll up your sleeves and read the docs... Here is one way; there are multiple ways...

Here is one...

Tag each filebeat input

- type: filestream

  # Unique ID among all inputs, an ID is required.
  id: my-filestream-id

  enabled: true
  paths:
    - /var/log/*.log
  
  # Add a tag to be used later
  tags: ["app-type-1"]

...
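For the second source, you would add a similar input with its own tag, matching the `app-type-2` condition in the Logstash output below (the path, ID, and tag name here are illustrative):

```
- type: filestream

  id: my-filestream-id-2

  enabled: true
  paths:
    - /var/log/myapp/*.log

  tags: ["app-type-2"]
```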

Then logstash output

tags are an array so it looks a little different

output {

	if "app-type-1" in [tags]{
		elasticsearch {
			hosts => ["http://localhost:9200"]
			index => "my-type-1-index-%{+YYYY.MM.dd}"
		}
	}
	else if "app-type-2" in [tags] { 
		elasticsearch {
			hosts => ["http://localhost:9200"]
			index => "my-type-2-index-%{+YYYY.MM.dd}"
		}		
	}
	else { 
		elasticsearch {
			hosts => ["http://localhost:9200"]
			index => "my-type-other-index-%{+YYYY.MM.dd}"
		}		
	}
  stdout{ }
}

Good Luck! Dig In!


Thank you so much.
I managed to create multiple indices after your 2nd reply by adding tags in both Filebeat and Logstash.
Thanks again.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.