Create a new index in elasticsearch for each log file by date

Hi, I have managed to get my ELK stack up and running; it filters logs using grok, passes the data to Elasticsearch,
and allows me to visualize it in Kibana.

Currently
I have completed the above task using one log file, passing its data with Logstash to a single index in Elasticsearch:

yellow open logstash-2016.10.19 5 1 1000807 0 364.8mb 364.8mb

What I actually want to do

If I have the following log files, which are named by year, month and date:

MyLog-2016-10-16.log
MyLog-2016-10-17.log
MyLog-2016-10-18.log
MyLog-2016-11-05.log
MyLog-2016-11-02.log
MyLog-2016-11-03.log

I would like to tell Logstash to read them by year, month and date and create the following indexes:

yellow open MyLog-2016-10-16.log 5 1 1000807 0 364.8mb 364.8mb
yellow open MyLog-2016-10-17.log 5 1 1000807 0 364.8mb 364.8mb
yellow open MyLog-2016-10-18.log 5 1 1000807 0 364.8mb 364.8mb
yellow open MyLog-2016-11-05.log 5 1 1000807 0 364.8mb 364.8mb
yellow open MyLog-2016-11-02.log 5 1 1000807 0 364.8mb 364.8mb
yellow open MyLog-2016-11-03.log 5 1 1000807 0 364.8mb 364.8mb

Please could I have some guidance on how I need to go about doing this?

Thank you

You are aware that indexes have a fixed memory overhead and that having too many indexes is a bad idea? With one index per logfile per day the numbers will quickly run up.

The index name is set with the index option of the elasticsearch output. It supports %{fieldname} references, so if you have a logfilename field containing the filename you could do this:

output {
  elasticsearch {
    ...
    index => "%{logfilename}-%{+YYYY.MM.dd}"
  }
}
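
For reference, one way to populate such a logfilename field before the output stage is a grok filter on the file input's path field. This is a sketch: the field name and the assumption of forward slashes in the path are mine, not from this thread.

filter {
  grok {
    # Capture the file name without directory or ".log" extension,
    # e.g. "MyLog-2016-10-16" from "C:/Elk/logstash/bin/MyLog-2016-10-16.log"
    match => { "path" => "(?<logfilename>[^/]+)\.log$" }
  }
}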

You are aware that indexes have a fixed memory overhead and that having too many indexes is a bad idea?

No, I was not aware of this; thanks for your input.

What about my file path? How do I get it to read each log file?

input {
  file {
    path => "C:\Elk\logstash\bin\MyLog-2016-10-16.log"
    start_position => "beginning"
  }
}

input {
  file {
    path => "C:\Elk\logstash\bin\%{MyLog}-%{+2016-10-16}"
    path => "C:\Elk\logstash\bin\%{MyLog}-%{+2016-10-17}"
    path => "C:\Elk\logstash\bin\%{MyLog}-%{+2016-10-18}"
  }
}

filter {
  grok {
    match => { "message" => "%?terms=%{WORD:key_word}*" }
  }
}

output {
  elasticsearch {
    index => "%{MyLog}-%{+2016-10-16}"
    index => "%{MyLog}-%{+2016-10-17}"
    index => "%{MyLog}-%{+2016-10-18}"
    hosts => ["localhost:9200"]
  }
}

Could I maybe do the above?

What about my file path? How do I get it to read each log file?

Use a wildcard like MyLog-*.log. The actual path of the file from which an event was read can be found in the path field.
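
For example (a sketch; the path is assumed from your earlier config, and on Windows the file input's glob pattern generally wants forward slashes):

input {
  file {
    # Matches MyLog-2016-10-16.log, MyLog-2016-10-17.log, etc.
    path => "C:/Elk/logstash/bin/MyLog-*.log"
    start_position => "beginning"
  }
}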

Could I maybe do the above?

No, that doesn't make sense.
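
A sketch of what could work instead, combining a wildcard input with a %{fieldname} index reference. The grok pattern and the lowercasing step are assumptions on my part, not tested config; note that Elasticsearch index names must be lowercase, so the MyLog- prefix cannot be used verbatim as an index name.

input {
  file {
    path => "C:/Elk/logstash/bin/MyLog-*.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Extract the file name without directory or ".log" extension from the path field
    match => { "path" => "(?<logfilename>[^/]+)\.log$" }
  }
  mutate {
    # Elasticsearch index names must be lowercase
    lowercase => ["logfilename"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # e.g. events from MyLog-2016-10-16.log go to the mylog-2016-10-16 index
    index => "%{logfilename}"
  }
}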

Thanks very much for your reply. I'm going to give it a try and will get back to you with feedback.

Hi Magnus,

I have created a new question that I need help with. Please could you take a look? I would really appreciate it. Link: Querying elastic search in Kibana using Json