ELK: multiple indices in a single logstash.conf file

Hi,
I'm trying to push data from Logstash to more than two indices using a single logstash.conf file. Below is my configuration:

input {
  beats {
    port => 5044
  }
}

input {
  file {
    type => "java"
    path => "C:/ELK/ProjectsLogs/spring-boot-elk.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  # If a log line contains a tab character followed by 'at', tag that entry as a stacktrace
  if [message] =~ "\tat" {
    grok {
      match => {
        "message" => ["^(\tat)"]
      }
      add_tag => ["stacktrace"]
    }
  }

  grok {
    # INFO
    match => {
      "message" => ["^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME} \.* INFO.*"]
    }
    add_field => {
      "type" => "info"
    }
  }

  grok {
    # DEBUG
    match => {
      "message" => ["^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME} \.* DEBUG.*"]
    }
    add_field => {
      "type" => "debug"
    }
  }

  grok {
    # ERROR
    match => {
      "message" => ["^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME} \.* ERROR.*"]
    }
    add_field => {
      "type" => "error"
    }
  }

  if "_grokparsefailure" in [tags] {
    drop {}
  }
}

output {
  if [type] == "info" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "infoindexer"
    }
  } else if [type] == "debug" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "debugindexer"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "errorindexer"
    }
  }

  stdout {
    codec => rubydebug
  }
}

My Logstash server is running without any issue, but I'm not able to see the indices and logs in Kibana.
Could someone please help me solve this issue?

Thanks in advance.

Did you create the index pattern in Kibana?

You can see indices in Kibana's Index Management even before creating index patterns.
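
You can also check from outside Kibana with the cat indices API (assuming Elasticsearch is reachable on localhost:9200, as in your config):

curl "localhost:9200/_cat/indices?v"

If infoindexer, debugindexer and errorindexer are not in that list, the events never reached Elasticsearch and the problem is on the Logstash side.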

Hi Len Rugen,
I can't see the indices infoindexer, debugindexer and errorindexer (the ones I mentioned in logstash.conf) in Kibana, so I couldn't create index patterns for them.
I have even added some INFO, DEBUG and ERROR lines to the input log file, but in the Logstash console nothing appears as being sent to Elasticsearch.

Could you please check whether my config looks good, or whether I need to make changes in order to get the indices in Kibana?

Thanks,
Jeet Ram

I would take baby steps, as your logstash.conf has multiple things going on:

Step 1: Just create the inputs with no filter, and output everything to the Elasticsearch index "infoindexer".
Step 2: If that works, add the filters one by one and check that everything still arrives in "infoindexer".
Step 3: If that works, put in the full-blown logic.

I always like to build things up rather than putting all the config in at the start and then debugging.
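
For Step 1 that could be as small as the following (a sketch reusing your file input and localhost:9200; the sincedb line is my assumption about why re-runs can show nothing, since the file input remembers how far it has read):

input {
  file {
    type => "java"
    path => "C:/ELK/ProjectsLogs/spring-boot-elk.log"
    start_position => "beginning"
    # on Windows, pointing sincedb at NUL makes Logstash
    # re-read the file from the beginning on every run
    sincedb_path => "NUL"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "infoindexer"
  }
  stdout {
    codec => rubydebug
  }
}

If events show up in the stdout rubydebug output here, you know the input side is fine and any later problem is in the filters.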


I was already following that approach, but after your comment I went through it again with more focus, and now I'm able to achieve what I wanted.
My config now looks like this:

input {
  beats {
    port => 5044
  }
}

input {
  file {
    type => "java"
    path => "C:/ELK/ProjectsLogs/spring-boot-elk.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => "true"
      what => "previous"
    }
  }
}

filter {

  # If a log line contains a tab character followed by 'at', tag that entry as a stacktrace
  if [message] =~ "\tat" {
    grok {
      match => { "message" => ["^(\tat)"] }
      add_tag => ["stacktrace"]
    }
  }

  grok {
    # try both variants: two spaces or one space between timestamp and level
    match => { "message" => [
      "%{TIMESTAMP_ISO8601:timestamp}  %{LOGLEVEL:loglevel} %{GREEDYDATA:message}",
      "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} %{GREEDYDATA:message}"
    ] }
    add_field => { "subType" => "%{loglevel}" }
  }

  if "_grokparsefailure" in [tags] {
    drop {}
  }
}

filter {
  mutate {
    # Elasticsearch index names must be lowercase, and %{subType} is used in the index name below
    lowercase => ["subType"]
  }
}

output {

  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{subType}_indexer"
  }

  stdout {
    codec => rubydebug
  }
}
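
One note on why this works: the elasticsearch output's index option accepts sprintf field references, so each event is routed by its own (lowercased) subType. For a hypothetical Spring Boot line such as

2023-01-05 10:15:00.123  INFO com.example.Demo : Application started

grok extracts loglevel => INFO, the mutate lowercases subType to info, and the event lands in the info_indexer index; DEBUG and ERROR lines go to debug_indexer and error_indexer the same way.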

Thank you for your valuable input.
