How to differentiate the logs in Kibana dashboard

I have the following requirement:

I have different types of logs from different service components, i.e.:

 1. AKPOS service log
 2. XE Services log
 3. Performance Service log

Path for input file:

Using this path, I load all 3 different types of logs:

 path => [ "D:\Projects\Log\logs/*" ]

Then my question is:

How can we tell these logs apart (i.e. which service each one came from), and how can we create a separate dashboard in Kibana for each service's logs, showing their differences?

All logs (the 3 different service logs) are already loaded into Elasticsearch and visible in Kibana. After that, though, I am unable to differentiate them.

This is more of a Logstash question.

It's common to assign different types to different kinds of logs. In your case I'd consider using three different file inputs that match different filename patterns:

file {
  path => ["D:/Projects/Log/logs/akpos*"]
  type => "akpos"
}
file {
  path => ["D:/Projects/Log/logs/xeservice*"]
  type => "xeservice"
}
file {
  path => ["D:/Projects/Log/logs/performance*"]
  type => "perf"
}

Then you'll get a type field that you can use in Kibana queries or in Logstash filters, since you'll probably want to parse the log messages differently depending on the type of log.
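For example, once the events are indexed, any Kibana query or visualization can be restricted to one service by filtering on that field in the query bar (a minimal sketch using the type values from the inputs above):

type:"akpos"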

Is it required to write the configuration file for the different types of log files like this?

input {
  file {
    path => ["D:/Projects/Log/logs/akpos*"]
    type => "akpos"
  }
  file {
    path => ["D:/Projects/Log/logs/xeservice*"]
    type => "xeservice"
  }
}
filter {
  if [type] == "akpos" {
    # processing .......
  }
  if [type] == "xeservice" {
    # processing .......
  }
}
output {
  if [type] == "akpos" {
    # output to elasticsearch
  }
  if [type] == "xeservice" {
    # output to elasticsearch
  }
}

No, it's not required. It just tends to be a good idea.
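For comparison, a minimal sketch of the alternative: a single shared output with no conditionals. The type field still travels with every event, so Kibana can tell the services apart even though Logstash treats them all identically (the elasticsearch options here mirror the Logstash 1.x style used elsewhere in this thread):

output {
  # One output for all three types; the splitting happens later in
  # Kibana, e.g. with a query like type:"akpos".
  elasticsearch {
    host => "127.0.0.1"
    port => "9200"
    protocol => "http"
  }
}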

But when I give only the file types, the log data is not loaded into the Elasticsearch server...

Please find my logstash.conf file below:

input {
  file {
    path => [ "D:\Projects\TestLogs\AKPOS/*" ]
    type => "AKPOS"
  }
  file {
    path => [ "D:\Projects\TestLogs\CollectionManager/*" ]
    type => "CollManager"
  }
  file {
    path => [ "D:\Projects\TestLogs\Performance/*" ]
    type => "Performance"
  }
}

filter {

  # Join any line that doesn't start with a timestamp onto the previous line
  multiline {
    pattern => "^%{TIMESTAMP_ISO8601}"
    what => "previous"
    negate => true
  }

  # Delete trailing whitespace
  mutate {
    strip => "message"
  }

  # Delete \n from messages
  mutate {
    gsub => ['message', "\n", " "]
  }

  # Delete \r from messages
  mutate {
    gsub => ['message', "\r", " "]
  }

  # Split each line into timestamp, thread, level, class, and message text
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:time} \[%{NUMBER:thread}\] %{LOGLEVEL:loglevel} %{JAVACLASS:class} - %{GREEDYDATA:msg}" }
  }

  # Extract the invoice UUID from the message text
  grok {
    match => { "msg" => '%{GREEDYDATA:text} [<]?[iI]no*voice[_,\" \"]?[iI]?[iI]?[dD]?[:]?[" "]?[=]?[" "]?[\\]?[\"]?%{UUID:InvoiceIID}[\\]?[\"]?' }
  }

}

output {

  elasticsearch {
    bind_host => "127.0.0.1"
    port => "9200"
    protocol => "http"
  }

  # Email an alert for any message containing ERROR
  if "ERROR" in [message] {
    email {
      options => [ "smtpIporHost", "smtp.gmail.com",
                   "port", "587",
                   "userName", "testabc@gmail.com",
                   "password", "*****",
                   "authenticationType", "plain",
                   "starttls", "true"
                 ]
      from => "<testabc@gmail.com>"
      subject => "logstash alert"
      to => "<test123@gmail.com>"
      cc => "<test@gmail.com>"
      via => "smtp"
      body => "Here is the event line that occurred: %{message}"
    }
  }

  stdout { codec => rubydebug }
}

You're never looking at the contents of the type field, so whatever value it gets doesn't matter. If your messages aren't reaching ES it's because of something else. Do all your log messages (regardless of type) begin with an ISO8601 date?

Yes, the logs start with an ISO8601 date.

AKPOS log:

2015-06-09 00:00:07,915 [11] DEBUG AKPOS_MessageWS.MessageProcessor - Processing message type: RentalTransactionProcessor From siteIID: bd20166f-b304-43bd-9ba3-56df4a429b74

Collection Manager log:

2015-06-18 00:00:41,363 [7] DEBUG NCR.AKPOS.CollectionsManager.NonReturnInvoincer - NonReturnConverter.Process() Start

Performance log:

2015-07-31 00:02:30,107	16	INFO 	SiteManager.SiteManager()	45.6379

And new data is being added to these files? You're not trying to import old files that aren't seeing new messages? What if you comment out the elasticsearch output and just keep the stdout output?
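If the files are old and no longer growing, note that the file input normally tails, starting at the end of each file. A minimal sketch of how one of the inputs above could be changed to read existing content from the top (the sincedb_path value is illustrative, just somewhere disposable for testing):

file {
  path => [ "D:/Projects/TestLogs/AKPOS/*" ]
  type => "AKPOS"
  # Read pre-existing files from the beginning instead of tailing.
  start_position => "beginning"
  # Track read positions in a throwaway file so a re-run re-imports
  # everything (illustrative path).
  sincedb_path => "D:/Projects/TestLogs/sincedb-akpos"
}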

Yes, the different types of logs are loaded. After loading these logs we need to create a dashboard for each individual type of log (one dashboard per log type, i.e. 3 dashboards for 3 types of logs). How do we create the visualizations for these dashboards and present them?

Hi,
I tried the same approach as discussed here but had no luck. Only my first file shows up in Kibana. Would you please help me in this regard?

@tashinfrus, please start a new thread for your question (and include more details).

Hi,
I am trying to integrate ELK in my application. I have multiple log files for dev, qa etc. and Iam configuring all the log files in the filebeat.yml. How can I distinguish this log files in kibana.? Now am getting all the logs in one place. I want this to be in separate places, so that i can distinguish the logs. Is there any way to do so? Please help me on this.
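A minimal sketch of one common way to do this, assuming Filebeat 5.x prospector syntax: give each prospector its own fields entry so every event carries a label you can filter on in Kibana (the paths and the env field name are illustrative):

filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/myapp/dev/*.log
    # Label every event from these files (field name is illustrative).
    fields:
      env: dev
    fields_under_root: true
  - input_type: log
    paths:
      - /var/log/myapp/qa/*.log
    fields:
      env: qa
    fields_under_root: true

Each dashboard or visualization can then filter on env:dev, env:qa, and so on, just like the type field discussed earlier in this thread.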