Kibana is not discovering my new data

Hi Team,

I had an ELK stack up and running and all was going well. Today I updated my Logstash conf file to read from a Samba network drive, and since then Kibana has stopped discovering the new values.

I verified the following:

The item is parsed by Logstash, as shown below, but there is an error parsing my logtime:
{
    "message" => "2016-09-21 20:34:48,882 INFO [bravura.commons.security.SecurityLog] (http-/0.0.0.0:8080-8) Login user not found: nipun",
    "@version" => "1",
    "@timestamp" => "2016-09-21T15:04:49.379Z",
    "path" => "////guvctapfil03.bravurasolutions.local//LOG_DIR//security.log",
    "host" => "GURDESKTOP207",
    "type" => "securitylog",
    "year" => "2016",
    "month" => "09",
    "day" => "21",
    "hour" => "20",
    "min" => "34",
    "sec" => "48",
    "msec" => "882",
    "Loglevel" => "INFO",
    "JavaClass" => "bravura.commons.security.SecurityLog",
    "HostName" => "http-/0.0.0.0:8080-8",
    "Word3" => "Login",
    "Status" => "user",
    "Word5" => "not",
    "Word6" => "found",
    "User_Name" => "nipun",
    "tags" => [
        [0] "login",
        [1] "timestamp-matched"
    ],
    "logtime" => "2016-09-21T20:34:48.882Z"
}
Failed parsing date from field {:field=>"logtime", :value=>"%{year}-%{month}-%{day} %{hour}:%{min}:%{sec},%{msec}", :exception=>"Invalid format: \"%{year}-%{month}-%{day} %{hour}:...\"", :config_parsers=>"yyyy-MM-dd HH:mm:ss,SSS", :config_locale=>"default=en_IN", :level=>:warn}
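
The warning itself points at the cause: for events where the grok pattern does not match, sprintf references like %{year} are never substituted, so the mutate filter builds a literal logtime string of "%{year}-%{month}-%{day} ...", which the date filter then cannot parse. A minimal sketch of one way to avoid this, assuming the grok and date settings stay as in the config below, is to only build and parse logtime when grok actually matched:

if "login" in [tags] and "_grokparsefailure" not in [tags] {
  # grok matched, so %{year} etc. now hold real values
  mutate {
    add_field => { "logtime" => "%{year}-%{month}-%{day} %{hour}:%{min}:%{sec},%{msec}" }
  }
  date {
    match => ["logtime", "yyyy-MM-dd HH:mm:ss,SSS"]
    timezone => "UTC"
    add_tag => ["timestamp-matched"]
    target => "logtime"
  }
}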

The item is searchable in my Elasticsearch index, as shown below:

bash-4.3$ curl -XGET 'http://192.168.180.199:9200/login/_search?q=User_Name:nipun&pretty=true'
{
"took" : 913,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 1,
"max_score" : 5.94876,
"hits" : [ {
"_index" : "login",
"_type" : "securitylog",
"_id" : "AVdNRyHiNXuwS9poQOVR",
"_score" : 5.94876,
"_source" : {
"message" : "2016-09-21 20:34:48,882 INFO [bravura.commons.security.SecurityLog] (http-/0.0.0.0:808d: nipun",
"@version" : "1",
"@timestamp" : "2016-09-21T15:04:49.379Z",
"path" : "////guvctapfil03.bravurasolutions.local//LOG_DIR//security.log",
"host" : "GURDESKTOP207",
"type" : "securitylog",
"year" : "2016",
"month" : "09",
"day" : "21",
"hour" : "20",
"min" : "34",
"sec" : "48",
"msec" : "882",
"Loglevel" : "INFO",
"JavaClass" : "bravura.commons.security.SecurityLog",
"HostName" : "http-/0.0.0.0:8080-8",
"Word3" : "Login",
"Status" : "user",
"Word5" : "not",
"Word6" : "found",
"User_Name" : "nipun",
"tags" : [ "login", "timestamp-matched" ],
"logtime" : "2016-09-21T20:34:48.882Z"
}
} ]
}
}
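
To see which timestamp the time picker is actually filtering on, it can help to query the index with an explicit time range. A hypothetical check against the same index (the field and window here are assumptions; Kibana filters on whichever time field the index pattern was configured with, typically @timestamp):

bash-4.3$ curl -XGET 'http://192.168.180.199:9200/login/_search?pretty=true' -d '
{
  "query": {
    "range": {
      "@timestamp": { "gte": "now-15m" }
    }
  }
}'

If a document shows up here but not under "Last 15 minutes" in Kibana, the index pattern is most likely using a different time field (for example logtime).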

The Logstash configuration file is as follows:

input {
  file {
    type => "securitylog"
    #path => ["C:/Users/vikumar/Demo/Logs/Security/*"]
    #path => ["////gurdesktop243//log_VSTrunk//security.log"]
    path => ["////guvctapfil03.bravurasolutions.local//LOG_DIR//security.log"]
    start_position => "beginning"
  }
}

filter {
  mutate {
    remove_tag => [ "_grokparsefailure", "login" ]
  }

  if [type] == "securitylog" {
    # Filter for User Login status
    grok {
      match => { "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day}\s*%{HOUR:hour}:%{MINUTE:min}:%{SECOND:sec},%{NUMBER:msec}\s*%{WORD:Loglevel}\s*\[%{DATA:JavaClass}\]\s*\(%{DATA:HostName}\)\s*%{WORD:Word3}\s*%{WORD:Status}\s%{WORD:Word5}\s%{WORD:Word6}\:\s%{WORD:User_Name}\s?" }
      add_tag => "login"
    }
    mutate {
      add_field => { "logtime" => "%{year}-%{month}-%{day} %{hour}:%{min}:%{sec},%{msec}" }
    }
    date {
      match => ["logtime", "yyyy-MM-dd HH:mm:ss,SSS"]
      timezone => "UTC"
      add_tag => ["timestamp-matched"]
      target => "logtime"
    }
  }
}

output {
  if "login" in [tags] {
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => ["192.168.180.179:9200"]
      index => "login"
    }
  }
  else { }
  #stdout { codec => rubydebug }
}

I found something very strange: my data is discoverable when I choose the time filter "This week" instead of "Last 15 minutes".

But this is wrong, because the ELK stack is adding +5:30 hours to my log events. For example, if Logstash says "logtime" => "2016-09-21T22:09:51.472Z", Kibana shows September 22nd 2016, 03:39:51.472.

Why is this so? It's strange.

Hi Vinay,

I'm not quite sure. This may be more of a question for the Logstash forum.

Kibana indeed only shows what fits in the time range that's configured at the top.

Thomas

Thanks for the response. You are right: Kibana only shows what fits in the time range. As I shared above, in my case +5:30 hours were somehow being added to my logtime, so I was thinking the data was not discoverable and that it was a known bug.
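
For anyone who lands here with the same offset: it is consistent with the timezone => "UTC" setting in the date filter. If the application writes its log lines in local IST, telling the date filter the string is UTC stores the event 5:30 hours ahead of its true time, and Kibana then renders that stored time in the browser's local timezone on top. A sketch of the likely fix, assuming the logs really are written in IST:

date {
  match => ["logtime", "yyyy-MM-dd HH:mm:ss,SSS"]
  # the source string is IST; Logstash stores it internally as UTC,
  # and Kibana then displays it correctly in the browser's local time
  timezone => "Asia/Kolkata"
  add_tag => ["timestamp-matched"]
  target => "logtime"
}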
