Being very new to ELK, I have borrowed the following Logstash config, which seems to work in the main. Annoyingly, though, the @timestamp values are not the date/times from the WebSphere logs but the date/time at which the index was rebuilt. How do I get @timestamp to show the actual log date/time, as the timestamp field does?
Incidentally, timestamp has %{tz_num} appended to it, which I would also like to remove.
Here is the config:
input {
  file {
    path => [ "/opt/logs/SystemOut*.log" ]
    start_position => "beginning"
    type => "websphere"
    # important! logstash only reads logs from files touched in the last 24 hours
    # 8640000 = 100 days
    sincedb_path => "/dev/null"
    ignore_older => "8640000"
  }
}
filter {
  if [type] =~ "websphere" {
    grok {
      match => ["source", "%{GREEDYDATA}/%{GREEDYDATA:server_name}/SystemOut.log"]
    }
    grok {
      match => ["message", "[%{DATA:wastimestamp} %{WORD:tz}] %{BASE16NUM:was_threadID} (?<was_shortname>\b[A-Za-z0-9$]{2,}\b) %{SPACE}%{WORD:was_loglevel}%{SPACE} %{GREEDYDATA:was_msg}"]
    }
    grok {
      match => ["was_msg", "(?<was_errcode>[A-Z0-9]{9,10})[:,\s\s]%{GREEDYDATA:was_msg}"]
      overwrite => [ "was_msg" ]
      tag_on_failure => [ ]
    }
    translate {
      field => "tz"
      destination => "tz_num"
      dictionary => [
        "CET", "+0100",
        "CEST", "+0200",
        "EDT", "-0400"
      ]
    }
    translate {
      field => "was_errcode"
      destination => "was_application"
      regex => "true"
      exact => "true"
      dictionary => [
        "CLFRW", "Search",
        "CLFRA", "Activities",
        "CLFRS", "Blogs",
        "CLFRL", "Bookmarks",
        "CLFRK", "Common",
        "CLFRM", "Communities",
        "EJPVJ", "Files",
        "CLFRV", "Forums",
        "CLFRQ", "Homepage",
        "CLFRP", "Installer",
        "CLFRO", "Configuration",
        "CLFRR", "Notifications",
        "CLFNF", "Portlet",
        "CLFRT", "FedSearch",
        "CLFWX", "News",
        "CLFWY", "Event",
        "CLFWZ", "Widget",
        "CLFRN", "Profiles",
        "CLFWY", "User",
        "EJPIC", "Portal",
        "EJPVJ", "Wikis",
        "ADMS", "Websphere",
        "SECJ", "Security"
      ]
    }
    mutate {
      replace => ['timestamp', '%{wastimestamp} %{tz_num}']
    }
    date {
      match => ["timestamp", "MM/dd/YY HH:mm:ss:SSS Z", "M/d/YY HH:mm:ss:SSS Z"]
      tag_on_failure => [ ]
    }
    mutate {
      remove_field => [ 'tz', 'tz_num', 'wastimestamp' ]
    }
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
  stdout { codec => rubydebug }
}
I have tried changing the date filter to the following, but that produces the same result:
date {
  match => ["timestamp", "MM/dd/YY HH:mm:ss:SSS Z", "M/d/YY HH:mm:ss:SSS Z"]
  target => "@timestamp"
  tag_on_failure => [ ]
}
Here is an example event from Logstash:
{
    "was_loglevel" => "I",
    "was_msg" => " The trace state has changed. The new trace state is *=info.",
    "message" => "[2/7/17 7:02:28:564 GMT] 000001b1 ManagerAdmin I TRAS0018I: The trace state has changed. The new trace state is *=info.",
    "type" => "websphere",
    "was_shortname" => "ManagerAdmin",
    "tags" => [
        [0] "_grokparsefailure"
    ],
    "was_threadID" => "000001b1",
    "path" => "/opt/logs/SystemOut.log",
    "@timestamp" => 2017-02-09T07:59:23.656Z,
    "@version" => "1",
    "host" => "f2730df8227c",
    "was_errcode" => "TRAS0018I",
    "timestamp" => "2/7/17 7:02:28:564 %{tz_num}"
}
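For comparison, based on the log line in the message field above, what I would expect (if I understand the date filter correctly) is for @timestamp to reflect the event time from the log, something like:
    "@timestamp" => 2017-02-07T07:02:28.564Z
rather than the time the index was rebuilt (2017-02-09T07:59:23.656Z above).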
I'm pretty new to this, so would you mind explaining your answer? I'm trying to learn the format of this file and understand what can be changed to achieve which result.
Thanks in advance.