Vipin_Gaur
(Vipin Gaur)
September 16, 2019, 8:31am
1
All,
We are successfully shipping logs to ELK, and the actual app logs are showing up in Kibana, but in a different order. Basically, I want the log lines shown in Kibana to be in the same order as I would see them in the actual log files.
Hey @Vipin_Gaur, it looks like the message field contains the actual date and time that you wish to use to order your events. If it isn't already, you'll want to extract this date and time from the message field into a dedicated field. After doing so, you can configure that field as the Time Filter field name per https://www.elastic.co/guide/en/kibana/current/tutorial-define-index.html#_create_an_index_pattern_for_time_series_data and your documents will then be ordered this way.
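A minimal sketch of what that extraction could look like in a Logstash filter block. The pattern and the field name log_timestamp are placeholders, not taken from the original config; adjust the grok pattern to match your actual log line format:

```conf
filter {
  # Pull the leading timestamp out of the message into its own field.
  # "log_timestamp" is an assumed name; rename it as you like.
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:log_timestamp}\]" }
  }
  # Parse the extracted value and write it into @timestamp, which
  # Kibana can then use as the Time Filter field for ordering.
  date {
    match => [ "log_timestamp", "ISO8601" ]
  }
}
```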
Vipin_Gaur
(Vipin Gaur)
September 17, 2019, 6:14am
3
How do I extract the date and time from the message field into a dedicated field?
This is my .conf file:
input {
  beats {
    port => 5044
  }
}

filter {
  if [log_type] == "json" {
    mutate {
      rename => { "message" => "raw_data" }
    }
    json {
      source => "raw_data"
    }
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  }
  if [fields][log_type] == "nojson" {
    grok {
      match => { "message" => [
        "(?m)%{LOGLEVEL:level}\s*\[%{TIMESTAMP_ISO8601:timestamp}\]\s*\[(?<thread>[\w._.-]+)\]\s*(?<logger>[\w\.\w]+)\s*(?<message>[^{]*)",
        "(?m)%{LOGLEVEL:level}\s*\[%{TIMESTAMP_ISO8601:timestamp}\]\s*\[(?<thread>[\w\-\s./]+)\]\s*%{UUID:requestid}(?<message>[^{]*)"
      ] }
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9160"]
    index => "logstash-api-logs-%{+YYYY-MM-dd}"
    manage_template => false
  }
  # file { path => "/opt/elk_data/devops/devops-%{+YYYY-MM-dd}.log" }
}
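One thing worth noting about the config above: the nojson branch's grok patterns capture a timestamp field, but no date filter ever parses it, so @timestamp for those events still reflects ingest time rather than the logged time. A minimal sketch of the likely missing piece, assuming the captured value is ISO8601 (verify against your log format):

```conf
filter {
  if [fields][log_type] == "nojson" {
    # Parse the grok-extracted "timestamp" into @timestamp so these
    # events sort by their logged time, matching the json branch.
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  }
}
```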
Hey @Vipin_Gaur, I'd recommend posting your most recent reply as a new post in the Logstash topic. You'll get a more informed response over there.
system
(system)
Closed
October 15, 2019, 2:18pm
5
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.