Logs with time difference only in seconds not ordered properly

I am using Filebeat to ship logs. Log events whose timestamps differ only by seconds do not appear in the correct order in Kibana. The latest logs currently appear first, but events that are only seconds apart are not sorted correctly. Below is my Filebeat configuration.

```yaml
- type: log
  enabled: true
  paths:
    - /opt/atlassian/jira/logs/catalina.out
  fields:
    log_type: catalina
    log_application: atlassian_jira
  multiline.pattern: '^[0-9]{2}-[[:alpha:]]{3}-[0-9]{4}'
  multiline.negate: true
  multiline.match: after
```

Sample log lines:

```
18-Apr-2020 21:01:50.455 WARNING [http-nio-8080-exec-97] com.sun.jersey.spi.container.servlet.WebComponent.filterFor
18-Apr-2020 21:01:55.891 WARNING [http-nio-8080-exec-149] com.sun.jersey.spi.container
```
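As a quick sanity check of how `multiline.pattern` classifies these lines, here is a small Python sketch (Python's `re` has no POSIX `[[:alpha:]]` class, so `[A-Za-z]` stands in for it, and the continuation line is a made-up stack-trace example):

```python
import re

# multiline.pattern from the Filebeat config; [A-Za-z] stands in for
# the POSIX class [[:alpha:]], which Python's re module does not support.
new_event = re.compile(r'^[0-9]{2}-[A-Za-z]{3}-[0-9]{4}')

header = "18-Apr-2020 21:01:50.455 WARNING [http-nio-8080-exec-97] ..."
# Hypothetical continuation line (e.g. a stack-trace frame):
continuation = "\tat com.example.Hypothetical.method(Hypothetical.java:42)"

print(bool(new_event.match(header)))        # True  -> starts a new event
print(bool(new_event.match(continuation)))  # False -> appended to the previous event
```

With `multiline.negate: true` and `multiline.match: after`, any line that does not match the pattern is appended to the preceding matching line, which is why multi-line stack traces stay attached to their timestamped header.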

Could you run the following Elasticsearch query to see what timestamps are being captured in the documents indexed by Filebeat?

```
GET filebeat-*/_search?sort=@timestamp:asc&filter_path=hits.hits._source.@timestamp
```

Shaunak

```json
{"hits": {"hits": [
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}},
  {"_source": {"@timestamp": "2020-04-11T16:07:40.000Z"}}
]}}
```

Thanks, that tells us the @timestamp values are being indexed at only second precision (the millisecond part is always .000), which indicates there's an issue with parsing somewhere between Filebeat and Elasticsearch, i.e. the Kibana side of things is working fine.
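To illustrate the precision the source timestamps actually carry, here is a minimal Python check (the format string is an assumption matching the catalina timestamp layout) showing that `18-Apr-2020 21:01:50.455` parses with its milliseconds intact, whereas the indexed documents all show `.000`:

```python
from datetime import datetime

# Timestamp as it appears in catalina.out
raw = "18-Apr-2020 21:01:50.455"

# %f accepts 1-6 fractional-second digits, so ".455" becomes 455000 microseconds
ts = datetime.strptime(raw, "%d-%b-%Y %H:%M:%S.%f")

print(ts.isoformat(timespec="milliseconds"))  # 2020-04-18T21:01:50.455
```

So the milliseconds exist in the raw logs; they are being lost somewhere in the pipeline before indexing.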

Next, could you post your complete filebeat.yml please (with any sensitive information redacted)?

Thanks,

Shaunak

Hi Shaunak,
I am pasting the Filebeat configuration below. We also use Logstash in front of Elasticsearch to parse the log's own timestamp instead of using the event creation time, so I am pasting the Logstash configuration as well.

Filebeat configuration

```yaml
- type: log
  enabled: true
  paths:
    - /opt/atlassian/confluence/logs/catalina.out
  fields:
    log_type: catalina
    log_application: atlassian_confluence
  multiline.pattern: '^[0-9]{2}-[[:alpha:]]{3}-[0-9]{4}'
  multiline.negate: true
  multiline.match: after
```

Logstash Configuration

```
filter {
  if [fields][log_type] == "catalina" {
    grok {
      # (?<logtimestamp2>...) names the timestamp field that the date filter below parses
      match => { "message" => "(?<logtimestamp2>%{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}))\s%{LOGLEVEL:level}\s+\[%{DATA:thread}\]\s+%{GREEDYDATA:log}" }
    }
    date {
      match => ["logtimestamp2", "dd-MMM-yyyy HH:mm:ss.SSS"]
      target => "datestamp2"
    }
  }
  else if [fields][log_type] == "Atlassian" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:logtimestamp3}\s+%{GREEDYDATA:thread}\s+%{LOGLEVEL:level}\s+%{GREEDYDATA:message}" }
    }
    date {
      match => ["logtimestamp3", "ISO8601"]
      target => "datestamp3"
    }
  }
  else if [fields][log_type] == "Atlassian_confluence" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:logtimestamp3}\s+%{LOGLEVEL:level}\s+%{GREEDYDATA:message}" }
    }
    date {
      match => ["logtimestamp3", "ISO8601"]
      target => "datestamp3"
    }
  }
}
```
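The catalina grok pattern's named timestamp capture feeds the `date` filter, so it's worth verifying it keeps millisecond precision. Here is a rough Python approximation of that pattern against a sample line from the catalina.out excerpt above (the expanded regex is a simplification, not grok's exact expansion of `%{MONTHDAY}`, `%{LOGLEVEL}`, etc.):

```python
import re

line = ("18-Apr-2020 21:01:50.455 WARNING [http-nio-8080-exec-97] "
        "com.sun.jersey.spi.container.servlet.WebComponent.filterFor")

# Simplified stand-in for the grok pattern; Python uses (?P<name>...) where
# grok/Oniguruma uses (?<name>...).
pattern = re.compile(
    r'(?P<logtimestamp2>\d{2}-\w{3}-\d{4} \d{2}:\d{2}:\d{2}\.\d{3})\s'
    r'(?P<level>\w+)\s+\[(?P<thread>[^\]]+)\]\s+(?P<log>.*)'
)

m = pattern.match(line)
print(m.group("logtimestamp2"))  # 18-Apr-2020 21:01:50.455
print(m.group("thread"))         # http-nio-8080-exec-97
```

Note also that the `date` filters write their result to `datestamp2`/`datestamp3` rather than the default `@timestamp` target, which may be worth double-checking if Kibana is sorting on `@timestamp`.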

Hi @shaunak,
Did you get a chance to review this?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.