Incorrect log format in Kibana

Hi. I use Filebeat, Kafka and Logstash to forward logs to Elasticsearch. I receive logs in Kibana in the following format:

April 24th 2017, 20:20:04.218	2017-01-16 19:11:28.114 STDIO [INFO] SystemOut: read
April 24th 2017, 20:20:04.218		at java.lang.Thread.run(Thread.java:745) []
April 24th 2017, 20:20:04.218		... 6 more
April 24th 2017, 20:20:04.218		at clojure.lang.AFn.run(AFn.java:22) []

You can see that every line of the original log file is displayed as a separate log entry in Kibana, each with its own assigned timestamp.
How can I get the logs in the correct format, so that these lines form a single log message for the time
April 24th 2017, 20:20:04.218?

filebeat.yml:

filebeat.prospectors:

- input_type: log
  paths:
    - /var/log/sampleLogs/*.log
  fields:
    application_name: ABC
  fields_under_root: true

registry_file: /var/lib/filebeat/registry

output.kafka:
  hosts: ["ip_address:port"]
  topic: 'logstash'
  max_message_bytes: 1000000

logstash.conf:

input {
  kafka {
    bootstrap_servers => "ip_address:port"
    topics => ["logstash"]
    codec => "json"
  }
}

filter {
  if [type] == "apache-access" {
    grok {
      match => [ "message", "%{COMBINEDAPACHELOG}" ]
    }
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}

Any suggestions?

Sounds like a job for filebeat multiline.
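Something like this in the prospector config should work (an untested sketch; adjust the pattern to whatever marks the start of a new event in your logs):

  multiline.pattern: '^\d{4}-\d{2}-\d{2}'  # a line beginning with a date starts a new event
  multiline.negate: true                   # lines NOT matching the pattern are continuations...
  multiline.match: after                   # ...and get appended after the matching line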

Thanks. I am trying to combine every line that doesn't start with a timestamp with the nearest preceding line that does start with one.
The sample timestamp is 2017-01-16 19:11:28.114
And this is filebeat.yml:

- input_type: log
  paths:
    - /var/log/sampleLogs/*.log
  fields:
    application_name: ABC
  fields_under_root: true
  multiline.pattern: '^\[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after

In logstash.conf I don't do any filtering.
But as a result I get only one message containing all the concatenated lines.

Output in Kibana:

April 24th 2017, 20:20:04.218	2017-01-16 19:11:28.114 STDIO [INFO] SystemOut: read
		                    at java.lang.Thread.run(Thread.java:745) []
                                    ... 6 more
                                    at clojure.lang.AFn.run(AFn.java:22) []
                                2017-01-16 19:11:29.129 STDIO [INFO] ......
                                Caused by: java.lang.RuntimeException:......
                                    at ......
                                    at ......
                                2017-01-16 19:11:30.425 STDIO [INFO] ......
                                2017-01-16 19:11:31.243 STDIO [INFO] ......

You can see the lines are combined correctly, but all of those logs now correspond to the single date April 24th 2017, 20:20:04.218 in Kibana.
How can I get a separate timestamp assigned to each log message in Kibana?

Expected output:

April 24th 2017, 20:20:04.218	2017-01-16 19:11:28.114 STDIO [INFO] SystemOut: read
		                    at java.lang.Thread.run(Thread.java:745) []
                                    ... 6 more
                                    at clojure.lang.AFn.run(AFn.java:22) []
April 24th 2017, 20:20:05.452   2017-01-16 19:11:29.129 STDIO [INFO] ......
                                Caused by: java.lang.RuntimeException:......
                                    at ......
                                    at ......
April 24th 2017, 20:20:06.412   2017-01-16 19:11:30.425 STDIO [INFO] ......
April 24th 2017, 20:20:06.926   2017-01-16 19:11:31.243 STDIO [INFO] ......

The lines in the log file start either with a timestamp, with four spaces followed by some text, with "java.lang.RuntimeException:", or with "Caused by:".

Why do you have a backslash in your pattern? The backslash escapes the opening bracket, so \[0-9] looks for a literal "[" instead of a digit class and never matches any line; with negate: true that means every line is treated as a continuation and appended to the previous event, which is why everything ends up in one message. Try with this:

'^[0-9]{4}-[0-9]{2}-[0-9]{2}'
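That is, with the other options unchanged, the multiline section of your prospector becomes:

  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after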

Thanks. By the way, what does the time (in my case April 24th 2017, 20:20:04.218) represent in Kibana? Is it the time when the log was received by Elasticsearch?

April 24th 2017, 20:20:04.218	2017-01-16 19:11:28.114 STDIO [INFO] SystemOut: read
		                    at java.lang.Thread.run(Thread.java:745) []

I am using an old (2017-01-16) log file now, but when working with a current log file, should the timestamp in Kibana be exactly the same as the one in the log file? Like this:

April 24th 2017, 20:20:04.218	2017-04-27 20:20:04.218 STDIO [INFO] SystemOut: read...

The timestamp added by Filebeat is the time the line was read from the log file. You can use Logstash or an Elasticsearch ingest pipeline to extract the timestamp from the log event itself.
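For example, a minimal Logstash filter could look like this (a sketch; log_timestamp is just an arbitrary field name used here):

filter {
  grok {
    # capture the leading "2017-01-16 19:11:28.114" part of the message
    match => { "message" => "^%{TIMESTAMP_ISO8601:log_timestamp}" }
  }
  date {
    # parse it and make it the event's @timestamp
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
}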

Ok, thanks. By the way, is it possible to read only the log files that have today's date in the file name?
For example, read log_file.log.2017-05-02 but ignore log_file.log.2017-05-02.gz

Matching just "today" is not possible. But it sounds like what you actually want is to skip the .gz files, and that you can do with the exclude_files config option.
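For example (a sketch based on your prospector above):

- input_type: log
  paths:
    - /var/log/sampleLogs/*
  # exclude_files takes a list of regular expressions; this skips compressed rotations
  exclude_files: ['\.gz$']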
