Hi,
I am using Logstash 6.5 and Filebeat 6.5, and I want to ship all Docker container logs to Logstash/Elasticsearch. The container logs are written by the Docker JSON log driver, so each line of a stack trace ends up as a separate JSON document. See below.
{"log":"[2018-11-09 15:21:46,920] WARN [ReplicaFetcher replicaId=3, leaderId=2, fetcherId=1] Error connecting to node kafka-logs-1.kafka-logs.default.svc.cluster.local:9092 (id: 2 rack: null) (org.apache.kafka.clients.NetworkClient)\n","stream":"stdout","time":"2018-11-09T15:21:46.930338319Z"}
{"log":"java.io.IOException: Can't resolve address: kafka-logs-1.kafka-logs.default.svc.cluster.local:9092\n","stream":"stdout","time":"2018-11-09T15:21:46.930371914Z"}
{"log":"\u0009at org.apache.kafka.common.network.Selector.doConnect(Selector.java:235)\n","stream":"stdout","time":"2018-11-09T15:21:46.930376969Z"}
{"log":"\u0009at org.apache.kafka.common.network.Selector.connect(Selector.java:214)\n","stream":"stdout","time":"2018-11-09T15:21:46.930381203Z"}
{"log":"\u0009at org.apache.kafka.clients.NetworkClient.initiateConnect(NetworkClient.java:864)\n","stream":"stdout","time":"2018-11-09T15:21:46.930385023Z"}
{"log":"\u0009at org.apache.kafka.clients.NetworkClient.ready(NetworkClient.java:265)\n","stream":"stdout","time":"2018-11-09T15:21:46.930388788Z"}
{"log":"[2018-11-09 15:21:47,013] INFO [ReplicaFetcher replicaId=3, leaderId=2, fetcherId=1] Retrying leaderEpoch request for partition logging-4 as the leader reported an error: UNKNOWN_SERVER_ERROR (kafka.server.ReplicaFetcherThread)\n","stream":"stdout","time":"2018-11-09T15:21:47.01591875Z"}
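To illustrate the format (a minimal Python sketch, not part of my pipeline): each raw line parses as an independent JSON document, and the leading tab of a stack-trace continuation line only appears after the `log` field is decoded, not at the start of the raw line itself.

```python
import json

# Two consecutive lines from the Docker JSON log driver, as above.
lines = [
    '{"log":"java.io.IOException: Can\'t resolve address: kafka-logs-1.kafka-logs.default.svc.cluster.local:9092\\n","stream":"stdout","time":"2018-11-09T15:21:46.930371914Z"}',
    '{"log":"\\u0009at org.apache.kafka.common.network.Selector.doConnect(Selector.java:235)\\n","stream":"stdout","time":"2018-11-09T15:21:46.930376969Z"}',
]

for line in lines:
    event = json.loads(line)
    # The tab (\u0009) exists only inside the decoded "log" field;
    # the raw line itself starts with '{'.
    print(repr(event["log"][:4]))
```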
I tried to use Filebeat with its multiline feature to stitch the stack trace back together and write it as a single message to Elasticsearch. For testing purposes I read the logs from a file and forward them to Logstash, using the following filebeat.yml configuration:
# filebeat.yml
filebeat.prospectors:
- type: log
  paths:
    - '/home/ubuntu/logstash/someapp.log'
  multiline.pattern: '^\\t'
  multiline.negate: false
  multiline.match: after

processors:
- decode_json_fields:
    fields: ["message"]
    target: ""
    overwrite_keys: true

output.logstash:
  hosts: ["localhost:5044"]

logging.to_files: true
logging.to_syslog: false
Filebeat is able to strip out the stream and time fields from the logs, but multiline is not working. See the log and message attributes below:
{
    "beat" => {
        "hostname" => "playground",
        "name" => "playground",
        "version" => "6.5.1"
    },
    "@timestamp" => 2018-12-03T10:05:12.758Z,
    "log" => "\tat org.apache.kafka.clients.NetworkClient.ready(NetworkClient.java:265)\n",
    "offset" => 918,
    "message" => "{\"log\":\"\\u0009at org.apache.kafka.clients.NetworkClient.ready(NetworkClient.java:265)\\n\",\"stream\":\"stdout\",\"time\":\"2018-11-09T15:21:46.930388788Z\"}",
    "host" => {
        "name" => "playground"
    },
    "source" => "/home/ubuntu/logstash/someapp.log",
    "prospector" => {
        "type" => "log"
    },
    "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
    "time" => "2018-11-09T15:21:46.930388788Z",
    "stream" => "stdout",
    "input" => {
        "type" => "log"
    },
    "@version" => "1"
}
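My guess at why multiline does nothing here (a minimal check, assuming the multiline pattern is applied to the raw line before decode_json_fields runs): the raw line starts with `{`, and the tab is encoded as `\u0009` inside the JSON string, so neither `^\t` nor the literal `^\\t` from my config can ever match.

```python
import re

# One raw log line exactly as it appears in the file.
raw = '{"log":"\\u0009at org.apache.kafka.clients.NetworkClient.ready(NetworkClient.java:265)\\n","stream":"stdout","time":"2018-11-09T15:21:46.930388788Z"}'

print(re.match(r'^\t', raw))    # None: the raw line starts with '{'
print(re.match(r'^\\t', raw))   # None: no literal backslash-t at the start either
```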
Can someone help me get the stack trace from the above logs into Logstash/Elasticsearch as a single message?
Regards,
Sgarap