[7.0.1] Journalbeat not processing Elasticsearch Log4j ESJsonLayout logs properly?

This entire code block is from the "message" field that Journalbeat has shipped from an Elasticsearch node, and it ends with a comma:

{
"type": "server",
"timestamp": "2019-06-06T22:50:40,854+0000",
"level": "DEBUG",
"component": "o.e.a.b.TransportShardBulkAction",
"cluster.name": "es",
"node.name": "es-node",
"cluster.uuid": "Y0cqVUMKSu-j9fULMbjJWA",
"node.id": "k5O86uz8QZG-iKLj-u4TAQ",
"message": "[journalbeat-2019.06][0] failed to execute bulk item (index) index {[journalbeat-2019.06][_doc][xdP8LmsBVsAdhdNRsRVR], source[{\"systemd\":{\"transport\":\"journal\",\"invocation_id\":\"ab868a55d4b74affa8849e67ff39ca2c\",\"cgroup\":\"/system.slice/docker.service\",\"unit\":\"docker.service\",\"slice\":\"system.slice\"},\"container\":{\"log\":{\"tag\":\"kibana\"},\"id\":\"d0a26fb1dec0adafe40f6089682ce764caaffabace9136a35dbc223feb1eae50\",\"name\":\"kibana\",\"id_truncated\":\"d0a26fb1dec0\"},\"journald\":{\"custom\":{\"selinux_context\":\"unconfined\\n\"}},\"process\":{\"executable\":\"/usr/bin/dockerd-ce\",\"pid\":978,\"cmd\":\"/usr/bin/dockerd -H fd:// --containerd=/run/containerd/containerd.sock\",\"name\":\"dockerd\",\"uid\":0,\"capabilites\":\"3fffffffff\"},\"tags\":[\"production\",\"journalbeat\",\"beats_input_codec_plain_applied\",\"_grokparsefailure\"],\"syslog\":{\"priority\":6,\"identifier\":\"kibana\"},\"message\":{\"tags\":[\"api\"],\"statusCode\":200,\"message\":\"GET /api/status 200 22ms - 9.0B\",\"pid\":1,\"type\":\"response\",\"method\":\"get\",\"req\":{\"method\":\"get\",\"userAgent\":\"kibana\",\"headers\":{\"cache-control\":\"no-cache\",\"pragma\":\"no-cache\",\"content-length\":\"0\",\"user-agent\":\"FortiGate (FortiOS 6.0) Chrome/ Safari/\",\"host\":\"kibana11:5601\",\"keep-alive\":\"timeout=15\",\"connection\":\"Keep-Alive\"},\"remoteAddress\":\"kibana\",\"url\":\"/api/status\"},\"res\":{\"statusCode\":200,\"responseTime\":22,\"contentLength\":9},\"@timestamp\":\"2019-06-06T22:50:30Z\"},\"ecs\":{\"version\":\"1.0.0\"},\"@version\":\"1\",\"event\":{\"created\":\"2019-06-06T22:50:30.741Z\"},\"agent\":{\"version\":\"7.0.1\",\"id\":\"832fe381-ffd8-455a-88e4-f32db02286d2\",\"hostname\":\"e00be7c32821\",\"type\":\"journalbeat\",\"name\":\"kibana-2\",\"ephemeral_id\":\"c98891c5-05d8-41f4-8813-32bb03f9fc2a\"},\"type\":\"journalbeat\",\"host\":{\"boot_id\":\"3cca70ccfe014f64bcb4140343cf976e\",\"hostname\":\"kibana-2\",\"id\":\"59f5129737e54801b7ced2ff9f5c55a8\",\"name\":\"kibana-2\"},\"@timestamp\":\"2019-06-06T22:50:30.724Z\"}]}",

All layout.type entries in log4j2.properties are set to ESJsonLayout.
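For reference, this is roughly what the console appender looks like with that layout (a minimal sketch based on the 7.x log4j2.properties format; the appender name and type_name value here are illustrative, not copied from our config):

# console appender writing JSON lines to stdout, which systemd captures into the journal
appender.console.type = Console
appender.console.name = console
appender.console.layout.type = ESJsonLayout
appender.console.layout.type_name = server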

We were trying to figure out how to combine Java stack trace error logs into a single event, and stumbled on this comma issue when the Logstash json filter wasn't working.
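For context, the filter we were experimenting with was along these lines (a minimal sketch; the target field and tag shown here are placeholders, not our actual pipeline):

filter {
  # parse the JSON document carried in the journal "message" field
  json {
    source => "message"
    target => "es_log"                      # hypothetical target field
    tag_on_failure => ["_jsonparsefailure"] # default failure tag for the json filter
  }
}

With the trailing comma the payload is not valid JSON, so the filter would presumably tag the event with _jsonparsefailure instead of parsing it.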

Journalbeat does not do any manipulation to the message retrieved from the systemd journal, except for lowercasing field names.

You could try checking the event using journalctl to see the original entry.
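For example, something like this (the unit name is an assumption):

# dump the raw journal entries, including the full MESSAGE field, as pretty-printed JSON
journalctl -u elasticsearch.service -o json-pretty | less

That shows exactly what systemd stored, before Journalbeat or Logstash touch it.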

I am not sure what comma you are referring to. The key-value pairs in the event are separated by commas; I assume the partial event you have shared contains more fields.

This is from journalctl:

May 26 12:51:34 node elasticsearch[1182]: {"type": "server", "timestamp": "2019-05-26T12:51:34,804+0000", "level": "WARN", "component": "o.e.x.m.e.l.LocalExporter", "cluster.name": "cluster", "node.name": "node", "cluster.uuid": "Y0cqVUMKSu-j9fULMbjJWA", "node.id": "rqXFKinOTjy6NMf5azYIXw",  "message": "unexpected error while indexing monitoring document" , 

There is a comma (,) at the end of the message (the same comma shown in Kibana).

The Elasticsearch log4j2.properties is set to use ESJsonLayout.