Best practice for JSON logging (e.g. timestamp processing)

Hi,
I'm quite new to the elastic stack so please excuse the question if it doesn't make any sense.
I've got an application (let's call it JBoss) which runs in a Docker container (all components in my scenario do), and it produces JSON logs. As far as I understand, there is no need for a Logstash instance, so I configured the following:
filebeat.yml

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true
      hints.default_config.enabled: false

docker-compose.yml
JBoss (further details not relevant I guess)

    labels:
      co.elastic.logs/enabled: true
      co.elastic.logs/json.keys_under_root: true
      co.elastic.logs/json.overwrite_keys: true
      co.elastic.logs/json.message_key: message
      co.elastic.logs/json.add_error_key: true

This is one log entry from JBoss:

{"timestamp":"2021-09-17T09:15:24.33Z","sequence":639,"loggerClassName":"org.jboss.logging.Logger","loggerName":"org.keycloak.transaction.JtaTransactionWrapper","level":"DEBUG","message":"JtaTransactionWrapper  commit","threadName":"Timer-2","threadId":163,"mdc":{},"ndc":"","hostName":"83add8eceadd","processName":"jboss-modules.jar","processId":172}

The output in Kibana looks like this (shortened):

{
  "_index": "filebeat-7.14.0-2021.09.17-000001",
  "_type": "_doc",
  "_id": "tzAq83sBuczAaH_z4Yww",
  "_score": 1,
  "_source": {
    "@timestamp": "2021-09-17T09:51:16.728Z",
    "ndc": "",
    "loggerClassName": "org.jboss.logging.Logger",
    "log": {
      "offset": 2919243,
      "file": {
        "path": "/var/lib/docker/containers/83add8eceadd31f392e55c4af15cee46e409304bd9a874953442393c4ca1b28f/83add8eceadd31f392e55c4af15cee46e409304bd9a874953442393c4ca1b28f-json.log"
      }
    },
    "threadId": 163,
    "timestamp": "2021-09-17T09:51:16.727Z",
    "message": "Executed scheduled task AbstractLastSessionRefreshStoreFactory$$Lambda$2041/0x0000000841731840",
    "sequence": 1842,
    "host": {
      "architecture": "x86_64",
      "os": {
        "codename": "Core",
        "type": "linux",
        "platform": "centos",
        "version": "7 (Core)",
        "family": "redhat",
        "name": "CentOS Linux",
        "kernel": "5.10.47-linuxkit"
      },
      "id": "3c2fb2b6af196dc020fd58ec005618e7",
      "containerized": true,
      "ip": [
        "172.20.0.5"
      ],
      "hostname": "de5d54c85632",
      "name": "de5d54c85632"
    },
    "processName": "jboss-modules.jar",
    "level": "DEBUG",
    "input": {
      "type": "container"
    },
    "loggerName": "org.keycloak.services.scheduled.ScheduledTaskRunner",
    "hostName": "83add8eceadd",
    "stream": "stdout",
    "processId": 171
  }
}

Question:
My question is now how to deal with, e.g., the timestamp field from the JBoss log, which differs slightly from the @timestamp field. Do I need to copy the timestamp field to the @timestamp field, given that one is the time the event was processed and the other is the real timestamp from the log entry? If so, how do I achieve this?

You could try the copy_fields processor (Copy fields | Filebeat Reference [7.14] | Elastic), where you copy timestamp to @timestamp.
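A minimal sketch of that approach in filebeat.yml, assuming the processor runs after the container JSON has been decoded (the field name timestamp comes from the JBoss log above). Note the Filebeat docs state copy_fields cannot replace a field that already exists, so fail_on_error: false is set here to keep the event flowing either way:

```yaml
processors:
  - copy_fields:
      fields:
        - from: timestamp
          to: '@timestamp'
      # @timestamp already exists on every event, so this copy may be
      # rejected by Filebeat; don't drop the event if that happens.
      fail_on_error: false
      ignore_missing: true
```

If copy_fields refuses to overwrite @timestamp in your version, the timestamp processor (which parses a field and sets @timestamp directly) is usually the more reliable route.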

Yes, you should overwrite @timestamp with the time from the log. The @timestamp field should reflect when the log event actually happened; the ECS field event.ingested records when the log was actually ingested into Elasticsearch.
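A hedged sketch of doing that in filebeat.yml with Filebeat's timestamp processor (beta as of 7.14). The Go-style layouts below are assumptions matched to the sample log line's format, so verify them against your actual timestamps:

```yaml
processors:
  - timestamp:
      field: timestamp
      layouts:
        # Go reference-time layouts; the second one covers fractional seconds
        # as seen in "2021-09-17T09:15:24.33Z".
        - '2006-01-02T15:04:05Z'
        - '2006-01-02T15:04:05.999Z'
      test:
        # Filebeat validates the layouts against these samples at startup.
        - '2021-09-17T09:15:24.33Z'
  - drop_fields:
      # Optional: remove the now-redundant original field.
      fields: ['timestamp']
      ignore_missing: true
```

The test entries are handy because Filebeat will refuse to start if none of the layouts can parse them, which catches format mistakes early.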