How to configure Filebeat to store logs in Elasticsearch correctly

Hi there, I am a newbie with this stack, so sorry for the dummy questions. In my backend I store logs like this:

{"context":{"userId":"66ffe5e929e84af79a89dde8"},"level":"info","message":"getProfile: user found","timestamp":"2025-02-20T13:50:41.502Z"}
{"context":{"userId":"66ffe5e929e84af79a89dde8"},"level":"info","message":"getById: user found","timestamp":"2025-02-20T13:50:44.148Z"}
{"context":{"userId":"66ffe5e929e84af79a89dde8"},"level":"info","message":"getProfile: user found","timestamp":"2025-02-20T13:50:44.149Z"}

I can see that the timestamp is different for every entry, but when Filebeat ships these and stores them in Elasticsearch (I don't use Logstash, Filebeat sends directly to Elasticsearch), the Kibana page shows each whole log line in a single field (screenshot attached).

Screenshot 2025-02-21 at 22.38.42

Here is my Filebeat config file (ChatGPT generated):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/www/api/logs/*.log
  multiline:  # Important if you have stack traces
    pattern: '^\[\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\]' # Adjust if your timestamp format is different
    negate: true
    match: after
  close_inactive: 5m # Keep this to prevent Filebeat from closing if log is not actively written
  close_renamed: false
  close_removed: false

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "nestjs-logs-%{+yyyy.MM.dd}"

setup.template:
  name: "nestjs-logs"
  pattern: "nestjs-logs-*"

I would be very thankful for a clear explanation of how to configure this correctly, or of what I am doing wrong. Thank you.

Hi @Mirali_Rafiyev, welcome to our community!

This link might give you a broader perspective:

Dec 4th 2022: [EN] Ingesting JSON logs with Elastic-Agent (and/or Filebeat)

It primarily covers JSON parsing in the initial steps.
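
In short, right now Filebeat is treating each line as plain text, so the whole JSON document lands in the `message` field, which is what you see in Kibana. Since your logs are already one JSON object per line, you can ask the input to decode them. Below is a minimal sketch, assuming the `log` input type you already use and that every line is valid JSON; the multiline block is dropped because NDJSON entries never span lines. Option names vary a bit between Filebeat versions, so treat this as a starting point rather than a drop-in config:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/www/api/logs/*.log
  # Decode each line as a JSON document instead of shipping it as raw text
  json.keys_under_root: true   # put the parsed fields (level, message, context, timestamp) at the top level of the event
  json.overwrite_keys: true    # let fields from the log line (e.g. message) replace Filebeat's own fields of the same name
  json.add_error_key: true     # add error.message to the event if a line fails to parse
  close_inactive: 5m

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "nestjs-logs-%{+yyyy.MM.dd}"

setup.template:
  name: "nestjs-logs"
  pattern: "nestjs-logs-*"

Two follow-up notes: on recent Filebeat versions the `log` input is deprecated in favour of `filestream`, where the equivalent setting is an `ndjson` parser under `parsers:`; and if you also want Kibana's time field (`@timestamp`) to reflect the `timestamp` in your log line rather than the ingest time, that is usually handled with Filebeat's `timestamp` processor or an Elasticsearch ingest pipeline. The article linked above walks through the JSON parsing part in more detail.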