How to push logs to Elasticsearch with Filebeat?

Here is my filebeat.yml:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - ../typescript/rate-limit-test/logs/*.log
  json.message_key: "message"
  json.keys_under_root: true
  json.overwrite_keys: true
  scan_frequency: 1s

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

logging.level: debug

output.elasticsearch:
  hosts: ["34.97.108.113:9200"]
  index: "filebeat-%{+yyyy-MM-dd}"
setup.template:
  name: 'filebeat'
  pattern: 'filebeat-*'
  enabled: true
setup.template.overwrite: true
setup.template.append_fields:
- name: time
  type: date

processors:
  - drop_fields:
      fields: ["agent","host","ecs","input","log"]

setup.ilm.enabled: false

I changed scan_frequency, but Elasticsearch still isn't receiving the logs any faster.
How can I get logs into Elasticsearch instantly?
Please help me.

Ingestion latency is affected by many factors, including CPU and memory limits on both the Beat and the Elasticsearch nodes, the network connection between them, and so on. You'll never see logs in the search index instantly, but there may be ways to reduce the latency.
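One detail worth knowing: Elasticsearch only makes newly indexed documents visible to search after an index refresh, which by default happens roughly once per second, so some delay is built in even when everything else is fast. If you ever want to check or adjust it, the refresh interval is an ordinary index setting and could be carried in the template alongside the shard count you already set; the value below is purely illustrative:

setup.template.settings:
  index.number_of_shards: 1
  # Assumption/illustration: controls how often newly indexed documents
  # become searchable. The Elasticsearch default is about 1s; lowering it
  # trades indexing throughput for faster search visibility.
  index.refresh_interval: "1s"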

scan_frequency specifies how often Filebeat scans its input paths for new files, but it doesn't affect how fast the data in those files is processed once they're being read. Some diagnostic questions to start with when troubleshooting ingestion speed: how much delay are you observing between when the logs are written and when they appear in Elasticsearch? Is the delay steady, or does it vary depending on the time of day or other factors? How much log data (on average) are you trying to transmit? What is the network bandwidth between your Beats and your Elasticsearch server?
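If the bottleneck turns out to be on the Filebeat side rather than the network or the cluster, the settings that control how eagerly Filebeat re-reads files and flushes events to the output are worth a look. The sketch below shows where they would sit in your existing filebeat.yml; the specific values are assumptions for illustration, not recommendations:

filebeat.inputs:
- type: log
  paths:
    - ../typescript/rate-limit-test/logs/*.log
  scan_frequency: 1s   # how often new files matching the paths are discovered
  backoff: 1s          # how long a harvester waits before re-checking a file after hitting EOF
  max_backoff: 2s      # upper bound on that wait

queue.mem:
  flush.min_events: 512   # flush smaller batches instead of waiting for a full one
  flush.timeout: 1s       # ...and flush at least this often regardless

output.elasticsearch:
  hosts: ["34.97.108.113:9200"]
  bulk_max_size: 512      # smaller bulk requests can lower per-event latency, at some throughput cost
  worker: 1

Whether any of these help depends entirely on where the delay actually is, which is why the diagnostic questions above come first.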
