I'm trying out the Elastic Agent feature in a test cluster, but I haven't been able to figure out how to ship custom logs with it.
In the integration, I've specified the log path to be `path/to/my/logs` (see the policy below).
I was able to install and start an agent (with the correct policy) on my Mac. I then created the `path/to/my/logs` folders relative to the directory where I installed the agent and put a few ECS-formatted log files into them.
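Roughly, the setup steps looked like this (the file name `app.log` and the sample log line are just hypothetical examples of what I dropped in, not the exact files):

```shell
# Recreate the folder layout, relative to the agent's install directory
mkdir -p path/to/my/logs

# Add a minimal ECS-formatted JSON log line (hypothetical sample content)
echo '{"@timestamp":"2021-01-25T12:00:00.000Z","log.level":"info","message":"test event"}' \
  > path/to/my/logs/app.log
```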
However, after a few minutes I still did not see a data stream containing the log data (only metrics, which I later turned off).
I believe my error was in how I specified the path argument. How does the agent know which directory to scan once it's installed, and how do I specify the right path?
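My guess is that the paths may need to be absolute and include a glob; something like this is what I'd try next (the directory prefix here is a hypothetical example, not what I actually have configured):

```yaml
streams:
  - id: logfile-log.log
    data_stream:
      dataset: tbd
    paths:
      - /Users/myuser/elastic-agent/path/to/my/logs/*.log
```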
For reference, here's the policy I've created:
```yaml
id: 4d36f8c0-5d05-11eb-84d5-c3f10edecab1
revision: 3
outputs:
  default:
    type: elasticsearch
    hosts:
      - 'XXXXX'
agent:
  monitoring:
    enabled: false
    logs: false
    metrics: false
inputs:
  - id: f222cd70-5f33-11eb-84d5-c3f10edecab1
    name: python-logs-integration
    revision: 1
    type: logfile
    use_output: default
    meta:
      package:
        name: log
        version: 0.4.6
    data_stream:
      namespace: default
    streams:
      - id: logfile-log.log
        data_stream:
          dataset: tbd
        paths:
          - path/to/my/logs
fleet:
  kibana:
    protocol: https
    hosts:
      - XXXX
```