I think I read in the docs that you could just point at the folder to consume everything in it. But I did add the *.log pattern and we are still getting no logs, and the agent still shows as Unhealthy.
C:\inetpub\Hamilton\AccountManagement\Logs\*.log
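For reference, here is a minimal Filebeat-style sketch of what that path entry looks like with an explicit glob; the dataset name in the comment is illustrative, not the poster's actual value:

```yaml
# Sketch of a log input path entry: the paths list must include a
# file glob (e.g. *.log); a bare directory is not expanded.
- type: log
  paths:
    - C:\inetpub\Hamilton\AccountManagement\Logs\*.log
```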
Looking at the agent logs, I noticed the issue below, but I am not sure what it means:
"components": [
{
"id": "log-default",
"type": "log",
"status": "HEALTHY",
"message": "Healthy: communicating with pid '8520'",
"units": [
{
"id": "log-default-logfile-logs-691a5a32-64ce-4c02-9f5d-d479cd5bb856",
"type": "input",
"status": "FAILED",
"message": "[failed to reloading inputs: 2 errors: Error creating runner from config: Can only start an input when all related states are finished: {Id: native::1441792-2369445-1620868268, Finished: false, Fileinfo: &{19102023.log 32 {2945512199 31064694} {3402349450 31064729} {3402349450 31064729} 0 618885 0 0 {0 0} 1620868268 1441792 2369445 false}, Source: C:\\inetpub\\Hamilton\\AccountManagement\\Logs\\19102023.log, Offset: 1380633, Timestamp: 2023-10-19 16:41:18.6887885 -0400 EDT m=+21657.986057801, TTL: -1ns, Type: log, Meta: map[], FileStateOS: 1441792-2369445-1620868268}; Error creating runner from config: Can only start an input when all related states are finished: {Id: native::1703936-2369502-1620868268, Finished: false, Fileinfo: &{19102023.log 32 {2996491781 31064694} {3402619461 31064729} {3402619461 31064729} 0 908088 0 0 {0 0} 1620868268 1703936 2369502 false}, Source: C:\\inetpub\\Hamilton\\BookingManagement\\Logs\\19102023.log, Offset: 1934646, Timestamp: 2023-10-19 16:41:11.9932887 -0400 EDT m=+21651.290527201, TTL: -1ns, Type: log, Meta: map[], FileStateOS: 1703936-2369502-1620868268}]"
},
It seems we managed to fix that issue by updating the agent from version 8.7.0 to 8.10.4. Once this was done the agent came back as Healthy, but we were still not getting logs.
We also noticed the custom integration was using "-" as part of the dataset name; once that was fixed we started receiving logs.
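That matches the data stream naming scheme, where "-" separates the type, dataset, and namespace parts, so the dataset itself must not contain a hyphen. A small sketch of that check (the helper name and the simplified character rule are my own; Fleet enforces additional limits such as length and other reserved characters):

```python
import re

# Hypothetical helper: accepts only lowercase letters, digits,
# underscores, and dots, and rejects hyphens, since "-" delimits
# the type-dataset-namespace parts of a data stream name.
DATASET_RE = re.compile(r"^[a-z0-9_.]+$")

def is_valid_dataset(name: str) -> bool:
    """Return True if `name` looks like an acceptable dataset name."""
    return bool(DATASET_RE.match(name)) and not name.startswith("_")

print(is_valid_dataset("accountmanagement.logs"))  # hyphen-free: accepted
print(is_valid_dataset("account-management"))      # contains '-': rejected
```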
Now we are trying to filter the logs stream by the dataset name configured in the custom integration, but it is not showing up.
Appreciate any suggestions you could make on that part.
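One thing worth trying: even when a dataset does not appear in a filter dropdown, you can usually still filter with a KQL query on the `data_stream.dataset` field (the dataset value below is a placeholder for whatever was configured in the integration):

```
data_stream.dataset : "accountmanagement.logs"
```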
Not sure what that means; I would suggest opening a new topic with a complete description.
I am curious: where did you see that in the docs? That did not work for me. I always have to provide a file pattern such as *.*, *.log, or *; I tried just putting the folder/directory and it did not work.
Will create a new thread. Thanks for following up @stephenb, appreciate the help!
Regarding the *.log part: funny, I didn't actually read that in the docs. It was a comment you left in another topic a while ago, hehe. Maybe I missed something and it doesn't apply to the custom logs integration.