How to read logs in subfolders of a main folder with Filebeat

Hi,
I have one main folder, and inside it I have 2 subfolders. I am able to read the data from both folders with Filebeat, but how can I show which log belongs to which subfolder?

@devdev7711 when the logs are ingested via Filebeat, a field called log.file.path is added as metadata. That field tells you the absolute location of the log file from which the event was read and ingested.

This field is defined by ECS, which every component of the Elastic Stack conforms to: ECS fields | Filebeat Reference [8.6] | Elastic
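For reference, a single Filebeat input with a wildcard in the path is enough to pick up every subfolder. A minimal filebeat.yml sketch, assuming the Windows layout seen later in this thread (the folder names are illustrative):

```yaml
filebeat.inputs:
  # The single * matches each user subfolder under Logs
  - type: filestream
    id: user-logs
    paths:
      - 'C:\ProgramData\FD\SES\Logs\*\*.log'
```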

"log.file.path": [
"C:\ProgramData\FD\SES\Logs\user1\101.log" 10 count
"log.file.path": [
"C:\ProgramData\FD\SES\Logs\user1\102.log" 20 count
"log.file.path": [
"C:\ProgramData\FD\SES\Logs\user2\101.log" 10 count
"log.file.path": [
"C:\ProgramData\FD\SES\Logs\user1\102.log" 50 count

I am getting the log files as shown above.
In the above example I have 30 logs for user1 and 60 logs for user2.
I want to aggregate and show them per user; currently it is showing 4 graphs.
How can we specify the aggregation group per user?

Can you please be clear about what exactly you are trying to achieve? From the initial description, it seemed you were only interested in knowing which log entry comes from which file. But now the description is more about visualizing the data based on a different field altogether.

From the log file path I am getting data per user folder, and I want a pie chart like: user1 count, user2 count, user3 count, and so on. How can we make an aggregation specific to that?
I am making one pie chart and want to show user1 with a count of 30 and user2 with a count of 60.

OK, I don't quite understand why the log file path was your concern when you want a pie chart visualization based on some field.
About the pie chart: I believe you cannot have min_doc_count functionality there without additional Kibana plugins like Vega.

Our requirement is to show the log count per user, i.e. how many logs each user has. In our project we receive the log data inside folders named after the user.

OK, for the pie chart, use Count as the metric and split the buckets on the user field.
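For context, splitting the buckets on a field is a terms aggregation under the hood. A sketch of the equivalent query for Kibana Dev Tools, assuming a keyword field named userID and the default filebeat-* indices:

```
POST filebeat-*/_search
{
  "size": 0,
  "aggs": {
    "logs_per_user": {
      "terms": { "field": "userID" }
    }
  }
}
```

This returns one bucket per user with its document count, which is exactly what the pie chart renders.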

I do not have a user value in the data; we need to extract it from the log.file.path value so we can get user1, user2, and so on. How can we extract this?
I am getting data in this form:
"log.file.path": [
"C:\ProgramData\UNO\SES\Logs\vs00776071\UnoSES25012023.log"
],
"log.offset": [
0
],
"message": [
"25-Jan-2023 14:12:56 INFO : BOT license verification successful."
],
"suricata.eve.timestamp": [
"2023-01-25T08:43:04.381Z"

Can we create a new field and store the username (like user1) as the data comes in?

For a smaller use case, you can use the add_fields processor of Filebeat and, based on the log file path, set the value of a userID field. But imagine you have thousands of users with logs being shipped for all of them; this approach will become cumbersome and unmanageable.
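For illustration, a sketch of that approach in filebeat.yml, assuming the user1/user2 folders from the earlier example; note it needs one conditional block per user, which is why it does not scale:

```yaml
processors:
  # One add_fields block per user: set userID when the path contains that folder
  - add_fields:
      when:
        contains:
          log.file.path: '\user1\'
      target: ''
      fields:
        userID: user1
  - add_fields:
      when:
        contains:
          log.file.path: '\user2\'
      target: ''
      fields:
        userID: user2
```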

One option would be to store the log file for each user under a /user${ID} directory; a userID field can then be added from Logstash based on a log.file.path substring (probably gsub is required, but I'm not sure).
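A sketch of that Logstash filter, using grok instead of gsub to capture the folder name that follows Logs\ (the userID field name is an assumption, and this relies on the default behavior where backslashes in config strings are passed through literally):

```
filter {
  # Capture the path segment after "Logs\" into userID
  grok {
    match => { "[log][file][path]" => "Logs\\(?<userID>[^\\]+)\\" }
  }
}
```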

Another way would be to create an ingest pipeline with a script processor: take the substring of log.file.path so that only the user${ID} part is kept, as per approach #1, add a new field via ctx with the retrieved value, and create the pie chart based on that field.
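A sketch of such a pipeline for Kibana Dev Tools; the pipeline name extract-user-id and the field name userID are assumptions. It stores the next-to-last path segment (the user folder) on the document:

```
PUT _ingest/pipeline/extract-user-id
{
  "processors": [
    {
      "script": {
        "description": "Store the parent folder name of the log file as userID",
        "lang": "painless",
        "source": """
          String p = ctx.log?.file?.path;
          if (p != null) {
            // Split on backslashes; the next-to-last segment is the user folder
            String[] parts = p.splitOnToken('\\');
            if (parts.length >= 2) {
              ctx.userID = parts[parts.length - 2];
            }
          }
        """
      }
    }
  ]
}
```

Attach it via the pipeline option of Filebeat's Elasticsearch output (or index.default_pipeline on the index), map userID as keyword, and the pie chart can split on it.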

Is it possible to extract the user ID from the log path value and store it in a new field?
Somewhere I heard about a tokenizer.

Yes, it's possible using the dissect tokenizer: Dissect strings | Filebeat Reference [8.6] | Elastic
Or you can use an ingest pipeline to grab the substring of the log file path and add another field using the script processor (as sketched above): Script processor | Elasticsearch Guide [master] | Elastic
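For the dissect route, a sketch of the processor in filebeat.yml, based on the path from your sample (the tokenizer must match your literal folder layout, so adjust it accordingly):

```yaml
processors:
  - dissect:
      # The user folder becomes userID, the file name becomes filename
      tokenizer: 'C:\ProgramData\UNO\SES\Logs\%{userID}\%{filename}'
      field: log.file.path
      target_prefix: ''
```

With the sample above, userID comes out as vs00776071, and the pie chart can then split on that field.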

For the dissect string, which file and which configuration do we need to change, and what setting exactly? Can you please suggest what we need to do?

@Ayush_Mathur how can we implement this as per your suggestion?
Can you please suggest step by step?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.