I want to ingest multiple files of the same category from a directory into a single index in Kibana with an Elasticsearch backend. I noticed it can be done using Logstash, but there is no clear documentation on how to do it using Elastic Agent and ingest pipelines. Any ideas or tips would be appreciated. Thanks.
How about FSCrawler?
@cheshirecat Is there no built-in solution for this with Elastic Agent and ingest pipelines?
Perhaps take a look at the Custom Logs integration: add each of the log paths, an ingest pipeline setting, etc.
You might need to create a top-level pipeline that then calls the more specific pipelines to process each type, much like if/else processing.
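The "pipeline of pipelines" idea above can be sketched with Elasticsearch's `pipeline` processor and its `if` condition. The pipeline and field names here (`route-logs`, `app-logs`, `access-logs`) are hypothetical; adjust them to your own datasets.

```json
PUT _ingest/pipeline/route-logs
{
  "description": "Top-level pipeline that routes documents to type-specific pipelines (illustrative names)",
  "processors": [
    {
      "pipeline": {
        "name": "app-logs",
        "if": "ctx.log?.file?.path != null && ctx.log.file.path.contains('app')"
      }
    },
    {
      "pipeline": {
        "name": "access-logs",
        "if": "ctx.log?.file?.path != null && ctx.log.file.path.contains('access')"
      }
    }
  ]
}
```

Each `if` condition is a Painless expression evaluated per document; documents matching neither condition pass through unchanged, which gives you the if/else-style routing without duplicating processors in every pipeline.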
@stephenb Yes, I think I should look into this. Maybe it's time for the Elastic team to write documentation about this.
I don't know about that side of things - I use FSCrawler for ingesting many files into indices.
You could also look at
@stephenb Thanks, but the above link is about adding a new field to each document. What I am looking for is ingesting hundreds of log files from a specified directory into a single index.
The screenshot I showed above using Custom Logs can accomplish that ... you can use wildcards in the log path.
I am not sure what you mean by concatenating ... but reading many log files into a single index / index pattern is a very common use case.
@stephenb Yes, I am playing with the Custom Logs integration. I think /path/to/logs/*.log will work. Thanks for the reply.
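For reference, the Custom Logs integration settings roughly correspond to a Filebeat-style log input. A minimal sketch of the equivalent configuration, assuming a hypothetical dataset name `mylogs` and the `route-logs` pipeline name used purely for illustration:

```yaml
# Sketch only: in the Custom Logs integration UI these map to the
# "Log file path", dataset, and custom configuration fields.
paths:
  - /path/to/logs/*.log   # wildcard matches every .log file in the directory
data_stream:
  dataset: mylogs         # all matched files land in the same data stream / index
pipeline: route-logs      # optional ingest pipeline applied to each document
```

Because every path matched by the wildcard writes to the same data stream, the hundreds of files end up queryable under one index pattern without any manual concatenation.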
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.