So far the samples I have done have all used CSV files sourced into Elasticsearch through Logstash.
If I want to read IIS logs for specific errors, is there a plugin in the ELK stack, or any other tool, that scans the IIS logs for the desired fields and generates a CSV file?
Assuming you mean either the access or error logs, and not the FREB logs, I would ingest the entire log with Filebeat. You could then use Logstash with an elasticsearch input and a csv output, with whatever filters you want in the middle.
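As a rough sketch of that second stage, a pipeline like the one below would pull documents back out of Elasticsearch and write selected fields to CSV. The index name `iis-logs`, the `sc-status` error query, and the field names are all assumptions here; substitute whatever your Filebeat ingest actually produced.

```conf
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "iis-logs"   # hypothetical index name
    # only pull documents whose HTTP status is 500 or above
    query => '{ "query": { "range": { "sc-status": { "gte": 500 } } } }'
  }
}
output {
  csv {
    path   => "/tmp/iis-errors.csv"
    fields => ["@timestamp", "cs-uri-stem", "sc-status"]
  }
}
```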
You might want to explain the use case a little more, because I think @warkolm and I are reading different use cases into what you wrote.
I realized that we can apply filters using Logstash and pull only the fields that we need.
As part of this learning process with ELK, I want to ingest Jenkins logs (continuous integration) and show on a dashboard the successful/failed jobs that ran during the previous night. I have at least 1500 jobs that run every night.
In the practice samples, I only ingested data from CSV files (from kaggle.com), and I wrote a .config (pipeline) file to pull that data. It has been simple so far.
But when I think about pulling data from log files that are not structured like CSV files, I wondered whether there is a tool to convert the desired fields into CSV format. Now I realize that Logstash can apply filters in combination with regular expressions. Let me read more about Logstash, and I will post again if I need help.
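For unstructured lines, the usual Logstash approach is a grok filter, which matches named patterns against each line and turns the matches into fields. A minimal sketch, assuming a hypothetical Jenkins-style line such as `2024-05-01 02:13:45 INFO Finished: SUCCESS job=nightly-build-42` (your actual log layout and field names will differ):

```conf
filter {
  grok {
    # extract timestamp, log level, build result, and job name as fields
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} Finished: %{WORD:build_result} job=%{NOTSPACE:job_name}"
    }
  }
}
```

Once `build_result` is a field, a Kibana visualization can count SUCCESS vs FAILURE per night directly, without any CSV step in between.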