I have been using ELK for the past few weeks. It is working very well with my log data, and I am able to update it dynamically as well. Now I have some JSON data that is generated every hour as a JSON file. I wanted to know how I can upload these JSON files, and whether it is possible to have them updated every hour. If yes, can you please explain how I should approach this problem? Thanks in advance.
Hi @tylersmalley, thanks for the response. I wanted to add one thing: the new JSON files are generated every hour, and we are not overwriting the old file with the new one; we want every JSON file to be pushed to Elasticsearch. Also, every hour when a new JSON file is generated, it should get indexed there as well. Is this possible? Will the options you provided work for this scenario?
Also, if I use Filebeat, how do I filter the data? We have to write the filter in Logstash, right? I tried with Logstash, but my logstash.conf was unable to read the data from the JSON files. A rough sketch of the kind of config I have been trying is below.
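For context, this is roughly the shape of pipeline config I have been experimenting with (the paths, index name, and sincedb location are placeholders, not my actual values):

```
input {
  file {
    # Glob so each new hourly file dropped into the directory is picked up
    path => "/data/json/*.json"
    # Read existing files from the start, not just new lines
    start_position => "beginning"
    # Track read position per file (placeholder location)
    sincedb_path => "/tmp/sincedb_json"
    # Parse each line as a JSON document
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "hourly-json-%{+YYYY.MM.dd}"
  }
}
```

One thing I am not sure about: the file input is line-oriented, so if the hourly files are pretty-printed JSON spanning multiple lines rather than one JSON object per line, the json codec alone may not parse them, and that might be why my config could not read them.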