Load JSON data to Kibana dynamically

Hi all,

I have been using ELK for the past few weeks. It is working very well with my log data, and I am able to update that dynamically. Now I have some JSON data that is generated every hour as a JSON file. I would like to know how I can upload these JSON files, and I also want the data to be updated every hour. Is that possible? If so, can you please explain how I should approach this problem? Thanks in advance.

I think there are a couple of options, if I understand your ask correctly.

You will want to create an index and ingest these files into Elasticsearch; that is where all the data Kibana uses is stored.

You can use Filebeat to monitor these files; its log input supports JSON: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html#filebeat-input-log-config-json
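As a rough sketch, a filebeat.yml along these lines should work, assuming each file contains one JSON object per line (which the json options of the log input require); the paths below are placeholders:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/data/hourly/*.json   # placeholder; point this at wherever the hourly files are written
    json.keys_under_root: true    # lift the parsed JSON fields to the top level of the event
    json.add_error_key: true      # add error.message to events whose lines fail to parse

output.elasticsearch:
  hosts: ["localhost:9200"]       # placeholder host
```

Filebeat keeps a registry of what it has already read, so each new hourly file matching the glob is picked up and shipped automatically.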

Or, instead of writing a JSON file, just index the data directly into Elasticsearch: https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-index_.html
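A minimal sketch of that with the Python client, assuming the 8.x client API (the host, index name, and file path are placeholders):

```python
import json
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder host

# Read one hourly JSON file and index its contents.
# Assumes the file holds a single JSON object; adapt if it is an array of objects.
with open("/var/data/hourly/latest.json") as f:  # placeholder path
    doc = json.load(f)

# Elasticsearch creates the "hourly-data" index automatically on the first write.
es.index(index="hourly-data", document=doc)
```

You could run something like this each time a new file is produced (for example from the same job that writes the file, or a cron entry), so every hourly file ends up in Elasticsearch.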

Hi @tylersmalley, thanks for the response. I wanted to add one thing: a new JSON file is generated every hour, and it does not overwrite the old one; we want every JSON file to be pushed to Elasticsearch. Also, every hour when a new JSON file is generated, it should get indexed there as well. Is this possible? Will the options you provided work for this scenario?

Also, if I use Filebeat, how do I filter the data? Do we have to write the filter in Logstash? I tried with Logstash, but my logstash.conf was unable to read the data from the JSON files.
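Is something along these lines the right shape for it? (The paths and index name below are only placeholders, and I am assuming each file contains one JSON object per line.)

```
input {
  file {
    path => "/var/data/hourly/*.json"                  # placeholder glob matching the hourly files
    start_position => "beginning"                      # read existing files from the start
    sincedb_path => "/var/lib/logstash/sincedb_hourly" # placeholder; tracks which files were already read
    codec => "json"                                    # parse each line as a JSON event
  }
}

filter {
  # example filter only: drop or reshape fields here
  mutate {
    remove_field => ["host"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "hourly-json-%{+YYYY.MM.dd}"
  }
}
```

If each file is one big pretty-printed JSON document rather than one object per line, I guess the line-oriented file input would not parse it, which may be why my attempt failed.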

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.