Noob Alert: Trying to use FileBeat to store my JSON file into Elasticsearch

Hi, I am a noob in ELK.
My use case is this: I have created a JSON file named data.json, which looks like this:
{"clusterVirtutalCores": {"totalvc": 1, "reservedvc": 0, "allocatedvc": 0, "availablevc": 1}, "Capacity": {"capacityRemaining": 43801, "capacityUsed": 25000, "capacityTotal": 100000}, "numLiveDataNode": 1, "maintenancedataNode": {"nuEnteringnmaintenancedataNode": 0, "numInmaintenanceDeaddataNode": 0, "numInmaintenanceLivedataNode": 0}, "timestamp": "2018-01-01T18:05:49.635551", "Hostname": "HDFS", "numDeadDataNode": 0, "nodeInsService": 1, "fileTotal": 560, "clustermemory": {"totalMB": 2048, "reservedMB": 520, "availableMB": 520, "allocatedMB": 520}, "datanode": {"availableMB": 2048, "usedmemoryMB": 2048, "Hostname": "hdfs-quickstart", "usedVC": 1, "availableVC": 1, "state": "running", "ID": "hdfs-quickstart"}}
The data is fake.

So I want to use Filebeat to read the JSON file and store it in Elasticsearch.
I read around on the internet and came up with this Filebeat configuration:

Filebeat 6.1.1:
filebeat.prospectors:
- input_type: log
  paths:
    - /home/internship/Desktop/workspace/logs/data.json
  json.keys_under_root: true
  json.add_error_key: true

output.elasticsearch:
  hosts: ["http://localhost:9200"]

But I am unsure whether it is even doing anything. I checked Kibana, but it does not show any entries. What can I do to solve my problem?

Have you checked filebeat logs?

Filebeat requires the log file to contain one JSON document per line, and each document must be terminated by a newline.
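If your data.json was written pretty-printed (or without a trailing newline), one way to fix it is to re-serialize it as a single newline-terminated line. Here is a minimal sketch; the `to_ndjson` helper and the output filename data.ndjson are my own choices for illustration, not part of Filebeat:

```python
import json

def to_ndjson(src_path, dst_path):
    """Rewrite a JSON file as one document per line, newline-terminated,
    which is the shape Filebeat's json.* options expect."""
    with open(src_path) as src:
        doc = json.load(src)               # parse the whole file as one document
    with open(dst_path, "w") as dst:
        dst.write(json.dumps(doc) + "\n")  # single line plus the required newline
```

For example, `to_ndjson("data.json", "data.ndjson")`, then point the prospector's `paths` at the new file. As a bonus, `json.load` will raise an error if the file is not valid JSON (e.g. an unquoted string value), which is worth checking before blaming Filebeat.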

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.