Filebeat is not sending JSON data to Elasticsearch

I am trying to send a set of JSON data (basically logs in JSON format) to an Elasticsearch instance using Filebeat. Here is my config file:

filebeat.prospectors:
- paths:
   - ~/PIE/Logs/*.json
  input_type: log
  json.keys_under_root: true
  json.add_error_key: true

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  template.name: "filebeat"
  template.path: "filebeat.template.json" 

One of the records within the file is shown below. All the records are on separate lines:

{"instance_id_str":"0","source_id_str":"APP/PROC/WEB","app_name_str":"App1","message":"hello","type":"syslog","event_uuid":"883d774f-02db-4967-9b89-eeff73890b09","origin_str":"rep","ALCH_TENANT_ID":"3213cd20-63cc-4592-b3ee-6a204769ce16","logmet_cluster":"topic3-elasticsearch_3","org_name_str":"Org1","@timestamp":"2017-09-29T02:40:50.519Z","message_type_str":"OUT","@version":"1","space_name_str":"prod","application_id_str":"3104b522-aba8-48e0-aef6-6291fc6f9250","ALCH_ACCOUNT_ID_str":"","org_id_str":"d728d5da-5346-4614-b092-e17be0f9b820","timestamp":"2017-09-29T02:40:50.519Z"}

I ran the config test:

sudo ./filebeat -configtest -e

The output looks OK:

2017/10/11 22:12:13.895468 beat.go:297: INFO Home path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64] Config path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64] Data path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/data] Logs path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/logs]
2017/10/11 22:12:13.895783 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.3
2017/10/11 22:12:13.895821 metrics.go:23: INFO Metrics logging every 30s
2017/10/11 22:12:13.895797 processor.go:44: DBG  Processors: 
2017/10/11 22:12:13.896051 beat.go:198: DBG  Initializing output plugins
2017/10/11 22:12:13.896508 output.go:258: INFO Loading template enabled. Reading template file: /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/filebeat.template.json
2017/10/11 22:12:13.898453 output.go:269: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/filebeat.template-es2x.json
2017/10/11 22:12:13.899530 output.go:281: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/filebeat.template-es6x.json
2017/10/11 22:12:13.900029 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/11 22:12:13.900388 outputs.go:108: INFO Activated elasticsearch as output plugin.
2017/10/11 22:12:13.900400 publish.go:243: DBG  Create output worker
2017/10/11 22:12:13.900447 publish.go:285: DBG  No output is defined to store the topology. The server fields might not be filled.
2017/10/11 22:12:13.900463 publish.go:300: INFO Publisher name: Tathas-Mac.local
2017/10/11 22:12:13.900533 async.go:63: INFO Flush Interval set to: 1s
2017/10/11 22:12:13.900540 async.go:64: INFO Max Bulk Size set to: 50
2017/10/11 22:12:13.900546 async.go:72: DBG  create bulk processing worker (interval=1s, bulk size=50)
Config OK

Yet when I run the service:

sudo ./filebeat -e -c filebeat.yml -d "publish"

The Filebeat index is not created in my local Elasticsearch instance, and the files under ~/PIE/Logs/ are not processed.
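For reference, this is how I am checking whether the index exists (assuming the default filebeat-* index name, since I have not set output.elasticsearch.index):

curl 'http://localhost:9200/_cat/indices/filebeat-*?v'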

Can someone please guide me in figuring out the problem?

Thanks,
Tatha

Do you have some Filebeat log output?

Thanks, Steffens, for looking into it. Here is the log output.

I have installed all three products (Elasticsearch, Kibana, and Filebeat) directly and am not using Docker containers. There is also no security in place.

Tathas-Mac:filebeat-5.6.3-darwin-x86_64 taroy$ sudo ./filebeat -e -c filebeat.yml -d "publish"
2017/10/11 22:29:13.029341 beat.go:297: INFO Home path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64] Config path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64] Data path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/data] Logs path: [/Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/logs]
2017/10/11 22:29:13.029418 metrics.go:23: INFO Metrics logging every 30s
2017/10/11 22:29:13.030011 beat.go:192: INFO Setup Beat: filebeat; Version: 5.6.3
2017/10/11 22:29:13.030470 output.go:258: INFO Loading template enabled. Reading template file: /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/filebeat.template.json
2017/10/11 22:29:13.033106 output.go:269: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/filebeat.template-es2x.json
2017/10/11 22:29:13.033661 output.go:281: INFO Loading template enabled for Elasticsearch 6.x. Reading template file: /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/filebeat.template-es6x.json
2017/10/11 22:29:13.035191 client.go:128: INFO Elasticsearch url: http://localhost:9200
2017/10/11 22:29:13.035229 outputs.go:108: INFO Activated elasticsearch as output plugin.
2017/10/11 22:29:13.035237 publish.go:243: DBG  Create output worker
2017/10/11 22:29:13.035311 publish.go:285: DBG  No output is defined to store the topology. The server fields might not be filled.
2017/10/11 22:29:13.035329 publish.go:300: INFO Publisher name: Tathas-Mac.local
2017/10/11 22:29:13.035404 async.go:63: INFO Flush Interval set to: 1s
2017/10/11 22:29:13.035412 async.go:64: INFO Max Bulk Size set to: 50
2017/10/11 22:29:13.035419 async.go:72: DBG  create bulk processing worker (interval=1s, bulk size=50)
2017/10/11 22:29:13.036425 beat.go:233: INFO filebeat start running.
2017/10/11 22:29:13.036454 registrar.go:85: INFO Registry file set to: /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/data/registry
2017/10/11 22:29:13.036489 registrar.go:106: INFO Loading registrar data from /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/data/registry
2017/10/11 22:29:13.036515 registrar.go:123: INFO States Loaded from registrar: 0
2017/10/11 22:29:13.036546 crawler.go:38: INFO Loading Prospectors: 1
2017/10/11 22:29:13.036655 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/10/11 22:29:13.037023 registrar.go:236: INFO Starting Registrar
2017/10/11 22:29:13.036655 prospector_log.go:65: INFO Prospector with previous states loaded: 0
2017/10/11 22:29:13.036595 sync.go:41: INFO Start sending events to output
2017/10/11 22:29:13.037370 prospector.go:124: INFO Starting prospector of type: log; id: 11576210144221994408 
2017/10/11 22:29:13.037385 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2017/10/11 22:29:43.031317 metrics.go:34: INFO No non-zero metrics in the last 30s
2017/10/11 22:30:13.030285 metrics.go:34: INFO No non-zero metrics in the last 30s

No file is really picked up here. Is the file actually listed in the registry file? Does it work if you use absolute paths to the log files (replace ~ with the actual path)?
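To check the registry, you can dump the file mentioned in your log output (it is plain JSON; the path below is taken from the log lines above):

cat /Users/taroy/Software/filebeat-5.6.3-darwin-x86_64/data/registry

An empty list there should mean no files have been picked up yet, which matches the "States Loaded from registrar: 0" line in your log.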

Thanks for the input. Yes, that was the problem.

Once I gave the absolute path /Users/taroy/.... it was able to pick up the file.

Apologies for wasting your time on such a basic problem.
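For anyone finding this later, the prospector section that worked looks roughly like this (assuming ~ expands to /Users/taroy, which matches the home directory shown in the log output above):

filebeat.prospectors:
- paths:
    - /Users/taroy/PIE/Logs/*.json
  input_type: log
  json.keys_under_root: true
  json.add_error_key: true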
