Filebeat index is not being created in Elasticsearch

I am using the ELK stack, version 5.2.

I have changed the output to Elasticsearch in filebeat.yml, but I don't see any index in Elasticsearch or Kibana.
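For reference, the relevant parts of my filebeat.yml look roughly like this (this is a sketch; the host/port and log path are taken from the log output below):

```yaml
# filebeat.yml (sketch) — prospector and Elasticsearch output
filebeat.prospectors:
  - input_type: log
    paths:
      - D:\Team\Sujith\logs\EDSPlugin.log

output.elasticsearch:
  # URL matches "Elasticsearch url" in the Filebeat log below
  hosts: ["http://10.209.68.81:9201"]
```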

I want to ship the log file from machine A to machine B (the ELK server), have the logs stored there, and display them in Kibana. Can anyone help me solve this issue?

I started Filebeat using the command ./filebeat -e -c filebeat.yml -d "publish". I can see it is not publishing events; please find the log below.

PS C:\Program Files\cURL\bin> cd d:
PS D:\ELK-Stack\filebeat-5.1.1-windows-x86_64> .\filebeat.exe -e -c filebeat.yml -d "publish"
2017/03/01 11:25:36.812007 beat.go:267: INFO Home path: [D:\ELK-Stack\filebeat-5.1.1-windows-x86_64] Config path: [D:\ELK-Stack\filebeat-5.1.1-windows-x86_64] Data path: [D:\ELK-Stack\filebeat-5.1.1-windows-x86_64\data] Logs path: [D:\ELK-Stack\filebeat-5.1.1-windows-x86_64\logs]
2017/03/01 11:25:36.816010 beat.go:177: INFO Setup Beat: filebeat; Version: 5.1.1
2017/03/01 11:25:36.812007 logp.go:219: INFO Metrics logging every 30s
2017/03/01 11:25:36.816010 output.go:167: INFO Loading template enabled. Reading template file: D:\ELK-Stack\filebeat-5.1.1-windows-x86_64\filebeat.template.json
2017/03/01 11:25:36.824011 output.go:178: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: D:\ELK-Stack\filebeat-5.1.1-windows-x86_64\filebeat.template-es2x.json
2017/03/01 11:25:36.846012 client.go:120: INFO Elasticsearch url: http://10.209.68.81:9201
2017/03/01 11:25:36.848011 outputs.go:106: INFO Activated elasticsearch as output plugin.
2017/03/01 11:25:36.852011 publish.go:234: DBG Create output worker
2017/03/01 11:25:36.852011 publish.go:276: DBG No output is defined to store the topology. The server fields might not be filled.
2017/03/01 11:25:36.853014 publish.go:291: INFO Publisher name: BNGWIDAP107
2017/03/01 11:25:36.860013 async.go:63: INFO Flush Interval set to: 1s
2017/03/01 11:25:36.861014 async.go:64: INFO Max Bulk Size set to: 50
2017/03/01 11:25:36.861014 async.go:72: DBG create bulk processing worker (interval=1s, bulk size=50)
2017/03/01 11:25:36.861014 beat.go:207: INFO filebeat start running.
2017/03/01 11:25:36.862012 registrar.go:85: INFO Registry file set to: D:\ELK-Stack\filebeat-5.1.1-windows-x86_64\data\registry
2017/03/01 11:25:36.862012 registrar.go:106: INFO Loading registrar data from D:\ELK-Stack\filebeat-5.1.1-windows-x86_64\data\registry
2017/03/01 11:25:36.863011 registrar.go:131: INFO States Loaded from registrar: 0
2017/03/01 11:25:36.863011 crawler.go:34: INFO Loading Prospectors: 1
2017/03/01 11:25:36.863011 prospector_log.go:57: INFO Prospector with previous states loaded: 0
2017/03/01 11:25:36.863011 crawler.go:46: INFO Loading Prospectors completed. Number of prospectors: 1
2017/03/01 11:25:36.864012 crawler.go:61: INFO All prospectors are initialised and running with 0 states to persist
2017/03/01 11:25:36.864012 registrar.go:230: INFO Starting Registrar
2017/03/01 11:25:36.864012 sync.go:41: INFO Start sending events to output
2017/03/01 11:25:36.864012 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/03/01 11:25:36.864012 prospector.go:111: INFO Starting prospector of type: log
2017/03/01 11:25:36.866013 log.go:84: INFO Harvester started for file: D:\Team\Sujith\logs\EDSPlugin.log
2017/03/01 11:25:41.864338 client.go:184: DBG Publish: {
"@timestamp": "2017-03-01T11:25:36.867Z",
"beat": {
"hostname": "BNGWIDAP107",
"name": "BNGWIDAP107",
"version": "5.1.1"
},
"input_type": "log",
"message": "2017-02-27 18:27:41 ERROR - Class :'OpenItemFilter',Method :'null',Message :Exception -- null",
"offset": 100,
"source": "D:\Team\Sujith\logs\EDSPlugin.log",
"type": "log"
}
2017/03/01 11:25:41.867336 client.go:184: DBG Publish: {
"@timestamp": "2017-03-01T11:25:36.867Z",
"beat": {
"hostname": "BNGWIDAP107",
"name": "BNGWIDAP107",
"version": "5.1.1"
},
"input_type": "log",
"message": "2017-02-27 18:27:41 ERROR - Class :'OpenItemFilter',Method :'null',Message :Exception Details ---",
"offset": 203,
"source": "D:\Team\Sujith\logs\EDSPlugin.log",
"type": "log"
}
2017/03/01 11:25:41.875337 client.go:184: DBG Publish: {
"@timestamp": "2017-03-01T11:25:36.867Z",
"beat": {
"hostname": "BNGWIDAP107",
"name": "BNGWIDAP107",
"version": "5.1.1"
},
2017/03/01 11:25:42.516372 output.go:109: DBG output worker: publish 50 events
2017/03/01 11:25:42.522369 single.go:140: ERR Connecting error publishing events (retrying): 401 Unauthorized
2017/03/01 11:25:43.528432 single.go:140: ERR Connecting error publishing events (retrying): 401 Unauthorized
2017/03/01 11:25:45.811579 single.go:140: ERR Connecting error publishing events (retrying): 401 Unauthorized
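The repeated "401 Unauthorized" errors at the end suggest that Elasticsearch is rejecting the connection because no credentials are being sent (for example, if X-Pack security is enabled on the cluster, or a proxy in front of it requires authentication). If that is the case, I believe the credentials would go in the output section, roughly like this (the username and password below are placeholders, not my actual values):

```yaml
# Sketch: adding credentials to the Elasticsearch output,
# assuming the cluster requires authentication (hence the 401s)
output.elasticsearch:
  hosts: ["http://10.209.68.81:9201"]
  username: "elastic"    # placeholder — replace with a real user
  password: "changeme"   # placeholder — replace with the real password
```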

Duplicate. Please don't open multiple topics discussing the very same issue.

Sorry about that. I did not get a response yesterday, hence I thought of posting again so that I could get help from one of the colleagues here.

Sorry again, Steffens.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.