We are working to ship logs from a server to Elasticsearch using Filebeat. We are getting the issue below while running the Filebeat setup.
Response: {"statusCode":413,"error":"Request Entity Too Large","message":"Payload content length greater than maximum allowed: 1048576"}
We have added http.max_content_length: 1200mb in elasticsearch.yml and server.maxPayloadBytes: 26214400 in kibana.yml, but the issue is still there.
How can we resolve this issue?
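For reference, this is roughly how those two settings were added (a minimal sketch showing only the relevant lines; we are assuming here that they belong at the top level of each config file):

# elasticsearch.yml
http.max_content_length: 1200mb

# kibana.yml
server.maxPayloadBytes: 26214400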
Here are the YAML files we are using for Filebeat, Elasticsearch, and Kibana.
FileBeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - c:\logs\XXX*
- type: filestream
  enabled: false
  paths:
    - /var/log/*.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

setup.kibana:
  host: "0.0.0.0:0000"
  username: "XXXX"
  password: "XXXX"

output.elasticsearch:
  hosts: ["0.0.0.0:0000"]
  username: "XXXX"
  password: "XXXX"

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
ElasticSearch.yml
- module: elasticsearch
  server:
    enabled: true
  gc:
    enabled: true
  audit:
    enabled: true
  slowlog:
    enabled: true
  deprecation:
    enabled: true
  access:
    var.paths: ["/var/logs/XXX/XXX.log"]

http.max_content_length: 1200mb
Kibana.yml
- module: kibana
  log:
    enabled: true
  audit:
    enabled: true

server.maxPayloadBytes: 26214400