Hello,
I'm having trouble migrating from the container/log input to filestream.
Can you give me a hand, please?
As soon as I roll out the new configuration I stop receiving logs in Kibana.
Here is my current configuration, which works but seems to be deprecated (I'm using a Helm installation of Filebeat as a DaemonSet in Kubernetes).
Current:
filebeatConfig:
  filebeat.yml: |
    logging.level: ${debug_level}
    filebeat.autodiscover:
      providers:
        - type: kubernetes
          node: $${NODE_NAME}
          hints.enabled: true
          hints.default_config:
            type: container
            paths:
              - /var/log/containers/*$${data.kubernetes.container.id}.log
            exclude_lines: '^[[:space:]]*$'
            multiline.pattern: '^[[:space:]]+(at|\.{3})\b|^Caused by:'
            multiline.negate: false
            multiline.match: after
    processors:
      - add_cloud_metadata:
      - add_host_metadata:
      - decode_json_fields:
          fields: ["message"]
          target: "fb"
          overwrite_keys: true
      - add_fields:
          target: ''
          fields:
            gkeclustername: ${gke_cluster}
            environment: ${environment}
    cloud.id: '$${ELASTIC_CLOUD_ID}'
    cloud.auth: '$${ELASTIC_CLOUD_AUTH}'
    setup.template.enabled: true
    setup.ilm.rollover_alias: "${index_prefix}-%%{[agent.version]}"
    setup.template.name: "${index_prefix}-%%{[agent.version]}"
    setup.template.pattern: "${index_prefix}-%%{[agent.version]}*"
    setup.template.settings:
      index.number_of_shards: ${number_of_shards}
      index.number_of_replicas: ${number_of_replicas}
    output.elasticsearch:
      hosts: ['$${ELASTICSEARCH_HOST:elasticsearch}:$${ELASTICSEARCH_PORT:9200}']
      index: "${index_prefix}-%%{[agent.version]}"
New one (not working):
filebeatConfig:
  filebeat.yml: |
    logging.level: ${debug_level}
    filebeat.autodiscover:
      providers:
        - type: kubernetes
          node: $${NODE_NAME}
          hints.enabled: true
          hints.default_config:
            type: filestream
            id: container-log-$${data.kubernetes.pod.name}-$${data.kubernetes.container.id}
            paths:
              - /var/log/containers/*$${data.kubernetes.container.id}.log
            exclude_lines: '^[[:space:]]*$'
            multiline.pattern: '^[[:space:]]+(at|\.{3})\b|^Caused by:'
            multiline.negate: false
            multiline.match: after
    processors:
      - add_host_metadata:
      - decode_json_fields:
          fields: ["message"]
          target: "fb"
          overwrite_keys: true
          max_depth: 4
      - add_fields:
          target: ''
          fields:
            gkeclustername: ${gke_cluster}
            environment: ${environment}
    cloud.id: '$${ELASTIC_CLOUD_ID}'
    cloud.auth: '$${ELASTIC_CLOUD_AUTH}'
    setup.template.enabled: true
    setup.ilm.rollover_alias: "${index_prefix}-%%{[agent.version]}"
    setup.template.name: "${index_prefix}-%%{[agent.version]}"
    setup.template.pattern: "${index_prefix}-%%{[agent.version]}*"
    setup.template.settings:
      index.number_of_shards: ${number_of_shards}
      index.number_of_replicas: ${number_of_replicas}
    output.elasticsearch:
      hosts: ['$${ELASTICSEARCH_HOST:elasticsearch}:$${ELASTICSEARCH_PORT:9200}']
      index: "${index_prefix}-%%{[agent.version]}"
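From what I've read in the filestream docs, I suspect my `hints.default_config` is missing a few things: filestream does not follow symlinks by default (and `/var/log/containers/*.log` are symlinks), it does not parse the container JSON/CRI log format on its own, and `multiline` has to go under `parsers` instead of being a top-level option. Something like this untested sketch is what I think it should look like, but I'm not sure:

```yaml
hints.default_config:
  type: filestream
  id: container-log-$${data.kubernetes.pod.name}-$${data.kubernetes.container.id}
  paths:
    - /var/log/containers/*$${data.kubernetes.container.id}.log
  # filestream ignores symlinked files unless this is enabled
  prospector.scanner.symlinks: true
  parsers:
    # parse the Docker/CRI container log format (the old `container` input did this implicitly)
    - container: ~
    # with filestream, multiline is a parser, not a top-level option
    - multiline:
        type: pattern
        pattern: '^[[:space:]]+(at|\.{3})\b|^Caused by:'
        negate: false
        match: after
  exclude_lines: ['^[[:space:]]*$']
```

Is this roughly the right direction?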
I don't understand why the events are not being sent: I haven't seen any communication problems in the Filebeat logs, and on the Kibana side they aren't being written to any other index either; they simply never arrive.
Should I change anything?
Thank you very much