I'm using Functionbeat to export ECS logs to Elastic Cloud. Here is my configuration:
```yaml
functionbeat.provider.aws.deploy_bucket: "gatsby-deploy"
functionbeat.provider.aws.functions:
  - name: ProdServiceLogs
    enabled: true
    type: cloudwatch_logs
    description: "Lambda function for transferring 'prod' logs from Cloudwatch '/services/container/prod' to ELK stack index (prod-services)."
    triggers:
      - log_group_name: /services/container/prod

cloud.id: "XXXX"
cloud.auth: "XXXX"

output.elasticsearch.index: "prod-services-%{+yyyy.MM.dd}"
setup.template.name: "prod-services"
setup.template.pattern: "prod-services-*"
setup.ilm.rollover_alias: "prod-services"

processors:
  - if:
      regexp:
        message: "^{"
    then:
      - decode_json_fields:
          fields: ["message"]
          process_array: false
          max_depth: 3
          target: ""
          overwrite_keys: true
      - rename:
          fields:
            - from: "service"
              to: "service.name"
```
As part of the JSON log entry we have a field `user.username` which contains the customer's email. It gets populated correctly, but if I do a general search for "foo@bar.com" it doesn't show up. If I search with `user.username: "foo@bar.com"` instead, it shows up correctly.
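To make the difference concrete, here are the two kinds of queries I mean, written as Dev Tools requests (the index pattern and email are just the examples from above, and the free-text query is only an approximation of what the Kibana search bar sends):

```json
# Free-text search across all fields – this is the one that finds nothing
GET prod-services-*/_search
{
  "query": { "query_string": { "query": "\"foo@bar.com\"" } }
}

# Field-scoped search – this one matches as expected
GET prod-services-*/_search
{
  "query": { "match": { "user.username": "foo@bar.com" } }
}
```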
We have another service that uses Logstash to process similar logs, and there we get two fields: `user.username` and `user.username.keyword`. The general search works correctly in that case.
I noticed that Functionbeat makes `user.username` aggregatable, whereas in the Logstash version `user.username` is not aggregatable; only `user.username.keyword` is.
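For reference, the mapping difference I'm seeing looks roughly like this (a sketch based on checking `GET <index>/_mapping` on both indices; `ignore_above: 256` is the usual default and an assumption on my part):

```json
// Functionbeat index: keyword only – aggregatable, exact-match search
"user": {
  "properties": {
    "username": { "type": "keyword" }
  }
}

// Logstash index: analyzed text plus a .keyword multi-field
"user": {
  "properties": {
    "username": {
      "type": "text",
      "fields": {
        "keyword": { "type": "keyword", "ignore_above": 256 }
      }
    }
  }
}
```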
How can I make Functionbeat behave like Logstash, so that I get both fields and `user.username` is searchable with a general search?