Kibana version:
7.14.0
Elasticsearch version:
7.14.0
APM Server version:
7.14.0
Filebeat version:
7.14.0
APM Agent language and version:
Python Django - elastic-apm 6.3.3
Description:
The APM server and the Python agent are working as expected, and so is Filebeat. They are collecting logs and metrics, but I don't know how to correlate them. Live streaming in the Logs section works, and I can filter the APM instances by container.id.
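As far as I understand, correlation relies on the log documents carrying the same trace.id / transaction.id values as the APM documents. To check whether any of my log documents contain a given trace id at all, I run a small script like the one below (the index pattern and credentials match the configs further down; the trace id is just a placeholder copied from a transaction in the APM UI):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://192.168.100.100:9200", http_auth=("elastic", "password"))

# hypothetical trace.id copied from a transaction in the APM UI
trace_id = "0af7651916cd43dd8448eb211c80319c"

# look for log documents that carry the same trace.id as the APM transaction
resp = es.search(
    index="docker-*",
    body={"query": {"term": {"trace.id": trace_id}}, "size": 5},
)
print(resp["hits"]["total"])
for hit in resp["hits"]["hits"]:
    print(hit["_source"].get("message"))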
1. APM, Elasticsearch, Kibana configs
docker-compose.yml:
version: '3.3'

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.14.0
    hostname: elasticsearch
    environment:
      - ES_JAVA_OPTS=-Xms512m -Xmx512m
      - ELASTIC_PASSWORD=password
    ports:
      - 192.168.100.100:9200:9200
    volumes:
      - ./data:/usr/share/elasticsearch/data
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
    networks:
      - elk

  kibana:
    image: docker.elastic.co/kibana/kibana:7.14.0
    hostname: kibana
    restart: always
    ports:
      - 192.168.100.100:5601:5601
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml:ro
    networks:
      - elk

  apm:
    image: docker.elastic.co/apm/apm-server:7.14.0
    hostname: apm
    command: --strict.perms=false
    depends_on:
      - elasticsearch
    cap_add: ["CHOWN", "DAC_OVERRIDE", "SETGID", "SETUID"]
    cap_drop: ["ALL"]
    volumes:
      - ./apm-server.yml:/usr/share/apm-server/apm-server.yml
    ports:
      - 192.168.100.100:8200:8200
    networks:
      - elk

networks:
  elk:
    driver: bridge
apm-server.yml:
apm-server:
  host: "apm:8200"
  secret_token: token
  rum:
    enabled: true
  kibana:
    enabled: true
    host: "kibana:5601"
    protocol: "http"
    username: "elastic"
    password: "password"

setup.template.enabled: true
setup.template.name: "apm-%{[observer.version]}"
setup.template.pattern: "apm-%{[observer.version]}-*"
setup.template.fields: "${path.config}/fields.yml"
setup.template.overwrite: false
setup.template.settings:
  index:
    number_of_shards: 1
    number_of_replicas: 0
    codec: best_compression
    number_of_routing_shards: 30
    mapping.total_fields.limit: 2000

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  username: elastic
  password: password
  index: "apm-%{[observer.version]}-%{+yyyy.MM.dd}"
  indices:
    - index: "apm-%{[observer.version]}-sourcemap"
      when.contains:
        processor.event: "sourcemap"
    - index: "apm-%{[observer.version]}-error-%{+yyyy.MM.dd}"
      when.contains:
        processor.event: "error"
    - index: "apm-%{[observer.version]}-transaction-%{+yyyy.MM.dd}"
      when.contains:
        processor.event: "transaction"
    - index: "apm-%{[observer.version]}-span-%{+yyyy.MM.dd}"
      when.contains:
        processor.event: "span"
    - index: "apm-%{[observer.version]}-metric-%{+yyyy.MM.dd}"
      when.contains:
        processor.event: "metric"
    - index: "apm-%{[observer.version]}-onboarding-%{+yyyy.MM.dd}"
      when.contains:
        processor.event: "onboarding"

monitoring.enabled: true
monitoring.elasticsearch:
  username: "elastic"
  password: "password"
  hosts: ["elasticsearch:9200"]
elasticsearch.yml:
---
cluster.name: "docker-cluster"
network.host: 0.0.0.0
discovery.type: single-node
xpack.license.self_generated.type: basic
xpack.security.enabled: true
xpack.monitoring.collection.enabled: true
kibana.yml:
---
server.name: kibana
server.host: "0"
server.publicBaseUrl: https://kibana.domain.com
elasticsearch.hosts: [ "http://elasticsearch:9200" ]
monitoring.ui.container.elasticsearch.enabled: true
elasticsearch.username: elastic
elasticsearch.password: password
2. Python agent config:
base.py:
(...)
import logging

import elasticapm
from elasticapm.handlers.logging import Formatter, LoggingFilter

# Format string with the APM correlation ids; the LoggingFilter below adds the
# elasticapm_* attributes to every log record.
formatstring = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
formatstring = (
    formatstring + " | elasticapm "
    "transaction.id=%(elasticapm_transaction_id)s "
    "trace.id=%(elasticapm_trace_id)s "
    "span.id=%(elasticapm_span_id)s"
)

# Console handler: Docker captures stdout/stderr, so these are the lines Filebeat ships.
console = logging.StreamHandler()
console.addFilter(LoggingFilter())
console.setFormatter(logging.Formatter(formatstring))
logging.getLogger("").addHandler(console)

# File handler: elasticapm's Formatter appends the same correlation ids on its own.
fh = logging.FileHandler("spam.log")
fh.setFormatter(Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s"))
logging.getLogger("").addHandler(fh)
ELASTIC_APM = {
    "SERVICE_NAME": "app",
    "SECRET_TOKEN": "secret",
    "SERVER_URL": "http://192.168.100.100:8200",
    "ENVIRONMENT": "test",
}
(...)
MIDDLEWARE = [
    "elasticapm.contrib.django.middleware.TracingMiddleware",
    "elasticapm.contrib.django.middleware.Catch404Middleware",
    (...)
]
INSTALLED_APPS = [
    "elasticapm.contrib.django",
    (...)
]
(...)
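As an alternative to parsing plain-text lines, I have also considered having Django emit ECS-formatted JSON logs directly, so Filebeat could index trace.id without any grok. This is only a sketch, assuming the ecs-logging package (pip install ecs-logging) and my understanding that its StdlibFormatter picks up the correlation ids from the active elastic-apm transaction:

import logging

import ecs_logging  # assumption: the ecs-logging package is installed

# Emit ECS JSON on stdout; Filebeat can then decode the JSON and index
# trace.id / transaction.id / span.id directly, with no grok pattern needed.
handler = logging.StreamHandler()
handler.setFormatter(ecs_logging.StdlibFormatter())
logging.getLogger("").addHandler(handler)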
3. Filebeat agent configs
/etc/filebeat/filebeat.yml:
filebeat.inputs:
  - type: container
    paths:
      - '/var/lib/docker/containers/*/*.log'

processors:
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"
  - decode_json_fields:
      fields: ["message"]
      target: "json"
      overwrite_keys: true

output.elasticsearch:
  hosts: ["192.168.100.100:9200"]
  indices:
    - index: "docker-%{[agent.version]}-%{+yyyy.MM.dd}"
  username: "elastic"
  password: "password"

logging.json: true
logging.metrics.enabled: false
/usr/share/filebeat/module/elasticsearch/server/ingest/pipeline-json.yml:
(...)
- grok:
    field: message
    patterns:
      - '%{GREEDYDATA:msg} \| elasticapm transaction.id=%{DATA:transaction.id} trace.id=%{DATA:trace.id} span.id=%{GREEDYDATA:span.id}'
(...)
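To check the grok pattern itself, independently of Filebeat, I test it against a sample log line with the ingest simulate API. The snippet below mirrors the pattern above and reuses the same host and credentials as the rest of my setup; the sample line and its ids are made up:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://192.168.100.100:9200", http_auth=("elastic", "password"))

# hypothetical log line in the format produced by the Django logging config above
sample = (
    "2021-08-20 10:00:00,000 - django.request - INFO - GET /health"
    " | elasticapm transaction.id=abc123 trace.id=def456 span.id=None"
)

# run the grok processor against the sample document and print the extracted fields
resp = es.ingest.simulate(body={
    "pipeline": {
        "processors": [{
            "grok": {
                "field": "message",
                "patterns": [
                    "%{GREEDYDATA:msg} \\| elasticapm transaction.id=%{DATA:transaction.id} "
                    "trace.id=%{DATA:trace.id} span.id=%{GREEDYDATA:span.id}"
                ],
            }
        }]
    },
    "docs": [{"_source": {"message": sample}}],
})
print(resp["docs"][0]["doc"]["_source"])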
How can I make this work? What am I doing wrong?