Hi,
I'm having trouble getting data to transfer correctly from Filebeat to Logstash (then to Elasticsearch).
When I configure the output in filebeat.yml to go directly to Elasticsearch, everything is OK.
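For reference, the direct output section in filebeat.yml looks roughly like this (the CA path is only indicative, not my exact one):

output.elasticsearch:
  hosts: ["https://PK-ELK1:9200", "https://PK-ELK2:9200", "https://PK-ELK3:9200"]
  username: "elastic"
  password: "xxxxx"
  # indicative CA path, not the exact one from my setup
  ssl.certificate_authorities: ["/etc/filebeat/certs/ca.crt"]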
However, when I point the filebeat.yml output at Logstash instead, with the following Logstash pipeline:
input {
  beats {
    port => 5044
  }
}
filter {
  if [event][module] == "apache" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["https://PK-ELK1:9200", "https://PK-ELK2:9200", "https://PK-ELK3:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
      pipeline => "%{[@metadata][pipeline]}"
      cacert => "/etc/logstash/certs/ca.crt"
      user => "elastic"
      password => "xxxxx"
    }
  } else {
    elasticsearch {
      hosts => ["https://PK-ELK1:9200", "https://PK-ELK2:9200", "https://PK-ELK3:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
      cacert => "/etc/logstash/certs/ca.crt"
      user => "elastic"
      password => "xxxxx"
    }
  }
}
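On the Filebeat side, the output is switched over to Logstash roughly like this (the Logstash hostname is a placeholder; the port matches the beats input above):

output.logstash:
  # "PK-LOGSTASH" is a placeholder for my Logstash host; 5044 matches the beats input
  hosts: ["PK-LOGSTASH:5044"]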
With this setup I run into problems: not all of the fields that are created with the direct Filebeat -> Elasticsearch connection are created with the Filebeat -> Logstash -> Elasticsearch connection.
For example, during my analysis I found that when the data goes through Logstash, the following fields are not added:
user.name
user_agent.version
user_agent.os.version
user_agent.os.name
user_agent.os.full
user_agent.name
user_agent.device.name
url.path
traefik.access.user_agent.os_name
traefik.access.user_agent.name
source.ip
event.outcome
event.kind
event.created
event.category
What could be the problem here?