I have an issue where Logstash is sending the variable names to Elasticsearch instead of interpolating the contents of the variables. The variables in question are coming from Filebeat, and I'm using them to create two different indexes in Elasticsearch.
I am getting indexes in Elasticsearch that have variable names in them AND interpolated names, like this:
filebeat-%{[fields][host]}-%{[fields][log_type]}-2020-08-06
filebeat-dev6-nginx-2020-08-06
filebeat-dev6-reqresp-2020-08-06
I must be doing something wrong - I am very new to the Elastic Stack. Thanks in advance for any help.
Here are my configs:
filebeat
filebeat.inputs:
- type: log
  enabled: true
  # Specify nginx as the log type for further processing by logstash
  fields:
    log_type: nginx
    host: dev6
  paths:
    - /var/log/nginx/*.log
  exclude_files: ['my-dev-reqresp.log']
  exclude_files: ['my-other-reqresp.log']
- type: log
  enabled: true
  # Specify reqresp as the log type for logstash
  fields:
    log_type: reqresp
    host: dev6
  paths:
    - /var/log/nginx/my-dev-reqresp.log
    - /var/log/nginx/my-other-reqresp.log
output.logstash:
  # The Logstash hosts
  hosts: ["docker-dev0:5044"]
Logstash
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts    => ["elasticsearch:9200"]
    user     => "elastic"
    password => "password"
    index    => "filebeat-%{[fields][host]}-%{[fields][log_type]}-%{+YYYY-MM-dd}"
  }
  stdout {
    codec => rubydebug {
      metadata => true
    }
  }
}
Badger
August 6, 2020, 6:41pm
When you look at documents in that index, do they have [fields][host] and [fields][log_type]? Can you pull a sample document as JSON and post it?
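For example, something like this should pull one document from that index (the host and credentials here are assumptions based on your config; the sprintf characters in the index name are percent-encoded so they survive the URL):
# %, {, }, [ and ] are percent-encoded forms of the literal index name
curl -u elastic:password 'http://elasticsearch:9200/filebeat-%25%7B%5Bfields%5D%5Bhost%5D%7D-%25%7B%5Bfields%5D%5Blog_type%5D%7D-2020-08-06/_search?size=1&pretty'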
I hope this is what you mean:
"message" => "a bunch of stuff here",
"fields" => {
    "log_type" => "nginx",
    "host" => "dev6"
},
This is some data from the stdout of logstash.
Also, here is what I see at this URL: /_cat/indices?v
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
green open .monitoring-kibana-7-2020.08.06 HhIwCzs4Q7azGml9gm2OeQ 1 0 6822 0 1.6mb 1.6mb
green open .kibana-event-log-7.8.0-000001 S5nN_gAXQs-vmBnkFRMoHA 1 0 1 0 5.3kb 5.3kb
green open .security-7 BPx275ILQrqdd-1gqbSGWA 1 0 37 0 101.2kb 101.2kb
yellow open filebeat-%{[fields][host]}-%{[fields][log_type]}-2020-08-06 Hvs1WQC_QvqQwuHdZSARhw 1 1 1881 0 796.7kb 796.7kb
green open .apm-custom-link miwPbd0ASam92v6wfYjaeg 1 0 0 0 208b 208b
green open .kibana_task_manager_1 EDeKMJx4Qh-usdulR3W74g 1 0 5 0 35kb 35kb
green open .apm-agent-configuration o5JRXbZaQIiOVM3Rt0NDaw 1 0 0 0 208b 208b
green open .monitoring-logstash-7-2020.08.06 00LZs7HBT3-Mgrdey-3S0A 1 0 82578 0 6.1mb 6.1mb
green open .monitoring-es-7-2020.08.06 9in6px8jRciJ1YfiIYSyOA 1 0 102902 82420 50.6mb 50.6mb
green open .kibana_1 byY5xo1-Tce_9UMgvHaCKg 1 0 61 10 103.6kb 103.6kb
yellow open filebeat-dev6-nginx-2020-08-06 JA5E_efkTjK7Sr7yWl77wQ 1 1 6524 0 2.8mb 2.8mb
yellow open filebeat-dev6-reqresp-2020-08-06 N4fYswMzTNSQCTlZ48BK6g 1 1 159 0 435.2kb 435.2kb
Thanks.
Badger
August 6, 2020, 10:24pm
Which index did that particular message end up in?
I would assume that message would have gone to this index:
filebeat-dev6-nginx-2020-08-06
The output that I showed was from Logstash, not Elasticsearch (Kibana).
I can try to get more information - but please let me know where to look.
Here is a portion of a document that did come through on this index:
filebeat-dev6-nginx-2020-08-06
"input": {
"type": "log"
},
"log": {
"offset": 2097913,
"file": {
"path": "/var/log/nginx/cache.log"
}
},
"fields": {
"log_type": "nginx",
"host": "dev6"
},
"host": {
"ip": [
"10.0.10.66",
"fe80::20c:29ff:fe70:5c33"
],
"os": {
"kernel": "4.15.0-111-generic",
"codename": "bionic",
"name": "Ubuntu",
"platform": "ubuntu",
"family": "debian",
"version": "18.04.4 LTS (Bionic Beaver)"
},
Badger
August 6, 2020, 10:39pm
I am not interested in data that went into the index you wanted it to go into. You are claiming (as I understand it) that there are documents in the index called "filebeat-%{[fields][host]}-%{[fields][log_type]}-2020-08-06", where the values of the fields have not been interpolated. I would like to see an example of a document from that index.
I could get that information from Elasticsearch if I were indexing it deliberately. However, I am not trying to index anything called "filebeat-%{[fields][host]}-%{[fields][log_type]}-2020-08-06", and I don't see how that would help. The problem is that Logstash is not interpolating the variables and is inserting the variable names (instead of their values) into Elasticsearch as an index name.
Badger
August 6, 2020, 11:44pm
Then I guess I have completely misunderstood your problem. I do not think I can help you.
I don't know what else to say other than I don't want the variable names showing up in Elasticsearch like this:
filebeat-%{[fields][host]}-%{[fields][log_type]}-2020-08-06
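One way to guard against that on the Logstash side (a sketch based on the output above, not config I am actually running; the fallback index name filebeat-unmatched-* is invented here) is to only use the sprintf index name when both fields actually exist on the event:
output {
  if [fields][host] and [fields][log_type] {
    elasticsearch {
      hosts    => ["elasticsearch:9200"]
      user     => "elastic"
      password => "password"
      index    => "filebeat-%{[fields][host]}-%{[fields][log_type]}-%{+YYYY-MM-dd}"
    }
  } else {
    # Events missing either field land in a fixed index instead of one
    # with literal %{...} in its name.
    elasticsearch {
      hosts    => ["elasticsearch:9200"]
      user     => "elastic"
      password => "password"
      index    => "filebeat-unmatched-%{+YYYY-MM-dd}"
    }
  }
}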
So I found the solution to my problem. The issue was that not all documents retrieved by filebeat matched the following lines:
exclude_files: ['my-dev-reqresp.log']
exclude_files: ['my-other-reqresp.log']
(Because a YAML mapping cannot repeat a key, only one of those two exclude_files entries could actually take effect.) Since some documents did not match this expression, they did not get the following applied:
fields:
  log_type: nginx
  host: dev6
Since the fields were not applied, Logstash could not resolve these variables to anything and forwarded the events to Elasticsearch with the literal variable names in the index name. I changed the regular expression to this:
exclude_files: ['my-(dev|other)-reqresp.log']
which is a better representation of the regular expression that filebeat expects.
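For reference, the first filebeat input then looks like this (reassembled from the config above, so treat it as a sketch):
- type: log
  enabled: true
  fields:
    log_type: nginx
    host: dev6
  paths:
    - /var/log/nginx/*.log
  # One key, one pattern: exclude_files takes a list of regular expressions,
  # so alternation replaces the duplicate keys.
  exclude_files: ['my-(dev|other)-reqresp.log']
Since exclude_files accepts a list, exclude_files: ['my-dev-reqresp.log', 'my-other-reqresp.log'] would also have worked without the alternation.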
Thanks for trying to help me @Badger
system
Closed September 4, 2020, 3:19pm
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.