This is my second day working with the ELK stack, and I have no prior experience with it.
My goal:
Create "parent" index (client01) and for all machines related to client01 collect logs so i can filter data based on that "parent" index.
Filebeat sends logs to Logstash, Logstash sends them to Elasticsearch, and I search the index from Kibana.
Elasticsearch, Kibana, and Filebeat are on the same machine.
In Elasticsearch I created the cluster01 index (no mapping). I can telnet to ports 9200 and 5044.
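(For reference, these are the connectivity checks I mean, assuming default ports on the same host:)

curl http://localhost:9200    # should return the cluster name/version JSON
telnet localhost 5044         # should connect if the Logstash beats input is listening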
logstash_simple.conf
input {
  beats {
    port => 5044
    #ssl => true
    #ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    #ssl_key => "/etc/pki/tls/private/logstash.key"
  }
}

#filter {
#  if [type] == "syslog" {
#    grok {
#      match => {
#        "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
#      }
#      add_field => [ "received_at", "%{@timestamp}" ]
#      add_field => [ "received_from", "%{host}" ]
#    }
#    syslog_pri { }
#    date {
#      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
#    }
#  }
#}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "client01-logfiles-%{[beat.version]}-%{+YYYY.MM.dd}"
  }
}
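If I read the sprintf references right, an event shipped on 2019-03-26 by, say, Filebeat 6.6.1 would land in an index like client01-logfiles-6.6.1-2019.03.26. The pipeline syntax can be checked before starting Logstash (the paths below are my assumption for a package install):

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash_simple.conf --config.test_and_exit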
For Filebeat I set up a basic configuration, just to get things working and to understand the basic concepts.
filebeat.yml
filebeat.inputs:
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/audit/audit.log

#----------------------------- Logstash output --------------------------------
#output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
When I run filebeat -e -c /etc/filebeat/filebeat.yml, I get:
ERROR pipeline/output.go:100 Failed to connect to backoff(elasticsearch(http://localhost:5044)): Get http://localhost:5044: read tcp 127.0.0.1:50202->127.0.0.1:5044: read: connection reset by peer
2019-03-26T09:19:33.805Z INFO pipeline/output.go:93 Attempting to reconnect to backoff(elasticsearch(http://localhost:5044)) with 2 reconnect attempt(s)
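From the error it looks like Filebeat is still talking to port 5044 through its Elasticsearch output (note the elasticsearch(http://localhost:5044) in the message), so the Logstash beats input resets the connection. My understanding is that the default output.elasticsearch: section has to be commented out and output.logstash: uncommented, since Filebeat allows only one active output; a minimal sketch of the corrected sections:

#----------------------------- Elasticsearch output ---------------------------
#output.elasticsearch:
#  hosts: ["localhost:9200"]

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

Filebeat can verify this without shipping anything:

filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml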