I changed my setup a little from the original posting after your March 12 reply and forgot to mention it. I have updated to 5.6.8, and I am no longer using the ingest-geoip plugin for Elasticsearch; I have installed Logstash instead. My configuration files are below.
filebeat.yml
###################### Filebeat Configuration #########################

#=========================== Filebeat prospectors =============================
filebeat.prospectors:
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    #- /var/log/auth.log
    #- /var/log/syslog
    - /opt/data/auth.log
    #- c:\programdata\elasticsearch\logs\*

#================================ Outputs =====================================

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  username: "elastic"
  password: "changeme"

#----------------------------- Logstash output --------------------------------
#output.logstash:
  # The Logstash hosts
  #hosts: ["localhost:5044"]

#================================ Logging =====================================

# Sets log level.
logging.level: debug

# At debug level, you can selectively enable logging only for some components.
logging.selectors: ["publish"]
10-syslog-filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
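As a sanity check that the grok pattern above does what I expect, it can be approximated with a plain Python regex (this is only a rough equivalent of the grok definitions, not the exact ones, and the sample sshd line is made up for illustration):

```python
import re

# Rough Python approximation of the grok pattern in 10-syslog-filter.conf,
# to preview which fields Logstash should extract from an auth.log line.
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<syslog_hostname>\S+) "
    r"(?P<syslog_program>[\w./-]+)(?:\[(?P<syslog_pid>\d+)\])?: "
    r"(?P<syslog_message>.*)"
)

# A typical (hypothetical) auth.log line.
line = "Mar 12 09:32:20 myhost sshd[1234]: Failed password for invalid user admin from 10.0.0.5 port 22 ssh2"
m = SYSLOG_RE.match(line)
print(m.groupdict())
```

Running this prints the same field names (syslog_timestamp, syslog_hostname, syslog_program, syslog_pid, syslog_message) that should show up on the indexed documents.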
02-beats-input.conf
input {
  beats {
    port => 5044
  }
}
30-elasticsearch-output.conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "changeme"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
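For reference, the index option above expands to the beat name plus the event date. A small sketch of how that resolves (the date here is just an example, not taken from my actual data):

```python
from datetime import date

# Sketch of how "%{[@metadata][beat]}-%{+YYYY.MM.dd}" resolves:
# the beat name sent by Filebeat, plus the event's @timestamp date.
beat = "filebeat"               # value of [@metadata][beat]
event_date = date(2018, 3, 12)  # normally derived from @timestamp
index_name = f"{beat}-{event_date.strftime('%Y.%m.%d')}"
print(index_name)  # filebeat-2018.03.12
```

So documents should land in daily indices matching the filebeat-* pattern shown in the screenshots.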
This first screenshot shows that the index was created for filebeat-*.
This is now my Discover tab.
However, this is not letting me use the ML tools from X-Pack. After creating the ML job, I got this error message:
Datafeed lookback retrieved no data
And the datafeed preview showed this.
I realized that the filebeat.yml configuration above was sending the log data to Elasticsearch rather than to Logstash. I have fixed this, and the updated configuration is below.
###################### Filebeat Configuration #########################

#=========================== Filebeat prospectors =============================
filebeat.prospectors:
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    #- /var/log/auth.log
    #- /var/log/syslog
    - /opt/data/auth.log
    #- c:\programdata\elasticsearch\logs\*

#================================ Outputs =====================================

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

#================================ Logging =====================================

# Sets log level.
logging.level: debug

# At debug level, you can selectively enable logging only for some components.
logging.selectors: ["publish"]
I updated the beats input plugin for Logstash. I started Filebeat in debug mode and it ran fine. When I go to the Management tab in Kibana, I can see that the data is being indexed now.
However, I am still getting the same error after creating the ML job.
So I just want to make sure that I have the data indexed correctly before taking the ML job question to another forum.
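One thing I still want to verify while checking the data: syslog timestamps in auth.log carry no year, so the date filter has to infer one, and if the inferred dates fall outside the datafeed's lookback window the ML job would see no data. A quick illustration of the missing year (the 1900 here is just Python's strptime default, shown only to make the point; it is not a confirmed diagnosis of my setup):

```python
from datetime import datetime

# Syslog timestamps like "Mar 12 09:32:20" have no year component,
# so any parser has to supply one. Python's strptime defaults to 1900,
# which makes the ambiguity easy to see.
ts = datetime.strptime("Mar 12 09:32:20", "%b %d %H:%M:%S")
print(ts.year)  # 1900
```

So it may be worth comparing the @timestamp values on the indexed documents against the time range the datafeed is actually querying.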