Hi all,
I am getting an error while running the ELK stack. Below is the error:
"2017-02-18T01:49:07+11:00 ERR Connecting error publishing events (retrying): dial tcp Y.Y.Y.Y:5044: connectex: No connection could be made because the target machine actively refused it.
2017-02-18T01:49:17+11:00 INFO No non-zero metrics in the last 30s"
Let's say my Elasticsearch, Logstash, and Kibana are on server X.X.X.X and Filebeat is on server Y.Y.Y.Y. There are no firewalls, all inbound/outbound ports are open, the operating system is Windows Server 2012 R2 Standard, and filebeat-5.2.1-windows-x86_64, kibana-5.0.0-windows-x86, logstash-5.2.0, elasticsearch-5.2.0, and jdk-8u121-windows-x64 are all successfully installed and running.
When I run Filebeat and Logstash on the same server X.X.X.X, it works fine: logs are created and I can see them coming in to Kibana. But when Filebeat tries to communicate from server Y.Y.Y.Y to X.X.X.X, no logs are created; only the error above appears.
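Since the error is "No connection could be made because the target machine actively refused it", it may be worth first confirming, from the Filebeat server, whether anything is accepting TCP connections on port 5044 at all. A minimal sketch of such a check (the "X.X.X.X" host here is a placeholder for the Logstash server's address, not a real value):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a plain TCP connection; return True if the port accepts it."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers both "connection refused" and timeouts.
        return False

# Run this from the Filebeat server (Y.Y.Y.Y). "X.X.X.X" is a placeholder
# for the Logstash host. False here means nothing is accepting connections
# on 5044, which matches the "actively refused" error.
print(can_connect("X.X.X.X", 5044))
```

If this prints False from Y.Y.Y.Y but True from X.X.X.X itself, the Logstash beats input is likely bound only to a local interface or the port is taken by something else.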
Below are my Filebeat and Logstash config files. Please help me out on this; I have been working on it for the last 12 hours with no result.
filebeat.yml:
# Optional fields that you can specify to add additional information to the
# output.
fields:
  # Environment: LOCAL
  ServerName: 'Server 1'
  ServerIP: XXXX
fields_under_root: true
#================================ Outputs =====================================
# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.
#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["XXXX:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["XXXX:5044"]
  #loadbalance: true
  #worker: 2
Logstash config:
input {
  beats {
    port => 5044
  }
}
filter {
  if [type] == "log4net" {
    grok {
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:Time_Stamp}\s*%{LOGLEVEL:Log_Level}\s*%{INT:Thread}\s*(?<Logger>.+?(?=.?DIAGNOSTICCONTEXT))\s*(?<Diagnostic_Context>.+?(?=.?MESSAGE))\s*(?<Message>.+?(?=.?ADDITIONALDATA))\s*(?<Additional_Data>.+?(?=.?EXCEPTION:))\s*(?<Exception>.+?(?=.?CLIENTIP))\s*(?<Client_IP>.+?(?=.?HOSTIP))\s*(?<Host_IP>.+?(?=.?REQUESTIDENTIFIER))\s*(?<Request_Identifier>.+?(?=.?SESSIONIDENTIFIER))\s*(?<User_identifier>.+?([^*]+))" ]
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:Time_Stamp}\s*%{LOGLEVEL:Log_Level}\s*(?<Logger>.+?(?=.?DIAGNOSTICCONTEXT))\s*(?<Diagnostic_Context>.+?(?=.?MESSAGE))\s*(?<Message>.+?(?=.?ADDITIONALDATA))\s*(?<Additional_Data>.+?(?=.?EXCEPTION:))\s*(?<Exception>.+?(?=.?CLIENTIP))\s*(?<Client_IP>.+?(?=.?HOSTIP))\s*(?<Host_IP>.+?(?=.?REQUESTIDENTIFIER))\s*(?<Request_Identifier>.+?(?=.?SESSIONIDENTIFIER))\s*(?<User_identifier>.+?([^*]+))" ]
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:Time_Stamp}\s*%{LOGLEVEL:Log_Level}\s*%{INT:Thread}\s*(?<Logger>.+?(?=.?DIAGNOSTICCONTEXT))\s*(?<Diagnostic_Context>.+?(?=.?MESSAGE))\s*(?<Message>.+?(?=.?ADDITIONALDATA))\s*(?<Additional_Data>.+?(?=.?EXCEPTION:))\s*(?<Exception>.+?(?=.?CLIENTIP))\s*(?<Client_IP>.+?(?=.?HOSTIP))\s*(?<Host_IP>.+?(?=.?APPLICATION))\s*(?<APPLICATION>[^*]+)" ]
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:Time_Stamp}\s*%{LOGLEVEL:Log_Level}\s*%{INT:Thread}\s*(?<Logger>.+?(?=.?DIAGNOSTICCONTEXT))\s*(?<Diagnostic_Context>.+?(?=.?MESSAGE))\s*(?<Message>.+?(?=.?ADDITIONALDATA))\s*(?<Additional_Data>.+?(?=.?EXCEPTION:))\s*(?<Exception>.+?(?=.?CLIENTIP))\s*(?<Client_IP>.+?(?=.?HOSTIP))\s*(?<Host_IP>.+?([^*]+))" ]
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:Time_Stamp}\s*%{LOGLEVEL:Log_Level}\s*(?<Logger>.+?([\s]))\s*\s*(?<Diagnostic_Context>.+?([\s]))(?<Message>[^*]+)" ]
    }
    date {
      match => ["Time_Stamp", "YYYY-MM-dd HH:mm:ss.SSS"]
      add_field => ["Time_Stamp_parsed", "true"]
    }
    mutate {
      add_field => ["Log_source_Server", " Local Server 1"]
      add_field => ["LogEvent_TimeStamp", "%{Time_Stamp}"]
      add_field => ["Environment", "LOCAL"]
      add_field => ["AppName", ""]
    }
  }
}
output {
  elasticsearch {
    hosts => ["XXXX:9200"]
    #action => "index"
    #index => "ErrorLogs-%{+YYYY.MM.dd}"
  }
  #stdout { codec => rubydebug }
}
logstash.yml:
# ------------ Metrics Settings --------------
#
# Bind address for the metrics REST endpoint
#
http.host: "XXXX"
#
# Bind port for the metrics REST endpoint; this option also accepts a range
# (9600-9700), and Logstash will pick up the first available port.
#
http.port: 5044
#