Hi,
I'm a newcomer to the "Elastic World", so maybe my question is really stupid, but after a few hours with my friend Google, I did not find the answer.
I started with a brand new installation, following each part of the documentation (Installing the Elastic Stack | Elastic Installation and Upgrade Guide [8.11] | Elastic).
So I have an Elasticsearch + X-Pack that is working (I can connect to http://localhost:9200/), a Kibana + X-Pack that is working (I can connect to http://localhost:5601), and a Logstash that is working (I can send Apache logs to it using Filebeat).
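For reference, the Filebeat side just points at the Logstash beats port; the relevant part of my filebeat.yml looks roughly like this (host and port are from my setup):
output.logstash:
  hosts: ["localhost:5044"]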
If I use this conf file, I can see my logs perfectly parsed in my cmd window (yes, everything is installed on Windows Server):
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}
output {
  stdout { codec => rubydebug }
}
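In case it matters, I launch Logstash from its installation directory like this (the config file name is just the one I chose):
bin\logstash.bat -f apache.conf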
Next step, I add an output to Elasticsearch:
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
  stdout { codec => rubydebug }
}
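I am not setting any index option; if I read the documentation correctly, that should be the same as the default daily index, i.e. roughly equivalent to:
elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "logstash-%{+YYYY.MM.dd}"
}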
At this point, I get this error message:
10:16:37.094 [Ruby-0-Thread-7: C:/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:188] WARN logstash.outputs.elasticsearch - Attempted to resurrect connection to dead ES instance, but got an error. {:url=>#<URI::HTTP:0xc38ff90 URL:http://localhost:9200>, :error_type=>LogStash::Outputs::Elasticsearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://localhost:9200/'"}
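If I understand it correctly, the 401 just means the request is not authenticated; I assume it is the same answer I would get by querying Elasticsearch directly without credentials, something like:
curl -i http://localhost:9200/
which should return 401 Unauthorized as long as no user and password are passed.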
According to this topic and this documentation, I created a user in Kibana and added the authentication information to my conf file:
input {
beats {
port => "5044"
}
user => logstash_internal
password => password1
}
# The filter part of this file is commented out to indicate that it is
# optional.
filter {
grok {
match => { "message" => "%{COMBINEDAPACHELOG}"}
}
geoip {
source => "clientip"
}
user => logstash_internal
password => password1
}
output {
elasticsearch {
hosts => [ "127.0.0.1:9200" ]
}
user => logstash_internal
password => password1
}
But unfortunately, and this is the point of my question, now I get this error message:
Could not find log4j2 configuration at path /logstash-5.1.1/config/log4j2.properties. Using default config which logs to console
10:47:12.325 [LogStash::Runner] ERROR logstash.agent - fetched an invalid config {:config=>"input { \n\tbeats \n\t{\n port => "5044"\n } \n user => logstash_internal\n password => password1\n}\nfilter {\n grok {\n match => { "message" => "%{COMBINEDAPACHELOG}"}\n }\n geoip {\n source => "clientip"\n }\n user => logstash_internal\n password => changeme}\noutput {\n elasticsearch {\n hosts => [ "localhost:9200" ]\n }\n\tstdout { codec => rubydebug }\n user => logstash_internal\n password => password1\n}\n", :reason=>"Expected one of #, { at line 6, column 10 (byte 64) after input { \n\tbeats \n\t{\n port => "5044"\n } \n user "}
I tried with double and single quotes for the user and password values, and without the user/password information in the input and filter sections (I don't understand why they should be there); each time I get the same error message...
So, without the user information, my Logstash cannot talk to Elasticsearch, and with it... it still cannot.
Can anybody help me understand where the mistake is?