Problem with fake logs in Logstash

  1. I installed Elasticsearch (no problem)
  2. I installed Kibana (no problem)
  3. I ran Logstash with this argument:

./logstash -f apache_config.conf

apache_config.conf

input {
  file {
    path => "/var/log/my_apache_logs/*.log"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => { "[geoip][coordinates]" => "float" }
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-apache-%{+YYYY.MM.dd}"
  }
}
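
For reference, Logstash can syntax-check a config file without starting the pipeline (a built-in flag, handy before a real run):

./logstash -f apache_config.conf --config.test_and_exit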


The files in /var/log/my_apache_logs/*.log are fake logs generated with


GitHub - kiritbasu/Fake-Apache-Log-Generator: Generate a boatload of Fake Apache Log files very quickly
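
Lines like the following are what the %{COMBINEDAPACHELOG} pattern above expects (an illustrative example in Apache combined format, not actual generator output):

93.184.216.34 - - [06/Mar/2018:10:30:42 +0100] "GET /wp-content/uploads HTTP/1.0" 200 4973 "http://example.com/" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36"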


Server response:

Sending Logstash's logs to /data/dg_vbox/Logstash/logstash-6.2.2/logs which is now configured via log4j2.properties
[2018-03-06T10:34:39,179][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/data/dg_vbox/Logstash/logstash-6.2.2/modules/netflow/configuration"}
[2018-03-06T10:34:39,192][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/data/dg_vbox/Logstash/logstash-6.2.2/modules/fb_apache/configuration"}
[2018-03-06T10:34:39,613][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-06T10:34:40,081][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.2"}
[2018-03-06T10:34:40,393][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-06T10:34:43,162][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-03-06T10:34:43,558][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-03-06T10:34:43,567][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-03-06T10:34:43,714][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-03-06T10:34:43,760][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-03-06T10:34:43,763][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-03-06T10:34:43,781][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-03-06T10:34:43,796][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-03-06T10:34:43,831][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["//localhost:9200"]}
[2018-03-06T10:34:43,979][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/data/dg_vbox/Logstash/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2018-03-06T10:34:44,251][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7ccde8a5 run>"}
[2018-03-06T10:34:44,339][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}


I can't read the fake logs from the Kibana console (Management).
I can't read them from localhost:9200 either (no password is set).
X-Pack is not installed.

What's the problem?

PS
I would be grateful for your help.

There seems to be no error in the config file. If the input is large, Logstash can take some time to push events through the pipeline.

Try running Logstash in debug mode:

./logstash -f apache_config.conf --debug

Logstash is tailing the log file you generated. Look into the start_position option for the file input and read the file input documentation in general to understand how it maintains the current state of the monitored files.
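
A minimal sketch of that, reusing the path from the config above (sincedb_path => "/dev/null" tells Logstash not to remember how far it has read, so the files are re-read from the start on every run; useful for testing, not for production):

input {
  file {
    path => "/var/log/my_apache_logs/*.log"
    start_position => "beginning"  # read existing files from the top instead of tailing
    sincedb_path => "/dev/null"    # discard the read-position database (testing only)
  }
}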

It's only 100 events.
I tried --debug mode; it didn't help.

Can you explain exactly what I should do?
I'm just starting with Elastic and don't know how this works.

Response from Kibana:

GET /logstash/_search
{
  "query": { "match_all": {} }
}

{
  "error": {
    "root_cause": [
      {
        "type": "index_not_found_exception",
        "reason": "no such index",
        "resource.type": "index_or_alias",
        "resource.id": "logstash",
        "index_uuid": "_na_",
        "index": "logstash"
      }
    ],
    "type": "index_not_found_exception",
    "reason": "no such index",
    "resource.type": "index_or_alias",
    "resource.id": "logstash",
    "index_uuid": "_na_",
    "index": "logstash"
  },
  "status": 404
}
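
Note that the query above asks for an index literally named logstash, but the config writes to logstash-apache-%{+YYYY.MM.dd}, so the queried index never existed. Standard Elasticsearch APIs can list what is actually there and search the right pattern:

GET _cat/indices?v

GET /logstash-apache-*/_search
{
  "query": { "match_all": {} }
}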

input {
  file {
    path => "/var/log/my_apache_logs/*.log"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => { "[geoip][coordinates]" => "float" }
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-apache-%{+YYYY.MM.dd}"
  }
}
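
Once events are flowing, a quick count against the date-stamped pattern used in the output section confirms that documents were indexed:

GET /logstash-apache-*/_count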

https://pastebin.com/BV7mF06a

It reads the logs now and prints them back to the CLI.
