Hi, I have just started with ELK and I've run into a problem.
When I open Kibana and search for an index name or pattern, it can't find any logstash-* indices.
If I use the option "Use event times to create index names [DEPRECATED]" with the name "logstash"*, Kibana does find them:
Attempted to match the following indices and aliases:
{"index":"logstash-2017.04.04","min":1491264000000,"max":1491350399999}
{"index":"logstash-2017.04.05","min":1491350400000,"max":1491436799999}
{"index":"logstash-2017.04.06","min":1491436800000,"max":1491523199999}
{"index":"logstash-2017.04.07","min":1491523200000,"max":1491609599999}
{"index":"logstash-2017.04.08","min":1491609600000,"max":1491695999999}
{"index":"logstash-2017.04.09","min":1491696000000,"max":1491782399999}
{"index":"logstash-2017.04.10","min":1491782400000,"max":1491868799999}
Here is my logstash.yml => https://ptpb.pw/Y0vb
and here is my pipeline.conf:
input {
  file {
    path => "/home/reversal/fakelogs/*.log"
    start_position => beginning
    ignore_older => 0
  }
  beats {
    port => "5043"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => [ "129.129.129.137:63200" ]
    index => "logstash-%{+yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z}"
  }
}
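For comparison, as far as I understand the docs, the default index setting of the elasticsearch output is a plain daily index, which would produce names exactly like the logstash-2017.04.07 ones Kibana listed above; I'm not sure my custom pattern with spaces and slashes is even a legal index name. The default output block would look roughly like this:

output {
  elasticsearch {
    hosts => [ "129.129.129.137:63200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}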
And here is my Logstash log:
> [2017-04-07T03:10:31,600][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://129.129.129.137:63200/]}}
> [2017-04-07T03:10:31,604][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://129.129.129.137:63200/, :path=>"/"}
> [2017-04-07T03:10:31,696][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x726ba01a URL:http://129.129.129.137:63200/>}
> [2017-04-07T03:10:31,698][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
> [2017-04-07T03:10:31,741][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
> [2017-04-07T03:10:31,746][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x2e671c80 URL://129.129.129.137:63200>]}
> [2017-04-07T03:10:31,792][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/vendor/GeoLite2-City.mmdb"}
> [2017-04-07T03:10:31,805][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
> [2017-04-07T03:10:32,461][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5043"}
> [2017-04-07T03:10:32,506][INFO ][logstash.pipeline ] Pipeline main started
> [2017-04-07T03:10:32,613][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>63001}
> [reversal@rv03prd ~]$
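The log looks clean to me. To double-check whether events are actually reaching Elasticsearch, I suppose I could also count the documents directly, e.g.:

curl 'http://129.129.129.137:63200/logstash-*/_count?pretty'

(that is just my guess at a quick verification, using the same host/port as in the output section).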
What am I missing?
I appreciate any help.