Logstash is not creating indexes

Hi, I have started with ELK but I am running into a problem.
When I start Kibana and look for an index name or pattern, I can't find any logstash-* indices.

If I use the option "Use event times to create index names [DEPRECATED]" with the name "logstash"* I can find it in Kibana.

Attempted to match the following indices and aliases:
{"index":"logstash-2017.04.04","min":1491264000000,"max":1491350399999}
{"index":"logstash-2017.04.05","min":1491350400000,"max":1491436799999}
{"index":"logstash-2017.04.06","min":1491436800000,"max":1491523199999}
{"index":"logstash-2017.04.07","min":1491523200000,"max":1491609599999}
{"index":"logstash-2017.04.08","min":1491609600000,"max":1491695999999}
{"index":"logstash-2017.04.09","min":1491696000000,"max":1491782399999}
{"index":"logstash-2017.04.10","min":1491782400000,"max":1491868799999}

Here is my logstash.yml => https://ptpb.pw/Y0vb
and here is my pipeline.conf:

input {   
        file {
                path => "/home/reversal/fakelogs/*.log"
                start_position => beginning 
                ignore_older => 0
        }
        beats {
                port => "5043"
        }
}
filter {
        grok {
                match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
         date {
                match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
        geoip {
                source => "clientip"
        }
}
output {
        elasticsearch {
                hosts => [ "129.129.129.137:63200" ]
                index => "logstash-%{+yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z}"
        }
}

And here is my Logstash log:

> [2017-04-07T03:10:31,600][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://129.129.129.137:63200/]}}
> [2017-04-07T03:10:31,604][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://129.129.129.137:63200/, :path=>"/"}
> [2017-04-07T03:10:31,696][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x726ba01a URL:http://129.129.129.137:63200/>}
> [2017-04-07T03:10:31,698][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
> [2017-04-07T03:10:31,741][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
> [2017-04-07T03:10:31,746][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x2e671c80 URL://129.129.129.137:63200>]}
> [2017-04-07T03:10:31,792][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/vendor/GeoLite2-City.mmdb"}
> [2017-04-07T03:10:31,805][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
> [2017-04-07T03:10:32,461][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5043"}
> [2017-04-07T03:10:32,506][INFO ][logstash.pipeline        ] Pipeline main started
> [2017-04-07T03:10:32,613][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>63001}
> [reversal@rv03prd ~]$ 

What am I missing?
I appreciate any help.

That index setting is wrong. What are you trying to achieve? What is wrong with the default pattern?
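For reference, leaving index out entirely gives you the default daily pattern; written out explicitly, it is equivalent to something like this (host and port copied from your config):

output {
        elasticsearch {
                hosts => [ "129.129.129.137:63200" ]
                # the default index name: one index per day, e.g. logstash-2017.04.07
                index => "logstash-%{+YYYY.MM.dd}"
        }
}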

It does not work, I have tried it.

No index gets created. I can try again and post the output.

I have used
index => "logstash-%{+YYYY.MM.dd}"

but it is not working, and without index => "logstash-%{+YYYY.MM.dd}" it is the same thing.

Can you remove the index specification from the Elasticsearch output (the default will be used), as well as the ignore_older statement from the file input, to see if this makes a difference? You may also need to remove the sincedb file in order to get the files processed again.
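With those two settings gone, the file input would just be something like this (path copied from your config):

        file {
                path => "/home/reversal/fakelogs/*.log"
                # start_position only affects files with no sincedb entry yet,
                # which is why the existing sincedb has to be removed for these
                # files to be re-read from the top
                start_position => "beginning"
        }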

OK, I commented out those options like this:

[reversal@rv03prd ~]$ sudo cat /etc/logstash/conf.d/pipeline.conf
input {   
        file {
                path => "/home/reversal/fakelogs/*.log"
                start_position => beginning 
                #ignore_older => 0
        }
        beats {
                port => "5043"
        }
}
filter {
        grok {
                match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
         date {
                match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
        geoip {
                source => "clientip"
        }
}
output {
        elasticsearch {
                hosts => [ "129.129.129.137:63200" ]
                #index => "logstash-%{+yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z}"
        }
        #stdout { codec => rubydebug }
}

and restarted Logstash, but I still can't find "logstash-*".

Sorry to ask, but how can I delete the sincedb file?

[reversal@rv03prd ~]$ sudo find / -name sincedb
[reversal@rv03prd ~]$

And in Kibana > Dev Tools:

DELETE sincedb
{
  "error": {
    "root_cause": [
      {
        "type": "index_not_found_exception",
        "reason": "no such index",
        "resource.type": "index_or_alias",
        "resource.id": "sincedb",
        "index_uuid": "_na_",
        "index": "sincedb"
      }
    ],
    "type": "index_not_found_exception",
    "reason": "no such index",
    "resource.type": "index_or_alias",
    "resource.id": "sincedb",
    "index_uuid": "_na_",
    "index": "sincedb"
  },
  "status": 404
}

Sorry, I found info about it at https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html
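Based on that page, it looks like I could also pin sincedb_path in the file input for testing, so the state file is easy to find and delete (the path below is just an example):

        file {
                path => "/home/reversal/fakelogs/*.log"
                start_position => "beginning"
                # explicit state file location so it is easy to delete between tests;
                # sincedb_path => "/dev/null" would make Logstash forget read
                # positions on every restart instead
                sincedb_path => "/home/reversal/fakelogs.sincedb"
        }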

I stopped Logstash and removed the sincedb file after finding it at:

[root@rv03prd reversal]# find / -name .sincedb*
/var/lib/logstash/plugins/inputs/file/.sincedb_79a16c88523e2ae2caee8f60623d8fbc
[root@rv03prd reversal]# sudo rm -f /var/lib/logstash/plugins/inputs/file/.sincedb_79a16c88523e2ae2caee8f60623d8fbc 
[root@rv03prd reversal]#

I started Logstash again, but I still can't find any "logstash*" index.
