Logstash not creating index daily based on date

Hi guys,

I have set up an ELK stack platform for work and I am facing an issue with some indices: they are not rolling over daily based on the date (like logstash-YYYY.MM.DD).

The pipeline config is the following:

```
input {
  udp {
    host => "192.168.157.110"
    port => 10514
    codec => "json"
    type => "rsyslog"
  }
}

filter { ... }

output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
      index => "logstash-%{+YYYY.MM.dd}"
      user => "logstash_internal"
      password => "mypassword"
    }
    stdout { codec => rubydebug }
  }
}
```

I am not seeing any error messages in the logs, and they are set to debug.
The only strange message in the logs is the following (I am not sure if it is related):
[WARN ][o.e.d.s.a.b.h.DateHistogramAggregationBuilder] [smartmsops-elk] [interval] on [date_histogram] is deprecated, use [fixed_interval] or [calendar_interval] in the future.

Thank you,
Mihai


What do you get when you query for all indices in Elasticsearch that match logstash-*?

GET /_cat/indices/logstash-*

Hi Andreas,

Thank you for replying!

Please see the below output:

```
green open logstash-2019.07.16-000001 YXDNZsx4RaunwXv2351LXA 1 0 23631345 0 2.3gb 2.3gb
```

Hi @svm89,
Are you using a Linux environment?

Hi, yes, I am using CentOS 7.

Hi @svm89,

The above works on Windows but not on Linux; I hope this works:

```
input {
  jdbc {
    jdbc_driver_library => "/Users/logstash/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/database_name"
    jdbc_user => "root"
    jdbc_password => "password"
    schedule => "* * * * *"
    statement => "select * from table1"
    type => "table1"
  }
}

output {
  if [type] == "table1" {
    elasticsearch {
      index => "abc"             # index names must be lowercase
      hosts => "localhost:9200"
    }
  }
}
```

Hi @Sammeta_David_Raju,

I am using an rsyslog stream as the input for one pipeline and a local file for another.

I think my issue may be related to the output section, which has changed a bit since my first post and which I forgot to update here:

```
output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
      manage_template => true
      index => "logstash-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "mypassword"
    }
    stdout { codec => rubydebug }
  }
}
```

This was the configuration from the security implementation guide. I used the elastic user because the only security feature I am interested in is defining read-only users to give to the development/testing/etc. teams.
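
For reference, the kind of read-only access I have in mind would be a role and user created via the Elasticsearch security API, roughly like the sketch below; the names logstash_reader and dev_reader and the password are just placeholders:

```
POST /_security/role/logstash_reader
{
  "indices": [
    {
      "names": [ "logstash-*" ],
      "privileges": [ "read", "view_index_metadata" ]
    }
  ]
}

POST /_security/user/dev_reader
{
  "password": "changeme",
  "roles": [ "logstash_reader" ],
  "full_name": "Read-only user for dev/test teams"
}
```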

Thanks.

Hi all,

I found the solution:

index => "logstash-%{+YYYY.MM.dd}" ---> index => "logstash-%{+yyyy.MM.dd}" (the year in the date pattern should be lowercase).
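
For completeness, the output section now looks like this; the only change from the config above is the lowercase yyyy in the index pattern:

```
output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
      manage_template => true
      index => "logstash-%{+yyyy.MM.dd}"
      user => "elastic"
      password => "mypassword"
    }
    stdout { codec => rubydebug }
  }
}
```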

All the best!


Oh for Christ's sake... I've been tearing my hair out over this for several days now. Turns out the documentation is wrong.

Can someone at Elastic fix this? https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-index

Edit: never mind, I opened an issue on Github. https://github.com/logstash-plugins/logstash-output-elasticsearch/issues/876

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.