Index Creation Issues

While looking through my ES indexes, I see these:

DATANODE-03:~$ 
health status index                             uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   .kibana                           wSUtT63IQ2uPTncpYBh1EQ   1   0          9            2    745.6kb        745.6kb
green  open   4.06                              Re9rGbiJSJyNC9x_6JEtdw   1   0          1            0     25.7kb         25.7kb
green  open   loash-2017.04.10                  FSnsWCbkR7mwLwMYfYWF2w   1   0          0            0       159b           159b
green  open   log17.04.08                       -hoICxNpRJuys7Bh3pqkxQ   1   0          1            1     30.6kb         30.6kb
green  open   logst                             Zd7bWKJoTb-pWyvVSsplGA   1   0          1            0     17.7kb         17.7kb
green  open   logst017.04.07                    _y8R0XAJT0-6wStFBcpi4Q   1   0          1            0     17.7kb         17.7kb
green  open   logstas                           ofsnTBMvQvi5FyhOBCM26w   1   0          1            0     26.3kb         26.3kb
green  open   logstas.04.10                     uWgUpiJHS1yWxp57gIA31w   1   0          0            0       130b           130b
green  open   logstash-2.06                     vWsvjj5cQ8KCPOGpmQGQJw   1   0          0            0       159b           159b
green  open   logstash-2017.03.25               sHm3jRtxTAqTFTuzLdXN5A   1   0      12203            0     11.3mb         11.3mb
green  open   logstash-2017.03.26               pt9Nz9xdR9mWBB1G_t47BA   1   0      36361            0     35.6mb         35.6mb
green  open   logstash-2017.03.27               SJ5KqV3vSQuC1cqBTzap6A   1   0      49549            0       47mb           47mb
green  open   logstash-2017.03.29               KLUg9kJdTpe1sQaZY_2w6w   1   0   83954425            0    144.4gb        144.4gb
green  open   logstash-2017.03.30               wXCcuMt6QXK_Z-UvFg4MYA   1   0  210550983        18276    356.7gb        356.7gb
green  open   logstash-2017.03.31               6-ov_wk0RPGbYtlxyzJuXQ   1   0  182481129         2897    310.7gb        310.7gb
green  open   logstash-2017.04.01               Fa9H8KTJRwerBzlqHEVNFA   1   0  190716333         4489    316.6gb        316.6gb
green  open   logstash-2017.04.03               WliIfTrYRhuV_4q8D4d_Zw   1   0        116            0    346.1kb        346.1kb
green  open   logstash-2017.04.04               TsVn5QMuS9SdhZsvHtAdpw   1   0  182961755        23047    308.2gb        308.2gb
green  open   logstash-2017.04.05               ruGpnWfNTjCE6ZmaPQKwbg   1   0  176783533            0    285.8gb        285.8gb
green  open   logstash-2017.04.0505             wlJYlR1-TlGIKC0_rd3yRw   1   0          0            0       159b           159b
green  open   logstash-2017.04.06               JLlvsdlWT1m890R3II5Vfg   1   0  168572377          646    275.6gb        275.6gb
green  open   logstash-2017.04.06ash-2017.04.06 eH-vhqzMRKia-_BX92y5qQ   1   0          1            0     34.2kb         34.2kb
green  open   logstash-2017.04.07               ZR19vhuGQXG6nISe7Xk9cg   1   0  198701472            0    322.6gb        322.6gb
green  open   logstash-2017.04.08               rXNJ-fYyT9uAZQS84bTXKg   1   0  142461110          930    221.4gb        221.4gb
green  open   logstash-2017.04.09               v7SsmNuMRPudHgx_1Y2RmQ   1   0  126227980          682    193.3gb        193.3gb
green  open   logstash-2017.04.10               2Z80oAuPSPqN1NKXVNTwhA   1   0  122454452            0    194.7gb        194.7gb
green  open   logstash-2017.04.11               zUxJHKfpS6CKdxBVXmxwAw   1   0  101706619            0    164.3gb        164.3gb
green  open   logstash-2017.04.ash-2017.04.06   wwSn__TXQ9mUeZCz6wNBkQ   1   0          1            0       18kb           18kb
green  open   logstash-2017.gstash-2017.04.11   Y4hwrXG1TA2C4MJVeRwXgA   1   0          1            0     17.3kb         17.3kb
green  open   logstash-2stash-2017.04.06        8WywZj60TE62yfjyYf2VZQ   1   0          1            0     37.5kb         37.5kb
green  open   logstash0                         e-S6j3BYQ2Sug2aDqNqSiQ   1   0          1            0     29.6kb         29.6kb
green  open   logstash4                         IReArmGRTESGtn6scM5Cgg   1   0          1            0     33.9kb         33.9kb
green  open   logstasstash-2017.04.07           mg37CIZ8TLiEISKIDGK4HA   1   0          1            0       15kb           15kb
green  open   lostash-2017.04.07                xIhM9XzgQ-isVKSa1UmJNQ   1   0          0            0       159b           159b

Any ideas as to why the index names are getting hosed up? Is this an issue with Logstash or with Elasticsearch?

Logstash issues an index directive. It doesn't actually create indices itself; it tells Elasticsearch, "put this document in this index," and Elasticsearch auto-creates the index if it doesn't already exist.

Logstash writes to an index named logstash-YYYY.MM.dd, with the date taken from the @timestamp of the log entry being processed. Based on what I'm seeing here, something is most definitely messed up. The only way I can picture that happening is some very messed-up communication between Logstash and Elasticsearch. This is very abnormal. I've never even seen this happen.
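
To make that concrete, here is a simplified sketch of what the elasticsearch output effectively sends over the wire; the timestamp and message are invented, and the real request batches many events per _bulk call. The point is that the index name travels in the action line of each request, and Elasticsearch will auto-create whatever name arrives, mangled or not.

# Simplified sketch only; Logstash batches many events into each _bulk request.
curl -XPOST 'http://x.x.x.x:9200/_bulk' --data-binary '
{ "index" : { "_index" : "logstash-2017.04.10", "_type" : "logs" } }
{ "@timestamp" : "2017-04-10T03:15:00Z", "message" : "example log line" }
'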

Can you share your Logstash configuration? What version of Logstash & Elasticsearch?

The Logstash shippers are 5.2.2 and the ES cluster is 5.2.2.

I'll post the configs shortly.

I apologize, all the ELK servers are 5.3.0 now.

Here is my output:

##########
# ELASTICSEARCH Output Parameters
##########
output {
  elasticsearch {
    hosts => ["http://x.x.x.x:9200","http://x.x.x.x:9200","http://x.x.x.x:9200","http://x.x.x.x:9200","http://x.x.x.x:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
##################################################

Are there any Logstash logs covering the time period when the unusual index names were created?
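
If you're not sure where to look: with a standard 5.x package install the Logstash log usually lives under /var/log/logstash, so something like this (the path and search patterns are assumptions) might turn up output errors around those dates:

# Default log path for a Logstash 5.x package install; adjust for your setup.
grep -iE 'error|warn|retry' /var/log/logstash/logstash-plain.log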

I tried to find some.

I ended up creating an index pattern matching the name and opened it via Kibana; the documents have random dates, and the closest one was 4 days ago.

To date, I have over 6.2 billion logs in the cluster. Talk about a needle in a haystack.

Do you only have one Logstash instance feeding all of those nodes in your Elasticsearch cluster?

No sir.

I have 30 locations, each with a Logstash server piping logs back to my data center.

And the configuration is identical on each? The output block is identical to what you shared above?
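
(If it's useful, one quick sanity check is to checksum the pipeline configs on each shipper and compare; the hostnames and path below are placeholders.)

# Placeholder hostnames; the config path assumes a package install of Logstash.
for h in shipper01 shipper02 shipper30; do
  ssh "$h" 'md5sum /etc/logstash/conf.d/*.conf'
done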

I'm moving this to the Logstash forum, as I'm relatively certain it's a Logstash issue.

Yep, they are all templated. They all use the same basic configs.

Hmmm... One of them may be the culprit. Have you looked at the odd ones? Do they all contain logs from the same host? Hopefully they indicate which Logstash instance they came through; that would be a relatively easy way to tell. If each of the Logstash hosts is producing these, it will be harder to trace.
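
A hypothetical way to check (the mangled index name below is copied from your listing, and the host.keyword field assumes the default Logstash index template):

# Ask one of the mangled indices which hosts its documents came from.
curl -XGET 'http://x.x.x.x:9200/logstash-2017.04.06ash-2017.04.06/_search?pretty' -d '
{
  "size": 0,
  "aggs": {
    "source_hosts": { "terms": { "field": "host.keyword" } }
  }
}'

If a single host dominates the buckets, that points straight at one shipper.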
