Daily index not created

Hi,

I can see the indices below being created daily without issue.

yellow open   access_server-2021.07.06          HGHg9wiBTz-42qxzIAuV3A   1   1        203            0    341.5kb        341.5kb
yellow open   access_server-2021.07.07          _0yBXpnATgaV2azW2OydKA   1   1       3258            0      2.2mb          2.2mb
yellow open   access_server-2021.07.04          mn7p_pjURp-9I6iS6YwjSA   1   1        104            0    142.7kb        142.7kb
yellow open   access_server-2021.07.05          rAQ_mUD0SOupKO8_20L9qg   1   1        245            0    328.8kb        328.8kb
yellow open   access_server-2021.07.02          qJFHw1UISZmhv0td6L2dNQ   1   1        275            0    407.8kb        407.8kb
yellow open   access_server-2021.07.03          zezz9XYYR5GSIjC3xWeH2g   1   1        176            0      233kb          233kb
yellow open   access_server-2021.07.01          UyrfvdZzT4CVqt-PF98l3A   1   1       1675            0      1.3mb          1.3mb
yellow open   access_server-2021.06.30          mSNmyWKCQBCUMcQmrrMF3Q   1   1      81995            0     41.7mb         41.7mb

I am struggling to find out why no index was created on 8th July.

Below are the log files which are monitored by Filebeat and sent to Logstash.

-rw-r--r-- 1 access access 307K Jul  1 14:07 access_audit.2021-07-01.log
-rw-r--r-- 1 access access 7.0K Jul  2 11:19 access_audit.2021-07-02.log
-rw-r--r-- 1 access access 6.3K Jul  6 05:43 access_audit.2021-07-06.log

Filebeat configuration for the above files:

- type: log
  fields_under_root: true
  fields:
    log_type: access_server
    app_id: as
  multiline.pattern: ^[[:space:]]+(at|\.{3})\b|^Caused by:|^java|^...|^-
  multiline.negate: true
  multiline.match: after
  paths:
    - /opt/access-1.0.0/log/*
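
As an aside, the paths glob /opt/access-1.0.0/log/* will pick up every file in that directory, whatever its format. If only the audit logs are wanted, a narrower pattern could be used (a sketch; the exact filename scheme is assumed from the listing above):

```
paths:
  # match only the dated audit logs, not other log types in the directory
  - /opt/access-1.0.0/log/access_audit*.log
```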

Logstash configuration for the above files:

if [log_type] == "access_server" and [app_id] == "as" {
  grok {
    match => { "message" => "%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:%{MINUTE}(?::?%{SECOND})\| %{USERNAME:exchangeId}\| %{DATA:trackingId}\| %{NUMBER:RoundTrip:int}%{SPACE}ms\| %{NUMBER:ProxyRoundTrip:int}%{SPACE}ms\| %{NUMBER:UserInfoRoundTrip:int}%{SPACE}ms\| %{DATA:Resource}\| %{DATA:subject}\| %{DATA:authmech}\| %{DATA:scopes}\| %{IPV4:Client}\| %{WORD:method}\| %{DATA:Request_URI}\| %{INT:response_code}\| %{DATA:failedRuleType}\| %{DATA:failedRuleName}\| %{DATA:APP_Name}\| %{DATA:Resource_Name}\| %{DATA:Path_Prefix}" }
  }
}
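
As a side note, lines that the grok filter cannot parse (for example, entries from other log types collected by the same glob) get tagged with _grokparsefailure but still flow on to the output. A sketch of routing those aside, not part of the original pipeline:

```
filter {
  if [log_type] == "access_server" and [app_id] == "as" {
    if "_grokparsefailure" in [tags] {
      # unparsed lines: drop them, or send them to a separate index instead
      drop { }
    }
  }
}
```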

output {
  if [log_type] == "access_server" {
    elasticsearch {
      hosts    => ['http://ip:9200']
      index    => "%{type}-%{+YYYY.MM.dd}"
      user     => elastic
      password => x
    }
  }
  elasticsearch {
    hosts    => ['http://ip:9200']
    index    => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    user     => elastic
    password => x
  }
}

In the Elasticsearch logs I can see that the indices for the 5th and 6th were created successfully.

[2021-07-05T13:14:19,770][INFO ][o.e.c.m.MetaDataCreateIndexService] [elasticsearch_1] [access_ping_server-2021.07.05] creating index, cause [auto(bulk api)], templates [], shards [1]/[1], mappings []
[2021-07-05T13:14:19,851][INFO ][o.e.c.m.MetaDataMappingService] [elasticsearch_1] [access_ping_server-2021.07.05/rAQ_mUD0SOupKO8_20L9qg] create_mapping [_doc]

[2021-07-06T05:05:49,765][INFO ][o.e.c.m.MetaDataCreateIndexService] [elasticsearch_1] [access_server-2021.07.06] creating index, cause [auto(bulk api)], templates [], shards [1]/[1], mappings []
[2021-07-06T05:05:49,840][INFO ][o.e.c.m.MetaDataMappingService] [elasticsearch_1] [access_server-2021.07.06/HGHg9wiBTz-42qxzIAuV3A] create_mapping [_doc]

In the elastic-2021-07-07-1.log.gz file, I can see it created the index for 7th July.

[2021-07-07T04:12:52,125][INFO ][o.e.c.m.MetaDataCreateIndexService] [elasticsearch_1] [access_server-2021.07.07] creating index, cause [auto(bulk api)], templates [], shards [1]/[1], mappings []
[2021-07-07T04:12:52,211][INFO ][o.e.c.m.MetaDataMappingService] [elasticsearch_1] [access_server-2021.07.07/_0yBXpnATgaV2azW2OydKA] create_mapping [_doc]

But I also see an error about the 7th July index in elastic.log:

[2021-07-08T08:06:25,401][DEBUG][o.e.a.s.TransportSearchAction] [elasticsearch_1] [access_server-2021.07.07][0], node[oygWDlpVTUJN6lHTYnw], [P], s[STARTED], a[id=MhuISicnTyW-_VYInDog]: Failed to execute [SearchRequest{searchType=QUERY_THEN_FETCH, indices=[access_server-2021.07.07], indicesOptions=IndicesOptions[ignore_unavailable=false, allow_no_indices=true, expand_wildcards_open=true, expand_wildcards_closed=false, allow_aliases_to_multiple_indices=true, forbid_closed_indices=true, ignore_aliases=false, ignore_throttled=true], types=[], routing='null', preference='null', requestCache=null, scroll=null, maxConcurrentShardRequests=0, batchedReduceSize=512, preFilterShardSize=128, allowPartialSearchResults=true, localClusterAlias=null, getOrCreateAbsoluteStartMillis=-1, ccsMinimizeRoundtrips=true, source={"size":1,"query":{"match_all":{"boost":1.0}},"sort":[{"timestamp":{"order":"desc"}}]}}]
org.elasticsearch.transport.RemoteTransportException: [elasticsearch_1][ip:9300][indices:data/read/search[phase/query]]
Caused by: org.elasticsearch.index.query.QueryShardException: No mapping found for [timestamp] in order to sort on
        at org.elasticsearch.search.sort.FieldSortBuilder.build(FieldSortBuilder.java:387) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.sort.SortBuilder.buildSort(SortBuilder.java:153) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:755) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:586) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:545) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:348) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.lambda$executeQueryPhase$1(SearchService.java:340) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.action.ActionListener.lambda$map$2(ActionListener.java:145) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:62) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.lambda$rewriteShardRequest$7(SearchService.java:1043) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.action.ActionRunnable$1.doRun(ActionRunnable.java:45) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:44) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:773) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.4.0.jar:7.4.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
        at java.lang.Thread.run(Thread.java:830) [?:?]
[2021-07-08T08:06:25,405][DEBUG][o.e.a.s.TransportSearchAction] [elasticsearch_1] All shards failed for phase: [query]
org.elasticsearch.index.query.QueryShardException: No mapping found for [timestamp] in order to sort on
        at org.elasticsearch.search.sort.FieldSortBuilder.build(FieldSortBuilder.java:387) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.sort.SortBuilder.buildSort(SortBuilder.java:153) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.parseSource(SearchService.java:755) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.createContext(SearchService.java:586) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:545) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:348) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.lambda$executeQueryPhase$1(SearchService.java:340) ~[elasticsearch-7.4.0.jar:7.4.0]
         at org.elasticsearch.action.ActionListener.lambda$map$2(ActionListener.java:145) ~[elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:62) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.search.SearchService.lambda$rewriteShardRequest$7(SearchService.java:1043) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.action.ActionRunnable$1.doRun(ActionRunnable.java:45) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:44) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:773) [elasticsearch-7.4.0.jar:7.4.0]
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.4.0.jar:7.4.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
        at java.lang.Thread.run(Thread.java:830) [?:?]

But there are no log entries about the 8th July index explaining why it was not created.

What could be the reason, and where can I find logs for it?

The error has nothing to do with index creation. Something has run a query and asked for the results to be sorted on the timestamp field:

source={"size":1,"query":{"match_all":{"boost":1.0}},"sort":[{"timestamp":{"order":"desc"}}]}}]

but there is no timestamp field.

Caused by: org.elasticsearch.index.query.QueryShardException: No mapping found for [timestamp] in order to sort on

The log files you show end on the 6th. That should not result in the index for the 7th being created. Is it possible no data for the 8th has been indexed? The index will not be created unless it is needed.
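
One way to check (a sketch, assuming access to the cluster's REST API) is to list any index that holds data for the 8th, whatever its name:

```
GET _cat/indices/*2021.07.08*?v
```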

@Badger, thanks for your quick reply. Much appreciated.

Understood about the timestamp field error.

I also take your point that a file ending on one day should not cause the next day's index to be created unless new data arrives and matches.

There are different types of files being generated in that path, not just the one type shown above.

There is one file modified on 7th July, which explains why the index for the 7th was created.

-rw-r--r--  1 access access 605K Jul  7 10:48 access_audit.log

For the 8th,

there is one file, but its entries start on 12th Feb:

-rw-r--r--. 1 access access 5.5M Jul  8 18:46 access.log

head -3 access.log

2021-02-12T07:29:40,660  INFO [] com.core.util.jos.HostKeyLoader - Generated new host key: {"kty":"oct","kid":"YJVgOWC1OfQB","key_ops":["encrypt","decrypt"],"alg":"A1CC-HS6"}
2021-02-12T07:29:40,671  INFO [] com.core.util.jos.HostKeyLoader - Created host key file (unencrypted): /usr/share/access-1.1.0/conf/pa.jwk
2021-02-12T07:29:40,683  INFO [] com.cli.RouterCLI - Java Version: 1.8.0_221-b11 (Java HotSpot(TM) 64-Bit Server VM)

and the last timestamp is on 8th July:

tail -3 access.log

2021-07-08T18:46:39,486 ERROR [] com.core.transport.http.async.frontend.FrontEndConnectionHandler - Additional exception caught after already in an error state
io.netty.handler.codec.DecoderException: javax.net.ssl.SSLException: Received fatal alert: protocol_version
    at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:472) ~[netty-codec-4.1.30.Final.jar:4.1.30.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:405) ~[netty-codec-4.1.30.Final.jar:4.1.30.Final]

I think the index for the 8th was not created because, although there are log entries for 8th July in an existing file (which starts in Feb), they did not match the Logstash pipeline above. Please correct me if I am wrong.

Thanks,

Is it possible you have events that have log_type set but not type, so that you have an index called %{type}-2021.07.08?

Hi @Badger,

Thanks for your reply.

Sorry, I did not quite follow what you are saying. Can you please rephrase it?

@Badger - "Is it possible you have events that have log_type set" -

I don't have any files named log_type (i.e. access_server), or any message lines containing access_server.

I can see the following type and log_type in filebeat.yml:

type: log

log_type:  access_server

In Logstash, it checks whether log_type is access_server to apply that grok block, and in the output section it again checks log_type and creates the index as access_server-2021-07...

If the value of the [log_type] field is "access_server" then the index name is determined by referencing the value of the [type] field. If that field does not exist then you will not get the index name you expect.
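
One defensive option (a sketch, not the original pipeline) is to use %{type} in the index name only when that field actually exists, and fall back to a fixed name otherwise:

```
output {
  if [log_type] == "access_server" {
    if [type] {
      elasticsearch {
        hosts    => ['http://ip:9200']
        index    => "%{type}-%{+YYYY.MM.dd}"
        user     => elastic
        password => x
      }
    } else {
      # [type] is missing: avoid creating a literal %{type}-... index
      elasticsearch {
        hosts    => ['http://ip:9200']
        index    => "access_server-%{+YYYY.MM.dd}"
        user     => elastic
        password => x
      }
    }
  }
}
```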

Hi @Badger,
As shown in the filebeat.yml above, I have both type and log_type defined, and due to the config below in logstash.conf, type is getting replaced with access_server.

mutate {
  replace => {
    "[type]" => "access_server"
  }
}

and the index is getting created as access_server-2021...

 index => "%{type}-%{+YYYY.MM.dd}"

You were right. I can see the index below was created, but I am still not sure why, and why its name is missing, i.e. why it is not access_server-2021...

yellow open   %{type}-2021.07.08   epj_-5ruQcSrz3uA   1   1    284   0    336.2kb   336.2kb
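
As an aside, an index with a literal %{type} in its name can be deleted, but the special characters must be URL-encoded (% becomes %25, { becomes %7B, } becomes %7D); a sketch:

```
DELETE /%25%7Btype%7D-2021.07.08
```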

I see today's (9th July) index was created after running some test data.

yellow open   access_server-2021.07.09     bPV6GXW3da7A   1   1     359    0    763.3kb    763.3kb