Endless stream of logstash "No mapping found for [@timestamp] in order to sort on" after X-Pack install


(David Lam) #1

I'm getting an endless stream of these "No mapping found for [@timestamp] in order to sort on" warnings in my logstash log. Anyone know how to stop them?

Here's what I was doing:

  • I needed more memory for my ELK instance... so I rebuilt my environment today

  • I started reloading all my documents via logstash

  • Since that was going to take a long time, I tried installing X-Pack to try out its kibana CSV export

  • After restarting elasticsearch / logstash... I started getting an endless stream of these messages
    in my logstash & elasticsearch logs

  • They appear to be related to X-Pack's watcher or something?

  • Removing X-Pack doesn't appear to stop the endless stream of error logging =(

Any ideas would help!

  • /var/log/logstash/logstash-plain.log

    [2018-02-02T05:57:23,322][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"", :query=>"", :event=>#LogStash::Event:0x5fdd15a9, :error=>#<RuntimeError: Elasticsearch query error: [{"shard"=>0, "index"=>".kibana", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"dmpKlxqhQtKbPUWqZ4yYUw", "index"=>".kibana"}}, {"shard"=>0, "index"=>".monitoring-alerts-6", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"hjp4CBo-QsSrJuPcuBqbtw", "index"=>".monitoring-alerts-6"}}, {"shard"=>0, "index"=>".monitoring-es-6-2018.02.02", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"5wWh1wtFQj-MAMRmwrcZYw", "index"=>".monitoring-es-6-2018.02.02"}}, {"shard"=>0, "index"=>".monitoring-logstash-6-2018.02.02", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"fv3VQwhBT4OnioyhCUAMUw", "index"=>".monitoring-logstash-6-2018.02.02"}}, {"shard"=>0, "index"=>".security-6", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"lzLSNTjnTj2pvTZsUPaePA", "index"=>".security-6"}}, {"shard"=>0, "index"=>".triggered_watches", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"QcqCVplpTPqFGchuihEUag", "index"=>".triggered_watches"}}, {"shard"=>0, "index"=>".watcher-history-7-2018.02.02", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] 
in order to sort on", "index_uuid"=>"afyUBrqrQjOgO4BYvCW3wg", "index"=>".watcher-history-7-2018.02.02"}}, {"shard"=>0, "index"=>".watches", "node"=>"xR8x_Q2_RdWMUnWFAutt-Q", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"X1f3bvB0T9egtTM5VnqiuQ", "index"=>".watches"}}]>}

  • elasticsearch log

    [2018-02-02T05:57:15,181][DEBUG][o.e.a.s.TransportSearchAction] [xR8x_Q2] [.monitoring-logstash-6-2018.02.02][0], node[xR8x_Q2_RdWMUnWFAutt-Q], [P], s[STARTED], a[id=4Xdp3yqHSxumsC0v_O
    EpJA]: Failed to execute [SearchRequest{searchType=QUERY_THEN_FETCH, indices=[.kibana, .monitoring-alerts-6, .monitoring-es-6-2018.02.02, .monitoring-logstash-6-2018.02.02, .security,
    .security-6, .triggered_watches, .watcher-history-7-2018.02.02, .watches, newsletters_mandrill], indicesOptions=IndicesOptions[id=38, ignore_unavailable=false, allow_no_indices=true, e
    xpand_wildcards_open=true, expand_wildcards_closed=false, allow_aliases_to_multiple_indices=true, forbid_closed_indices=true, ignore_aliases=false], types=[], routing='null', preferenc
    e='null', requestCache=null, scroll=null, maxConcurrentShardRequests=5, batchedReduceSize=512, preFilterShardSize=128, source={"size":1,"query":{"query_string":{"query":"","fields":[],
    "type":"best_fields","default_operator":"or","max_determinized_states":10000,"enable_position_increments":true,"fuzziness":"AUTO","fuzzy_prefix_length":0,"fuzzy_max_expansions":50,"phr
    ase_slop":0,"analyze_wildcard":false,"escape":false,"auto_generate_synonyms_phrase_query":true,"fuzzy_transpositions":true,"boost":1.0}},"sort":[{"@timestamp":{"order":"desc"}}]}}] las
    tShard [true]
    org.elasticsearch.transport.RemoteTransportException: [xR8x_Q2][172.31.26.248:9300][indices:data/read/search[phase/query]]
    Caused by: org.elasticsearch.index.query.QueryShardException: No mapping found for [@timestamp] in order to sort on
    at org.elasticsearch.search.sort.FieldSortBuilder.build(FieldSortBuilder.java:319) ~[elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.sort.SortBuilder.buildSort(SortBuilder.java:155) ~[elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:718) ~[elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.SearchService.createContext(SearchService.java:552) ~[elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:528) ~[elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:324) ~[elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.SearchService$2.onResponse(SearchService.java:310) [elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.SearchService$2.onResponse(SearchService.java:306) [elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.search.SearchService$3.doRun(SearchService.java:996) [elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:637) [elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:41) [elasticsearch-6.1.3.jar:6.1.3]
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.1.3.jar:6.1.3]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]


(Magnus Bäck) #2

What does your elasticsearch filter configuration look like? It sounds like you haven't overridden the filter's sort option even though your index doesn't have a @timestamp field.
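For reference, the logstash-filter-elasticsearch plugin sorts on @timestamp descending by default, and both the sort and index options can be overridden. A hypothetical sketch (the index and field names here are placeholders, not from your setup):

filter {
    elasticsearch {
        # restrict the lookup to a specific index; the default ("")
        # searches every index, including X-Pack's internal ones
        index => "my-index"

        # the default sort is "@timestamp:desc"; override it if the
        # index you query has no @timestamp field
        sort => "some_date_field:desc"
    }
}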


(David Lam) #3

hmm here's my logstash conf files (I think that's what you mean by "elasticsearch filter configuration"?), but from the error messages it seems related to X-Pack monitoring/watching or something:

input {
    elasticsearch {
        user => elastic
        password => somepassword
    }
    
    file {
        path => "/log_data_folder/*.csv"
        sincedb_path => "/sincedb_logstash_paths/cool_stuff"
        # sincedb_path => "/dev/null"
        start_position => "beginning"
        type => "foobarbaz"
    }
}

filter {
    elasticsearch {
        user => elastic
        password => somepassword
    }

    if "foobarbaz" in [type] {
        csv {
            columns => ["Date","Email","Sender","Subject","Status","Tags","Subaccount","Opens","Clicks","BounceDetail","Site","Region","Age","Gender","SignupFlowID","OnboardingQuerystring","JoinDate", "MetroArea", "EmailFrequency","DayOfWeek","HourOfDay","DaysSinceJoined","UnsubTest","CoolStuffID","Offer1","Offer2","Offer3","SponsorshipID"]
            convert => {
                "Age" => "integer"
                "Clicks" => "integer"
                "Date" => "date"
                "DaysSinceJoined" => "integer"
                "HourOfDay" => "integer"
                "Offer1" => "integer"
                "Offer2" => "integer"
                "Offer3" => "integer"
                "Opens" => "integer"
            }
            remove_field => ["BounceDetail", "Sender", "Status", "Subaccount"]
            separator => ","
        }

        if ([Date] == "Date") {
            drop {}
        }

        date {
            match => ["Date", "YYYY-MM-dd HH:mm:ss"]
            target => "@timestamp"
        }

        date {
            match => ["JoinDate", "YYYY-MM-dd HH:mm:ss"]
            target => "JoinDate"
        }

        mutate {
            split => { "Tags" => ";" }
        }
    }
}

output {
      elasticsearch {
          hosts => "http://localhost:9200"
          index => "cool_stuff"
          user => elastic
          password => somepassword
      }

#   stdout { codec => rubydebug }
    
}

another file in /etc/logstash/conf.d/

input {
    elasticsearch {
        user => elastic
        password => somepassword
    }

    file {
        codec => json
        path => "/log_data_folder/unsubscribe_log_folder/unsubscribe.log*"
        sincedb_path => "/sincedb_logstash_paths/unsubscribe"
        # sincedb_path => "/dev/null"
        start_position => "beginning"
        type => "unsubscribe"
    }
}

filter {
    elasticsearch {
        user => elastic
        password => somepassword
    }

    mutate {
        rename => { "asctime" => "Date" }
        # 'message' has special meaning in logstash, rename it
        rename => { "message" => "logmsg" }
        remove_field => ["levelname", "logmsg"]
    }

    date {
        match => ["Date", "YYYY-MM-dd HH:mm:ss"]
        target => "@timestamp"
    }

    date {
        match => ["Date", "YYYY-MM-dd HH:mm:ss"]
        target => "Date"
    }

    date {
        match => ["JoinDate", "YYYY-MM-dd HH:mm:ss"]
        target => "JoinDate"
    }

    json {
        source => "message"
    }
}


output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "cool_stuff"
        user => elastic
        password => somepassword
    }
}

(Magnus Bäck) #4

Since your elasticsearch filter doesn't specify the index option, you're searching across all indexes, which probably isn't what you want. It's also the actual cause of the errors: not all indexes have a @timestamp field to sort on, and @timestamp is the filter's default sort field.
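A minimal sketch of the fix, assuming the lookups should only hit your own index (adjust the index name to whatever your filter is actually meant to query):

filter {
    elasticsearch {
        user => elastic
        password => somepassword
        # restrict the lookup to the application index so the query
        # never touches X-Pack's internal indexes (.kibana, .watches,
        # .monitoring-*, .security-6, ...), which lack @timestamp
        index => "cool_stuff"
    }
}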


(David Lam) #5

hey that worked!!!!! wooooo

that explanation totally makes sense, as I added a whole bunch of elasticsearch { } filter blocks so I could pass the username/password after turning on the X-Pack authentication stuff


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.