Logstash - No mapping found for [@timestamp] in order to sort on

Hi,

I am using 6.5.4 version

Logstash keeps printing the below line almost every second.

Failed to query elasticsearch for previous event {:index=>"", :query=>"", :event=>#LogStash::Event:0x535f580d, :error=>"#<RuntimeError: Elasticsearch query error: [{"shard"=>0, "index"=>".logstash", "node"=>"Az0RWz6mSfuY9lvDctkagg", "reason"=>{"type"=>"query_shard_exception", "reason"=>"No mapping found for [@timestamp] in order to sort on", "index_uuid"=>"F2VMpJSBTs2yVLbvWs7juQ", "index"=>".logstash"}}]>"}

What does your query look like? What does the mapping of the index look like?

I didn't execute any query. These logs start popping up as soon as I start Logstash.

I'm not sure whether it's a background process or some query coming from Kibana.

You are saying you do not have an elasticsearch input? What does the configuration look like?

Hi @Badger,

Thanks for checking this.

Filebeat -> Logstash -> Elasticsearch -> Kibana

I have enabled X-Pack, centralized pipeline management, and other user-related settings.

input {
  elasticsearch {
    user => logstash_admin_user
    password => "t0p.s3cr3t"
    hosts => ["HOSTNAME:9200"]
  }
  beats {
    port => 5044
  }
}
filter {
  elasticsearch {
    user => logstash_admin_user
    password => "t0p.s3cr3t"
    hosts => ["HOSTNAME:9200"]
  }
  mutate {
    copy => {
      "[fields][log_prefix]" => "[@metadata][log_prefix]"
      "[fields][log_idx]" => "[@metadata][index]"
      "[fields][application]" => "[@metadata][application]"
    }
  }
}
output {
  elasticsearch {
    user => logstash_admin_user
    password => "t0p.s3cr3t"
    hosts => ["HOSTNAME:9200"]
  }
  stdout { codec => rubydebug }
}

Logstash

$ cat logstash.yml
path.data: /datavg/logstash/data
path.logs: /datavg/logstash/log
xpack.management.enabled: true
xpack.management.pipeline.id: ["main"]
xpack.management.elasticsearch.username: logstash_admin_user
xpack.management.elasticsearch.password: t0p.s3cr3t
xpack.management.elasticsearch.url: "http://HOSTNAME:9200"

That fetches all documents from logstash-* indexes and sorts them using @timestamp. One or more of those indexes does not have a @timestamp field. The error message says the index in question is "index"=>".logstash", which does not match logstash-*. Are you sure that is the configuration you are running?

Also it makes no sense that you have both an elasticsearch input and an elasticsearch filter that query everything.
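For context, the elasticsearch filter is what logs "Failed to query elasticsearch for previous event": it runs a query for every incoming event. It is normally used for enrichment, with an explicit query and a field mapping, along the lines of the sketch below (the `opid` and `started` field names are illustrative, not from this thread's configuration):

filter {
  elasticsearch {
    hosts => ["HOSTNAME:9200"]
    # look up the most recent matching "start" event for this operation id
    query => "type:start AND operation:%{[opid]}"
    # copy its @timestamp into a new field on the current event
    fields => { "@timestamp" => "started" }
  }
}

With no query configured, it still fires on every event, which is why the error repeats roughly once per second.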

Thanks @Badger for your quick response.

I am new to the Elastic stack. Logstash was not listening on port 5044 for the beats input.

Once I added the block below, it started to listen.

elasticsearch {
  user => logstash_admin_user
  password => "t0p.s3cr3t"
  hosts => ["HOSTNAME:9200"]
}

If it's not required, I'll remove it from the input and from the filter.

I have a custom index. I'm not using logstash-*.

If you just want to read in data from beats you can remove the elasticsearch input and filter.

You will still need to tell the elasticsearch output which index to write to. If you want it to set it using that metadata field you would use this option on the output

index => "%{[@metadata][index]}"
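Those [fields] values that the mutate/copy filter reads would typically be set on the Filebeat side. A minimal filebeat.yml sketch, assuming the field names match the filter above (the paths and values here are only examples):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # example path
    fields:
      log_prefix: myapp        # arrives in Logstash as [fields][log_prefix]
      log_idx: app-logs        # arrives as [fields][log_idx]
      application: myapp       # arrives as [fields][application]

output.logstash:
  hosts: ["LOGSTASH_HOST:5044"]

Without fields_under_root, Filebeat nests these under a top-level fields object, which is why the filter references them as [fields][...].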

Those errors have gone now. Thanks @Badger for your help on this.

Logstash is listening on port 5044.

Now my .conf file and pipeline configuration look like the below.

input {
  beats {
    port => 5044
  }
}
filter {
  mutate {
    copy => {
      "[fields][log_prefix]" => "[@metadata][log_prefix]"
      "[fields][log_idx]" => "[@metadata][index]"
      "[fields][application]" => "[@metadata][application]"
    }
  }
}
output {
  elasticsearch {
    user => logstash_admin_user
    password => "t0p.s3cr3t"
    hosts => ["HOSTNAME:9200"]
    manage_template => false
    index => "%{[@metadata][log_prefix]}-%{[@metadata][index]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Does this look fine? Please correct me if anything is not required.

Now it is fetching the logs and working fine.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.