I get the error "Could not locate that index-pattern-field" and then the dashboard does not load

(Korhan Herguner) #1

Hi everyone,

I am working with a new setup. I want to use the Nginx module with Filebeat. I configured Logstash for the Nginx module and then started the Filebeat service. I installed the geoip and user-agent plugins for Elasticsearch. The Nginx module is active, but the dashboard does not load and I get an error like "Could not locate that index-pattern-field (id: traefik.access.remote.ip)". Meanwhile, picture 2 shows that Filebeat loaded 110 fields. Why are fewer fields being loaded?

I have uploaded 3 pictures showing my problem.

(Steffen Siering) #2

Can you share your configs and logs (please use the </> button to format them)?

Have you checked the Filebeat/Logstash logs?

When starting out with Filebeat modules, it is better not to use Logstash.
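For reference, here is a minimal sketch of that simpler setup: Filebeat ships directly to Elasticsearch so the Nginx module's ingest pipeline does the parsing, and `filebeat setup` loads the index pattern and dashboards. The hosts below are placeholders; adjust them for your cluster.

```yaml
# filebeat.yml - minimal sketch (hosts are placeholders)
filebeat.config.modules:
  # Load the module configs (e.g. nginx.yml) from modules.d
  path: ${path.config}/modules.d/*.yml

# Send events straight to Elasticsearch; the module's ingest
# pipeline then does the grok/geoip/user-agent parsing.
output.elasticsearch:
  hosts: ["localhost:9200"]

# Needed so "filebeat setup" can load the dashboards into Kibana
setup.kibana:
  host: "localhost:5601"
```

After that, running `filebeat setup -e` once loads the index template, the `filebeat-*` index pattern, and the module dashboards into Kibana.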

(Korhan Herguner) #3

The reason I am using Logstash with Filebeat is that I want to parse the Nginx logs, so I use a Logstash filter. Below is the filter for Nginx:

filter {
  if [fileset][module] == "nginx" {
    if [fileset][name] == "access" {
      grok {
        match => { "message" => ["%{IPORHOST:[nginx][access][remote_ip]} - %{DATA:[nginx][access][user_name]} \[%{HTTPDATE:[nginx][access][time]}\] \"%{WORD:[nginx][access][method]} %{DATA:[nginx][access][url]} HTTP/%{NUMBER:[nginx][access][http_version]}\" %{NUMBER:[nginx][access][response_code]} %{NUMBER:[nginx][access][body_sent][bytes]} \"%{DATA:[nginx][access][referrer]}\" \"%{DATA:[nginx][access][agent]}\""] }
        remove_field => "message"
      }
      mutate {
        add_field => { "read_timestamp" => "%{@timestamp}" }
      }
      date {
        match => [ "[nginx][access][time]", "dd/MMM/YYYY:H:m:s Z" ]
        remove_field => "[nginx][access][time]"
      }
      useragent {
        source => "[nginx][access][agent]"
        target => "[nginx][access][user_agent]"
        remove_field => "[nginx][access][agent]"
      }
      geoip {
        source => "[nginx][access][remote_ip]"
        target => "[nginx][access][geoip]"
      }
    }
    else if [fileset][name] == "error" {
      grok {
        match => { "message" => ["%{DATA:[nginx][error][time]} \[%{DATA:[nginx][error][level]}\] %{NUMBER:[nginx][error][pid]}#%{NUMBER:[nginx][error][tid]}: (\*%{NUMBER:[nginx][error][connection_id]} )?%{GREEDYDATA:[nginx][error][message]}"] }
        remove_field => "message"
      }
      mutate {
        rename => { "@timestamp" => "read_timestamp" }
      }
      date {
        match => [ "[nginx][error][time]", "YYYY/MM/dd H:m:s" ]
        remove_field => "[nginx][error][time]"
      }
    }
  }
}
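Note that if the events parsed by a filter like this are not written to an index matching the `filebeat-*` index pattern, the module dashboards cannot find their fields. A sketch of the corresponding Logstash output section, following the usual Beats-to-Logstash convention (the host is a placeholder):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Filebeat already loaded its own index template
    manage_template => false
    # Keep the index name compatible with the filebeat-* index pattern
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
```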

filebeat.yml

#================================ Outputs =====================================
output.logstash:
  # The Logstash hosts
  hosts: ["172.28.14.238:5044"]

nginx.yml

- module: nginx
  # Access logs
  access:
    enabled:
    var.paths: ["C:\nginx\logs\access.log"]
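One detail worth checking in that module config: in a double-quoted YAML string, backslashes are escape characters, so "C:\nginx\logs\access.log" may not parse as the intended Windows path. Forward slashes (or a single-quoted string) avoid the problem. A sketch of the corrected fragment, assuming the module should be enabled:

```yaml
- module: nginx
  # Access logs
  access:
    enabled: true
    # Forward slashes work on Windows and need no escaping in YAML
    var.paths: ["C:/nginx/logs/access.log"]
```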
(system) closed #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.