Filebeat with Logstash and Elasticsearch failed to create Index in ES

Hello everybody,

I have installed ELK and Filebeat on my local machine. ELK works fine with the file input and filters, but Filebeat is not. The Filebeat configuration seems fine and it sends data to Logstash, but Logstash throws the exception "index_not_found_exception => no such index".

Here is the Logstash exception log:

[2017-01-19T11:53:20,213][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>404, :action=>["index", {:_id=>nil, :_index=>"kudofilebeat-2017.01.13", :_type=>"KudoAppLog", :_routing=>nil}, 2017-01-13T06:55:22.821Z %{host} Message=Starting KuDo Application...], :response=>{"index"=>{"_index"=>"kudofilebeat-2017.01.13", "_type"=>"KudoAppLog", "_id"=>nil, "status"=>404, "error"=>{"type"=>"index_not_found_exception", "reason"=>"no such index", "resource.type"=>"index_expression", "resource.id"=>"kudofilebeat-2017.01.13", "index_uuid"=>"_na_", "index"=>"kudofilebeat-2017.01.13"}}}}

Here is my filebeat.yml file:

filebeat.prospectors:
- input_type: log
  paths:
    - D:\logs\tmp\*.log
#================================ General =====================================
name: "kudobeat"
fields:
  env: local
#----------------------------- Logstash output --------------------------------
output.logstash:
  hosts: ["localhost:5044"]

Here is my Logstash config:

input {
  beats {
    port => 5044
    type => "KudoAppLog"
    codec => multiline {
      pattern => "^%{WORD};"
      negate => true
      what => "previous"
    }
  }
}
filter {
  mutate {
    # Both substitutions must go into a single gsub array;
    # a second gsub option would override the first one.
    gsub => [
      "message", "\n", " ",
      "message", "\t", " "
    ]
  }
  grok {
    match => ["message", "(?m)%{WORD:LOGLEVEL}\;%{WORD:Machine}\;%{GREEDYDATA:Logtimestamp}\;%{WORD}\=;%{WORD}\=;%{WORD}\=;%{WORD}\;%{WORD}\=;%{GREEDYDATA:message}"]
    overwrite => [ "message" ]
    add_field => { "ApplicationName" => "Kudo" }
  }
  # Set the event timestamp from the log
  date {
    match => ["Logtimestamp", "dd.MM.yyyy HH:mm:ss,SSS"]
    remove_field => ["Logtimestamp"]
  }
}
output {
  if [type] == "KudoAppLog" {
    elasticsearch {
      hosts => "localhost:9200"
      manage_template => false
      index => "kudofilebeat-%{+YYYY.MM.dd}"
    }
  }
  stdout { codec => rubydebug }
}

This is what a log line that Filebeat sends to Logstash looks like:
Information;OLESRV741;11.01.2017 09:19:19,546;Area=;SubArea=;SessionId=;StepId;User=;Message=Starting KuDo Application...
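As a sanity check for the date filter pattern above: the Joda-style format dd.MM.yyyy HH:mm:ss,SSS does match the timestamp in this sample line. A small Python sketch of the equivalent parse (variable names are made up for illustration):

```python
from datetime import datetime

# Timestamp field as it would be extracted from the sample log line above
logtimestamp = "11.01.2017 09:19:19,546"

# Python equivalent of the Joda pattern "dd.MM.yyyy HH:mm:ss,SSS";
# %f parses the digits after the comma as fractional seconds
parsed = datetime.strptime(logtimestamp, "%d.%m.%Y %H:%M:%S,%f")

print(parsed.isoformat())  # 2017-01-11T09:19:19.546000
```

So if the date filter fails, the cause is more likely the field extraction in grok than the format string itself.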

I am using ELK 5.1.2 and Filebeat 5.1.

Can anyone tell me why it doesn't create the index in Elasticsearch? Under the following URL I find all indexes except "kudofilebeat-*":

http://localhost:9200/_cat/indices?v

Thanks in advance
regards

Not sure yet why the index is not created automatically. Could you try out whether the default config works as expected?

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

That way we can figure out whether it is related to any other part of the config.

Hi @ruflin

Thanks for your feedback, but it did not work. I even tried to use ES directly from Filebeat, but ES can't create the index. Something is wrong with ES; I think Filebeat and Logstash are working fine. I see that Filebeat is sending data to Logstash, which tries to send it on to ES, but ES cannot create the index.

cluster.name: dev
node.name: My_Node
node.attr.rack: r1
path.data: D:/elasticsearch.store/data
path.logs: D:/elasticsearch.store/logs

These are the only active settings in elasticsearch.yml. I have also installed X-Pack, but I have disabled that section to localize the issue.

Is there anything else missing in my ES setup?

This is what I am using in filebeat.yml now, instead of the Logstash output:

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - D:\logs\tmp\*.log 
#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  index: "filebeat-%{+yyyy.MM.dd}"

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
logging:
  level: debug  # or info if you want less verbose output
  to_files: true
  files:
    path: C:\ELK\filebeat\logs
    rotateeverybytes: 10485760 # = 10MB

Anything broken here?

Thanks in advance

Hi @ruflin

I found the problem now. It was X-Pack that caused it. I had the following X-Pack settings, which I have now deactivated:

# ---------------------------------- X-Pack Configuration -----------------------------------
#action.auto_create_index: .security,.monitoring*,.watches,.triggered_watches,.watcher-history*
#xpack.watcher.index.rest.direct_access : true
xpack.security.enabled: false

#xpack.notification.email.account:
#    exchange_account:
#        profile: standard
#        email_defaults:
#            from: xxx@xxx.org
#        smtp:
#            host: xxx-xxx.xxx.org
#            port: 25            

I am not sure which line was the problem, but I think it was the first one, with action.auto_create_index.
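If that first line is indeed the culprit, an alternative to commenting it out would be to extend the pattern list so that the Filebeat indices may still be auto-created while keeping the X-Pack system indices restricted. A sketch for elasticsearch.yml, assuming the index names used earlier in this thread:

```yaml
# Allow the X-Pack system indices plus the Beats indices to be auto-created
action.auto_create_index: .security,.monitoring*,.watches,.triggered_watches,.watcher-history*,filebeat-*,kudofilebeat-*
```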

Is there anything I need to configure in X-Pack before using Filebeat?

thanks


Yeah, you have to make sure that the user used for indexing has index creation rights. See https://www.elastic.co/guide/en/x-pack/current/securing-aliases.html
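For example, with X-Pack security enabled you could define a role for the indexing user that grants the create_index and index privileges on the Beats indices. A sketch against the 5.x security role API; the role name and index patterns here are assumptions based on this thread:

```
POST /_xpack/security/role/filebeat_writer
{
  "indices": [
    {
      "names": [ "filebeat-*", "kudofilebeat-*" ],
      "privileges": [ "create_index", "index" ]
    }
  ]
}
```

Then assign that role to the user that Filebeat or Logstash connects with, instead of disabling security entirely.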

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.