I'm trying to fix/upgrade my Elasticsearch cluster and I think I'm nearly there, but I have a few big problems, so hopefully someone will be able to assist. I'm not familiar with Elasticsearch or Logstash, but I'm working my way around them, so please keep any responses very simple.
I'm using Beats 7.2, Logstash 7.2 and Elasticsearch 6.8.1 (I need to upgrade, but that's another issue which I'll post about later). I can pick up files from a specific location and write them to the console from Logstash, and all looks good. When I redirect them to Elasticsearch, I get the following error:
[2019-07-26T00:16:05,404][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.04", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x32cd5f48>], :response=>{"index"=>{"_index"=>"filebeat-2019.04", "_type"=>"doc", "_id"=>"RpRrK2wBD4NQB0I_DH4o", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [filebeat-2019.04] as the final mapping would have more than 1 type: [log, doc]"}}}}
From the documentation I understand that mapping types are being removed, but as far as I can tell from the guides and blogs I've followed, I'm not referencing a type anywhere in my Beats or Logstash configuration, nor do any of the examples. The indices I'm trying to write to are brand new (I've deleted them), so I think Logstash must be recreating them somehow.
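In case it helps with diagnosis, this is how I've been inspecting the index after a run, to see what type it actually ends up with (assuming Elasticsearch is reachable on localhost:9200 — adjust the host to match your cluster):

```shell
# List the filebeat indices that currently exist
curl -s "http://localhost:9200/_cat/indices/filebeat-*?v"

# Show the mapping of the conflicting index, including its mapping type
curl -s "http://localhost:9200/filebeat-2019.04/_mapping?pretty"
```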
My Logstash config is as follows:
input {
  beats {
    port => "5044"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{IPORHOST:clientip} %{NOTSPACE:port} %{NOTSPACE:username} %{NOTSPACE:method} %{NOTSPACE:page} %{NOTSPACE:query} %{NOTSPACE:scstatus} %{NOTSPACE:bytes} %{NOTSPACE:sbytes} %{NOTSPACE:sname} %{NOTSPACE:sport}" }
  }

  date {
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
    target => "@timestamp"
    timezone => "Europe/London"
  }

  geoip {
    source => "clientip"
    target => "geoip"
    database => "C:\ELK-Stack\logstash\vendor\bundle\jruby\1.9\gems\logstash-filter-geoip-4.0.4-java\vendor\GeoLite2-City.mmdb"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }

  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
    remove_field => [ "timestamp" ]
  }
}

output {
  elasticsearch {
    hosts => [ "LOGSTASH:9200" ]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM}"
  }
}
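From what I've read, the error seems to mean the index already has a `log` mapping type while Logstash is writing with type `doc`. The only workaround I've found so far is the (deprecated) `document_type` option on the elasticsearch output, which I'm not at all sure is the right fix — a sketch of what I understand it would look like, assuming the existing type really is `log`:

```conf
output {
  elasticsearch {
    hosts => [ "LOGSTASH:9200" ]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM}"
    # document_type is deprecated; this would force the mapping type
    # to match the one already on the index (assumption: "log")
    document_type => "log"
  }
}
```

Is that the right direction, or am I misunderstanding where the second type comes from?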
My Filebeat config is as follows:
#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\DataLogs\*
  exclude_lines: ["^#"]
  fields:
    server: SECRETSERVER

#============================= Filebeat modules ===============================
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false
  # Period on which files under path should be checked for changes
  #reload.period: 10s

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
I've been pulling my hair out over this for about five days now, more hours than I'd like to admit, but I can't just forget about it, for my own sanity!
Please, can someone help?