Hello there. Before someone comes posting that this is a duplicate: the threads I found here do not solve my issue. Here it goes:
My stack works like this: Filebeat -> Logstash -> Elasticsearch
I have lots and lots of Filebeat versions in use, from 1.2.3 to 6.6.*, and to sort this out I'm trying to normalise everything on 6.5.4 (the same version as my Logstash instances).
The thing is, I have some indices (created, I believe, by FB 1.2.3) that stopped indexing when I updated Filebeat to v6.5.4. If I go back to 5.6.*, it works again, so updating Filebeat is what breaks indexing.
The mapping of the index I'm working on right now: https://pastebin.com/PHJuG1h9
The logs that I get from Logstash are:
[2019-03-18T18:29:25,331][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"59f8bbf18d64dcc41bd1bd8eb60c73b87491c682", :_index=>"callback-2019.03.18", :_type=>"callback", :_routing=>nil}, #<LogStash::Event:0x57a77fc3>], :response=>{"index"=>{"_index"=>"callback-2019.03.18", "_type"=>"callback", "_id"=>"59f8bbf18d64dcc41bd1bd8eb60c73b87491c682", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:563"}}}}}
So, this is the host field from the mapping:
"host" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
}
My ES version is:
{
  "name" : "FPTaOlj",
  "cluster_name" : "252450725677:logs",
  "cluster_uuid" : "QixOjWG0QuqjcaZhLfFPQg",
  "version" : {
    "number" : "5.5.2",
    "build_hash" : "363575f",
    "build_date" : "2018-07-31T10:54:15.297Z",
    "build_snapshot" : false,
    "lucene_version" : "6.6.0"
  },
  "tagline" : "You Know, for Search"
}
My Logstash configuration:
input {
  beats {
    port => 10095
    add_field => {
      "hostname" => "%{[beat][hostname]}"
      "logstash-server" => "ip-{{ ansible_default_ipv4.address | replace('.', '-')}}"
    }
  }
}

filter {
  if "json" in [tags] {
    if [message] =~ /^\s*$/ {
      drop { }
    }
    if [message] =~ /^raven@.* alert: failed to send exception to sentry.*/ {
      drop { }
    }
    json {
      source => "message"
    }
    if [timestamp] {
      date {
        match => [ "timestamp", "ISO8601", "YYYY-MM-dd HH:mm:ss.SSSSSS" ]
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["{{ es_url }}:{{ es_port }}"]
    index => "%{type}-%{+YYYY.MM.dd}"
    document_id => "%{message_fingerprint}"
    manage_template => false
  }
}
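If the host object really is the culprit, I'm guessing something like this in the filter could flatten it back before output. This is an untested sketch; the [host][name] path is my assumption about how the newer Beats lay out the host object:

filter {
  # Assumption: 6.x Beats send host as an object with a "name" key.
  # Renaming [host][name] onto [host] should overwrite the object with
  # the plain hostname string, which matches the existing text mapping.
  if [host][name] {
    mutate {
      rename => { "[host][name]" => "host" }
    }
  }
}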
My Filebeat config:
filebeat:
  prospectors:
    - paths:
        - "/var/log/messages"
        - "/var/log/secure"
        - "/var/log/eb-activity.log"
      fields_under_root: true
      fields:
        type: "callback"
        tags: ["syslog", "staging", "callback-qa-new"]
        alert_email: "router@luc.id"

    - paths:
        - "/var/log/nginx/access.log"
      fields_under_root: true
      fields:
        type: "callback"
        tags: ["nginx", "staging", "callback-qa-new"]
        alert_email: "router@luc.id"

    - paths:
        - "/var/log/nodejs/nodejs.log"
      fields_under_root: true
      fields:
        type: "callback"
        tags: ["json", "staging", "callback-qa-new"]
        alert_email: "abcde@poiu.com"

output:
  logstash:
    hosts:
      - "XXX.YYY.ZZZ.AAA:PPPP"
    loadbalance: true
    worker: 1
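One more thing I noticed while normalising: if I recall the 6.x docs correctly, filebeat.prospectors was renamed to filebeat.inputs (with an explicit type per input) around 6.3, so on 6.5.4 the first section should probably look more like this (my reading of the docs, untested here):

filebeat:
  inputs:
    - type: log
      paths:
        - "/var/log/messages"
        - "/var/log/secure"
        - "/var/log/eb-activity.log"
      fields_under_root: true
      fields:
        type: "callback"
        tags: ["syslog", "staging", "callback-qa-new"]
        alert_email: "router@luc.id"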
Could you help me with that?