Good morning all. I am ingesting some logs, and while some records make it over, I am seeing errors in the Logstash logs, and the affected events are not making it over to Kibana beyond the standard fields (doc_id, index, time, etc.). I have provided my conf.d below, along with the errors and the logs being used for ingestion. Any information to rectify this would be greatly appreciated.
Logstash conf.d:

input {
  file {
    path => "/var/log/logstash/casb_storage/proofpoint/10KLines.txt"
    sincedb_path => "/var/log/logstash/sincedb_path/proofpoint_pos_file"
    start_position => "end"
    mode => "tail"
  }
}

filter {
  grok {
    match => { "message" => "\[(?<timestamp>[^\]]+)\] %{GREEDYDATA:proofpoint}" }
  }
  kv {
    source => "proofpoint"
    field_split => " "
    include_brackets => true
    recursive => "true"
    value_split => "="
    whitespace => "strict"
  }
  date {
    locale => "en"
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss.SSSSSS ZZ" ]
  }
  mutate {
    remove_field => [ "proofpoint", "message", "syslog_hostname", "path", "@index", "@version", "host", "port", "tags" ]
  }
}

output {
  elasticsearch {
    hosts => [ "http://es.server:9200" ]
    user => "elastic"
    password => "elastic_test_p@ssw0rd"
    index => "proofpoint-%{+YYYY.MM}"
  }
}
Errors in the logs (this repeats for many different fields):
[ WARN ] 2019-06-07 10:31:18.198 [[main]>worker8] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"proofpoint-2019.06", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x67d691f3>], :response=>{"index"=>{"_index"=>"proofpoint-2019.06", "_type"=>"_doc", "_id"=>"GD6QMmsBh_NbzFsD0AjZ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [dict] of type [text] in document with id 'GD6QMmsBh_NbzFsD0AjZ'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:102"}}}}}
[WARN ] 2019-06-07 10:31:18.199 [[main]>worker5] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"proofpoint-2019.06", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x5824460d>], :response=>{"index"=>{"_index"=>"proofpoint-2019.06", "_type"=>"_doc", "_id"=>"fD6QMmsBh_NbzFsD0AXK", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [version] tried to parse field [version] as object, but found a concrete value"}}}}
Logs for ingestion:
[2019-06-06 17:06:18.402388 -0400] rprt s=2sunues3yv m=1 x=2sunues3yv-1 mod=av cmd=run rule=clean vendor=fsecure version="vendor=fsecure engine=2.50.10434:,, definitions=2019-06-06_14:,, signatures=0" duration=0.000
[2019-06-06 17:08:03.667159 -0400] info mod=regulation type=mail cmd=refresh id=0 action=load dict=CJ27_ExtClass_B file=/opt/server/pps-8.0.1.1446/etc/regulation/user/CJ27_ExtClass_B
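For context, this is my reading of what the kv filter produces for the first sample line (hand-written to illustrate, not actual rubydebug output): with recursive => "true", a quoted value that itself contains key=value pairs, such as version="vendor=fsecure engine=...", appears to get re-parsed into a nested object, roughly:

{
   "timestamp" => "2019-06-06 17:06:18.402388 -0400",
           "s" => "2sunues3yv",
           "m" => "1",
           "x" => "2sunues3yv-1",
         "mod" => "av",
         "cmd" => "run",
        "rule" => "clean",
      "vendor" => "fsecure",
     "version" => {
          "vendor" => "fsecure",
          "engine" => "2.50.10434:,,",
     "definitions" => "2019-06-06_14:,,",
      "signatures" => "0"
    },
    "duration" => "0.000"
}

I am guessing that is why Elasticsearch reports fields like [version] (and, on other lines, [dict]) as an object in some documents but a concrete value in others, but I would appreciate confirmation and advice on the right way to handle it.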