Hi,
I need a few clarifications regarding Logstash templates.
I have a use case where I need to select the set of data (IPFIX flow attributes) that is exported in IPFIX messages.
Whenever I change the set of data, I restart my exporter process so that it sends the new set of templates. However, I don't see the templates (ipfix.templates.cache) getting refreshed in Logstash, even though the template packet reaches the server. As a result, the flows with the new set of data are dropped. I would like to know whether I'm missing something basic.
If I have a set of flows with metadata A, B, C, D and I change them to A, C, D, then the new flows do not show up in Kibana and are dropped in Logstash with this error:
[2019-06-18T11:56:16,442][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"elastiflow-3.5.0-2019.06.18", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x556d79c5>], :response=>{"index"=>{"_index"=>"elastiflow-3.5.0-2019.06.18", "_type"=>"_doc", "_id"=>"e-5xamsB8DwcUKl84Bi2", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ipfix.CryptoFlowPolicyName] of type [long] in document with id 'e-5xamsB8DwcUKl84Bi2'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"testPolicy\""}}}}}
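From the error it looks like the daily index has mapped ipfix.CryptoFlowPolicyName as long, so any later document carrying a string value such as "testPolicy" is rejected with a mapper_parsing_exception. This can be confirmed from Kibana Dev Tools with a field-mapping lookup (using the index name from the error above):

GET elastiflow-3.5.0-2019.06.18/_mapping/field/ipfix.CryptoFlowPolicyName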
Exporter: pmacctd
Logstash version: 7.0
CryptoFlowPolicyName: defined as a string, but the index datatype gets corrupted (image attached). After this, all flows from any exporter that includes this field are dropped.
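Would pinning the field with an explicit mapping be the right way to stop dynamic mapping from flipping it to long? Something along these lines (the template name and order below are placeholders, and I realize this would only affect newly created daily indices, not the one that is already corrupted):

PUT _template/elastiflow-cryptoflowpolicyname
{
  "index_patterns": ["elastiflow-3.5.0-*"],
  "order": 10,
  "mappings": {
    "properties": {
      "ipfix": {
        "properties": {
          "CryptoFlowPolicyName": { "type": "keyword" }
        }
      }
    }
  }
}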
I have enabled template caching in Logstash:
codec => netflow {
  target => "ipfix"
  netflow_definitions => "${ELASTIFLOW_DEFINITION_PATH:/.............}/netflow.yml"
  ipfix_definitions => "${ELASTIFLOW_DEFINITION_PATH:.....................}/ipfix.yml"
  cache_save_path => ".............../logstash/data/"
  cache_ttl => "720000"
  include_flowset_id => "true"
}