Good evening fbaligand,
I did as you suggested, but I'm getting error messages in the output:
[2018-02-23T17:57:48,376][WARN ][logstash.outputs.elasticsearch] Could not index event to
Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test-pr-pc_logiciel-2018.02.23",
:_type=>"logs", :_routing=>nil}, 2018-02-23T16:57:48.338Z %{host} %{message}], :response=>
{"index"=>{"_index"=>"test-pr-pc_logiciel-2018.02.23", "_type"=>"logs",
"_id"=>"AWHDmXA7dcwBLBnIf8UK", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse [logiciel]", "caused_by"=>{"type"=>"illegal_state_exception",
"reason"=>"Can't get text on a START_OBJECT at 1:88"}}}}}
[2018-02-23T17:57:48,376][WARN ][logstash.outputs.elasticsearch] Could not index event to
Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test-pr-pc_logiciel-2018.02.23",
:_type=>"logs", :_routing=>nil}, 2018-02-23T16:57:48.338Z %{host} %{message}], :response=>
{"index"=>{"_index"=>"test-pr-pc_logiciel-2018.02.23", "_type"=>"logs",
"_id"=>"AWHDmXA7dcwBLBnIf8UL", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse [logiciel]", "caused_by"=>{"type"=>"illegal_state_exception",
"reason"=>"Can't get text on a START_OBJECT at 1:87"}}}}}
even though my configuration looks correct to me:
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "z76-pr-2018.02.19"
    query => '{ "query": { "query_string": { "query": "*" } },"sort" :[{"beat.hostname":{"order":"asc"}}],"size":20,"_source": ["beat.hostname","system.process.name"]}'
    size => 10
    ssl => true
    ca_file => "/etc/logstash/ssl/ca.pem"
    docinfo => true
    docinfo_target => "@metadata"
    scroll => "5m"
    tags => "pc_logiciel"
  }
}
filter {
  if "pc_logiciel" in [tags] {
    mutate {
      replace => { "logiciel" => "%{[system][process][name]}" }
    }
    mutate {
      replace => { "pc" => "%{[beat][hostname]}" }
    }
    aggregate {
      task_id => "%{pc}"
      code => "
        map['pc'] = event.get('pc')
        map['logiciel'] ||= []
        map['logiciel'] << {'logiciel' => event.get('logiciel')}
      "
      push_previous_map_as_event => true
      timeout => 5
      timeout_tags => ["aggregate"]
    }
  }
  else {
    drop {}
  }
}
output {
  if "pc_logiciel" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "pc_logiciel-%{+YYYY.MM.dd}"
    }
  }
  stdout { codec => rubydebug }
}
I don't know what's really wrong, because everything looks correct to me. When I remove the aggregate plugin, the data is imported correctly; with the plugin, I get the errors above. Is the mistake in my config, or is the plugin not working properly?
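My guess is that it's the one-key hash I push into map['logiciel'] that collides with the existing text mapping. A minimal variant I could try, where only the aggregate block changes and the array holds plain strings instead of hashes:

aggregate {
  task_id => "%{pc}"
  code => "
    map['pc'] = event.get('pc')
    map['logiciel'] ||= []
    # push the process name itself, not a nested hash
    map['logiciel'] << event.get('logiciel')
  "
  push_previous_map_as_event => true
  timeout => 5
  timeout_tags => ["aggregate"]
}

That way [logiciel] stays an array of strings, which should index fine under a text mapping.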
PS: I upgraded my Logstash; the new version is 5.8.1, and the aggregate plugin is 2.7.2.
Thanks in advance for your help!