I'm trying to use Logstash 7.4.1 to parse XML files that are generated every minute by querying the Dynatrace server's REST API. The output will be sent to Elasticsearch.
My goal is to build a visualization in Kibana similar to the Dynatrace dashboard:
x-axis : timestamps
y-axis: values (request_count or average_response_time)
My approach is to aggregate all measurements belonging to the same measure according to their aggregation attribute. If the aggregation type is Count, I'll sum up all count values and write the result to ES as a single record; if the aggregation type is Average, I'll compute the average from sum and count and write that to ES as another record.
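The aggregation rule above can be sketched in a few lines of Ruby. This is only an illustration of the arithmetic; the measurement shape (a hash with :sum and :count keys) is an assumption, not the actual Dynatrace format:

```ruby
# Aggregate a list of measurements for one measure.
# NOTE: the {sum:, count:} hash shape is assumed for illustration.
def aggregate(measurements, aggregation)
  total_count = measurements.sum { |m| m[:count] }
  case aggregation
  when "Count"
    total_count                                   # single record: summed counts
  when "Average"
    total_sum = measurements.sum { |m| m[:sum] }
    # weighted average over all measurements, guarding against division by zero
    total_count.zero? ? 0.0 : total_sum / total_count.to_f
  end
end
```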
I expect to have one document for each measure and the output should be similar to the following:
doc/1
chartdashlet: Operasyon Adet
measure: SGT_1
measure_type: operation_count
measure_time: 01.11.2019 14:01
value: somenumber
doc/2
chartdashlet: Operasyon Adet
measure: SGT_2
measure_type: operation_count
measure_time: 01.11.2019 14:01
value: somenumber
doc/3
chartdashlet: Operasyon Sure
measure: SGT_3
measure_type: operation_responsetime
measure_time: 01.11.2019 14:01
value: somenumber
How can I do this with Logstash? To join parent and child nodes I would need the XPath 2.0 string-join() function, but according to the documentation the library used by the xml filter only supports XPath 1.0. Do I need to write Ruby scripts to loop over the measures?
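You may not need string-join() at all: the same result can be reached with XPath 1.0 plus explicit loops in a Ruby script. A minimal sketch using stdlib REXML, with an invented XML snippet — the element and attribute names (measurement, count, name) are assumptions modeled on the XPath above, so check them against a real Dynatrace export:

```ruby
require 'rexml/document'

# Hypothetical fragment of the dashboard XML; the <measurement count="..."/>
# children and the name/measure attributes are assumptions.
xml = <<~XML
  <dashboardreport>
    <data>
      <chartdashlet name="Operasyon Adet">
        <measures>
          <measure measure="SGT_1" aggregation="Count">
            <measurement count="3"/>
            <measurement count="4"/>
          </measure>
        </measures>
      </chartdashlet>
    </data>
  </dashboardreport>
XML

doc = REXML::Document.new(xml)
records = []
# Loop over dashlets and measures with XPath 1.0 expressions only.
doc.elements.each('/dashboardreport/data/chartdashlet') do |dashlet|
  dashlet.elements.each('measures/measure') do |measure|
    counts = measure.elements.to_a('measurement')
                    .map { |m| m.attributes['count'].to_i }
    records << {
      'chartdashlet' => dashlet.attributes['name'],
      'measure'      => measure.attributes['measure'],
      'value'        => counts.sum   # Count aggregation: sum of the counts
    }
  end
end
```

The same loop body can live in a script loaded by Logstash's ruby filter, where each hash would become one ES document.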
For now I'm able to read the whole file as a single event, and I can extract the measures of Count type with the following configuration in a testing environment (with a single input file):
$ /home/someuser/logstash-7.4.1/bin/logstash -f dynatrace-dashboard.conf
$ cat dynatrace-dashboard.conf
input {
  file {
    id => "dynatrace_dashboard_values"
    mode => "read"
    path => "/home/someuser/dynatrace-input.xml"
    codec => multiline {
      # anchor and escape the '?', which is a regex metacharacter
      pattern => "^<\?xml"
      negate => true
      what => "previous"
    }
  }
}
filter {
  xml {
    source => "message"
    xpath => [ "/dashboardreport/data/chartdashlet/measures/measure[@aggregation='Count']", "measure_count" ]
    store_xml => false
  }
}
output {
  file {
    path => "/home/someuser/logst.out"
    codec => rubydebug
  }
}
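One way to avoid XPath 2.0 entirely would be to do the parsing and aggregation in a ruby filter script (the ruby filter can load a script file via its path option), store the aggregated records in an array field, and then use the split filter to turn each record into its own event, i.e. one ES document per measure. A hedged sketch of that pipeline shape — the script path, field name, hosts, and index are placeholders:

```
filter {
  ruby {
    # hypothetical script that parses [message] and sets a "records" array field
    path => "/home/someuser/aggregate_measures.rb"
  }
  split {
    # emit one event per aggregated record
    field => "records"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dynatrace-dashboard"
  }
}
```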
I'm stuck at this point.
Another alternative for me would be to write a Python script that does everything described above and sends the results directly to ES, but since we have so many pipelines centralized on Logstash containers, I would prefer to do it with Logstash rather than a custom solution.
Any alternative recommendation that satisfies my goal is appreciated. Maybe it is possible to write the whole XML file to ES and achieve my goal with some higher-level querying there.