Hi,
I use Logstash to read a Kafka topic and to send the events to Influxdb. The events in the Kafka topic are sent by Telegraf.
Here are two example events received from Kafka:
{
       "message" => "disk,all_servers=myserver,device=dm-0,fstype=xfs,host=myserver,mode=rw,path=/ used=55471497216i,used_percent=26.463792867611136,inodes_total=102400000i,inodes_free=102336513i,inodes_used=63487i,total=209612800000i,free=154141302784i 1631183074000000000\n",
      "@version" => "1",
    "@timestamp" => 2021-09-09T16:49:14.029Z
}
...
...
{
       "message" => "cpu,all_servers=myserver,cpu=cpu-total,host=myserver usage_user=4.694598687520277,usage_system=1.9182231197899142,usage_idle=92.57950530586864,usage_steal=0,usage_guest_nice=0,usage_nice=0,usage_iowait=0,usage_irq=0,usage_softirq=0.8076728925508541,usage_guest=0 1631183074000000000\n",
      "@version" => "1",
    "@timestamp" => 2021-09-09T16:49:14.029Z
}
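For context, the input side is a basic kafka input along these lines (the broker address, topic name and group id are placeholders here):

kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["telegraf"]
    group_id => "logstash"
    # the plain codec keeps the raw line-protocol string in the message field
    codec => "plain"
}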
To send the events to Influxdb, I use the http output:
http {
    url => "http://xxx:xxx@localhost:8086/write?db=telegraf"
    http_method => "post"
    headers => ["Accept-Encoding: gzip", "Content-Encoding: gzip"]
    message => "%{[message]}"
    format => "message"
}
It works, but it is very slow.
So I tried the influxdb output instead:
influxdb {
    host => "localhost"
    port => 8086
    user => "xxx"
    password => "xxx"
    db => "telegraf"
    retention_policy => "autogen"
    use_event_fields_for_data_points => true
    exclude_fields => ["@timestamp", "@version"]
    data_points => {}
}
But it does not work: the events are consumed from Kafka, yet nothing is stored in Influxdb.
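My understanding is that with use_event_fields_for_data_points => true the plugin builds each point from the remaining event fields, and after excluding @timestamp and @version the only field left is message, which holds the whole line-protocol string rather than parsed values. If the fields were parsed out of the message, I could map them explicitly, something like this (the measurement, send_as_tags and field names are only illustrative here, based on the disk event above):

influxdb {
    host => "localhost"
    port => 8086
    user => "xxx"
    password => "xxx"
    db => "telegraf"
    measurement => "disk"
    # explicit mapping from event fields to InfluxDB fields
    data_points => {
        "used_percent" => "%{used_percent}"
        "host" => "%{host}"
    }
    send_as_tags => ["host"]
}

But since my events only carry the raw message, I am not sure this is the right direction.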
Any idea?
M.