Logstash output not sending data to InfluxDB

Hi All,

I have installed Metricbeat (version 6.3.0) on my Ubuntu machine and am sending its data to the Logstash output. I am running Logstash (version 5.6.10) to forward the Metricbeat data to InfluxDB. Below is my Logstash configuration.

I installed the logstash-output-influxdb plugin as shown below:
./logstash-plugin install --version 5.0.3 logstash-output-influxdb
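
Installation can be verified with the logstash-plugin CLI, for example (the grep is just to filter the output):

./logstash-plugin list --verbose | grep influxdb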

Logstash configuration

input {
  beats {
    port => 5044
  }
}


output {
  stdout {codec => rubydebug}
  influxdb {
    host => "localhost"
    port => 8086
    user => "admin"
    password => "password"
    db => "metrics"
    codec => "json"
    use_event_fields_for_data_points => true
    exclude_fields => ["@timestamp", "@version", "sequence", "message", "type"]
    data_points => {
    }
  }
}

I am able to run Logstash successfully, but the metrics data is not being sent to InfluxDB. Below is the exception I am getting in the console:

[2018-07-03T09:17:59,344][WARN ][logstash.outputs.influxdb] Non recoverable exception while writing to InfluxDB {:exception=>#<InfluxDB::Error: {"error":"unable to parse 'logstash,beats_input_raw_event=true,host={\"name\"\\=\u003e\"localhost\"} system={\"memory\"=\u003e{\"hugepages\"=\u003e{\"total\"=\u003e0, \"default_size\"=\u003e2097152, \"surplus\"=\u003e0, \"reserved\"=\u003e0, \"used\"=\u003e{\"pct\"=\u003e0, \"bytes\"=\u003e0}, \"free\"=\u003e0}, \"actual\"=\u003e{\"free\"=\u003e4079476736, \"used\"=\u003e{\"pct\"=\u003e0.567, \"bytes\"=\u003e5342081024}}, \"total\"=\u003e9421557760, \"swap\"=\u003e{\"total\"=\u003e0, \"used\"=\u003e{\"pct\"=\u003e0, \"bytes\"=\u003e0}, \"free\"=\u003e0}, \"used\"=\u003e{\"pct\"=\u003e0.6941, \"bytes\"=\u003e6539534336}, \"free\"=\u003e2882023424}},beat={\"name\"=\u003e\"localhost\", \"hostname\"=\u003e\"localhost\", \"version\"=\u003e\"6.3.0\"},@version=\"1\",metricset={\"name\"=\u003e\"memory\", \"rtt\"=\u003e326, \"module\"=\u003e\"system\"} 1530609478299': invalid boolean"}
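
Looking at the failing line, it seems whole nested hashes such as host={"name"=>...}, system={"memory"=>...} and beat={...} are being written as single data points, which InfluxDB's line protocol cannot parse. I am not sure if this is the right fix, but one idea would be to flatten a few of the nested values into scalar top-level fields before the influxdb output, something like the sketch below (field paths taken from the error above, the new field names are just examples):

filter {
  mutate {
    # flatten a few nested Metricbeat values into scalar top-level fields
    rename => {
      "[beat][hostname]"            => "hostname"
      "[system][memory][used][pct]" => "memory_used_pct"
      "[system][memory][free]"      => "memory_free"
    }
    # drop the remaining nested objects so they are not sent as data points
    remove_field => ["beat", "host", "system", "metricset"]
  }
}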

Please let me know your thoughts; it would be very helpful. Please correct me if my configuration is wrong.

Regards,
Ganeshbabu R

Hi @magnusbaeck

I am unable to resolve this issue. Could you please check the Logstash configuration and let me know if anything needs to change?

The same Metricbeat data is reaching Elasticsearch successfully. Please let me know your feedback on this.

Thanks,
Ganeshbabu R

@magnusbaeck

I tried changing use_event_fields_for_data_points to false and also specified some fields in data_points:

   use_event_fields_for_data_points => false 
   data_points => {
        "hostname" => "%{[hostname]}"
   }

Then I checked the Influx database, and below is the response I got:

> show measurements
name: measurements
name
----
logstash
> select * from logstash limit 5
name: logstash
time                environment hostname
----                ----------- --------
1530698167441000000             %{[hostname]}
1530698167442000000             %{[hostname]}
1530698167443000000             %{[hostname]}
1530698167459000000             %{[hostname]}
1530698167484000000             %{[hostname]}

I read in the documentation that use_event_fields_for_data_points is false by default, and that setting it to true makes the plugin automatically use the fields from the event as the data points sent to InfluxDB.

But for me it is not working as expected: when I set it to true, the Metricbeat data is not sent to InfluxDB through Logstash.
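
In the error output from my earlier post, the hostname appears nested under [beat][hostname] rather than at the top level, so I suspect my %{[hostname]} reference simply does not resolve and gets written literally. Would referencing the nested fields directly be the right approach? A sketch of what I mean (field paths assumed from the error output; the coerce_values part is my guess at how to send numbers instead of strings):

   use_event_fields_for_data_points => false
   data_points => {
        "hostname"        => "%{[beat][hostname]}"
        "memory_used_pct" => "%{[system][memory][used][pct]}"
   }
   coerce_values => {
        "memory_used_pct" => "float"
   }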

Please let me know your thoughts

Thanks
Ganeshbabu R

Hi @Christian_Dahlqvist

Could you please help me with this issue? I have not seen anyone respond to this request, so please check the configuration and correct me if I have made any mistake.

Thanks,
Ganeshbabu R

This forum is manned by volunteers, so please do not ping people not already involved in the thread. Also please be patient and leave the post for a few days before bumping it.

You are using a plugin I have myself never used. I would suspect this is not a very commonly used plugin, so it may take a while for someone with relevant experience/knowledge to respond.

Alright!
Okay, will wait for it.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.