Logstash InfluxDB bad timestamp error

Hello,

I've almost got the configuration right, but I'm uncertain about what's happening in the conversion of my data. I'm currently trying to move data from MySQL into InfluxDB.
My input is a jdbc select * from table query, and the stdout output shows events like:
{
"field1" => 2.92,
"field2" => "SomeString",
"time" => 2018-05-09T18:15:25.000Z,
...
"field10" => "SomeString"
}
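
For context, the input side looks roughly like this (a sketch only; the connection string, driver paths, credentials, and table name here are placeholders rather than my real values):

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/dbname"
    jdbc_user => "username"
    jdbc_password => "somepassword"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM table"
  }
}
output {
  # rubydebug prints each event as shown above
  stdout { codec => rubydebug }
}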

For my influxdb output I have:
influxdb {
  host => "localhost"
  #port => 8086
  user => "username"
  password => "somepassword"
  codec => "json"
  db => "dbname"
  use_event_fields_for_data_points => true
  allow_time_override => true
  exclude_fields => ["@timestamp", "@version", "sequence", "message", "type"]
  data_points => {}
}

The error I get is:

[2018-09-13T15:21:28,281][WARN ][logstash.outputs.influxdb] Non recoverable exception while writing to InfluxDB {:exception=>#<InfluxDB::Error: {"error":"unable to parse 'logstash field1=0.34E0,field2=0.41389E5,field3="somestring",field4="somestring",field5="somestring",field6="somestring",field7=0.3E1 2018-05-09T18:05:28.000Z': bad timestamp"}>}
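
For comparison, my understanding from the InfluxDB docs is that line protocol wants the timestamp as an epoch integer at the end of the line, something like this (made-up values, assuming the default nanosecond precision):

logstash field1=0.34,field2=41389.0,field3="somestring" 1525889128000000000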

I didn't mistype that error. I've noticed that after field7 there is a space instead of a comma and another field name; if my reading above is right, that space is normal line-protocol syntax separating the fields from the timestamp, and the real problem is that the time value is being written as an ISO 8601 string rather than an epoch integer.
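
In case it's relevant, this is the kind of filter I've been considering to convert the time column to an epoch value before it reaches the influxdb output (a sketch only, assuming the jdbc input hands the column over as a timestamp object rather than a string; I haven't confirmed this is the right fix):

filter {
  ruby {
    # Replace the ISO 8601 timestamp with epoch seconds; this would need
    # time_precision => "s" set on the influxdb output to match
    code => "event.set('time', event.get('time').to_i)"
  }
}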
Any help here would be awesome.
