CSV output is not working

I am trying to use the elasticsearch input and the csv output to query my index and write the results to a CSV file. However, after Logstash finishes processing, the CSV file doesn't contain any of the data. All I see for each event is a timestamp followed by the literal placeholders %{host} and %{message} (see below).

Is this a bug?

```
2017-01-13T17:58:27.540Z %{host} %{message}
2017-01-13T17:58:39.205Z %{host} %{message}
2017-01-13T17:58:39.207Z %{host} %{message}
2017-01-13T17:58:39.208Z %{host} %{message}
2017-01-13T17:58:39.209Z %{host} %{message}
2017-01-13T17:58:39.213Z %{host} %{message}
2017-01-13T17:58:39.214Z %{host} %{message}
2017-01-13T17:58:39.216Z %{host} %{message}
2017-01-13T17:58:39.217Z %{host} %{message}
2017-01-13T17:58:39.218Z %{host} %{message}
2017-01-13T17:58:39.220Z %{host} %{message}
2017-01-13T17:58:39.221Z %{host} %{message}
2017-01-13T17:58:39.222Z %{host} %{message}
2017-01-13T17:58:39.224Z %{host} %{message}
2017-01-13T17:58:39.225Z %{host} %{message}
2017-01-13T17:58:39.226Z %{host} %{message}
2017-01-13T17:58:39.228Z %{host} %{message}
2017-01-13T17:58:39.230Z %{host} %{message}
2017-01-13T17:58:39.230Z %{host} %{message}
2017-01-13T17:58:39.231Z %{host} %{message}
2017-01-13T17:58:39.232Z %{host} %{message}
2017-01-13T17:58:39.235Z %{host} %{message}
2017-01-13T17:58:39.235Z %{host} %{message}
2017-01-13T17:58:39.236Z %{host} %{message}
2017-01-13T17:58:39.237Z %{host} %{message}
2017-01-13T17:58:39.238Z %{host} %{message}
2017-01-13T17:58:39.239Z %{host} %{message}
2017-01-13T17:58:39.241Z %{host} %{message}
2017-01-13T17:58:39.244Z %{host} %{message}
2017-01-13T17:58:39.246Z %{host} %{message}
2017-01-13T17:58:39.247Z %{host} %{message}
2017-01-13T17:58:39.249Z %{host} %{message}
2017-01-13T17:58:39.250Z %{host} %{message}
2017-01-13T17:58:39.250Z %{host} %{message}
2017-01-13T17:58:39.253Z %{host} %{message}
2017-01-13T17:58:39.254Z %{host} %{message}
2017-01-13T17:58:39.254Z %{host} %{message}
2017-01-13T17:58:39.255Z %{host} %{message}
2017-01-13T17:58:39.255Z %{host} %{message}
2017-01-13T17:58:39.257Z %{host} %{message}
2017-01-13T17:58:39.257Z %{host} %{message}
2017-01-13T17:58:39.258Z %{host} %{message}
2017-01-13T17:58:39.258Z %{host} %{message}
2017-01-13T17:58:50.077Z %{host} %{message}
2017-01-13T17:58:50.095Z %{host} %{message}
2017-01-13T17:58:50.097Z %{host} %{message}
2017-01-13T17:58:50.098Z %{host} %{message}
2017-01-13T17:58:50.100Z %{host} %{message}
2017-01-13T17:58:50.114Z %{host} %{message}
2017-01-13T17:58:50.115Z %{host} %{message}
2017-01-13T17:58:50.119Z %{host} %{message}
2017-01-13T17:58:50.119Z %{host} %{message}
2017-01-13T17:59:09.152Z %{host} %{message}
2017-01-13T17:59:13.141Z %{host} %{message}
```

Can you please post a minimal test case for this?
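
Something self-contained along these lines would help, i.e. a config that reproduces the problem without depending on your data. As a sketch (the sample fields here are hypothetical stand-ins, not taken from your index), a generator input feeding one hard-coded event into the csv output:

```
input {
  generator {
    # hypothetical sample record; replace with fields from your own data
    message => '{"firstName":"Jane","lastName":"Doe","employeeID":"1234","phone":"555-0100","url":"http://example.com"}'
    count => 1
  }
}
filter {
  # parse the generated JSON string into top-level event fields
  json {
    source => "message"
  }
}
output {
  csv {
    path => "/tmp/test.csv"
    columns => ["firstName", "lastName", "employeeID", "phone", "url"]
  }
}
```

If /tmp/test.csv still comes out as %{...} placeholders with a config this small, that points at the csv output itself.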


Here is the Logstash config as an example. The JDBC input simply does a SELECT on the users table.

```
input {
  jdbc {
    jdbc_driver_library => "ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "${DB_CONNECTION_STRING}"
    jdbc_user => "${DB_USER}"
    jdbc_password => "${DB_PASSWORD}"
    jdbc_fetch_size => 10000
    statement_filepath => "/conf/user.sql"
    lowercase_column_names => false
  }
}
filter {
  mutate {
    rename => { "URL" => "url" }
  }
}
output {
  csv {
    path => "/conf/users.csv"
    columns => ["firstName", "lastName", "employeeID", "phone", "url"]
  }
}
```
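
One way to check which fields the events actually carry (and whether the casing matches the columns list) is to temporarily add a stdout output with the rubydebug codec next to the csv output. A sketch, reusing the output section above:

```
output {
  # prints every event with all of its fields, so a naming or
  # casing mismatch with the columns list becomes visible
  stdout {
    codec => rubydebug
  }
  csv {
    path => "/conf/users.csv"
    columns => ["firstName", "lastName", "employeeID", "phone", "url"]
  }
}
```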

If you're using Logstash 5.x, it might be because of this open issue. If that's the case, you may want to downgrade to Logstash 2.x until it gets resolved.
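
If downgrading isn't an option either, one workaround that gets suggested is to emulate the csv output with the file output and a line codec. A sketch, assuming the same field names as above; note that it does no CSV quoting or escaping, so it only holds up if the field values are free of commas:

```
output {
  file {
    path => "/conf/users.csv"
    # sprintf-style references to the event fields; unlike the csv
    # output, this does no quoting or escaping of the values
    codec => line {
      format => "%{firstName},%{lastName},%{employeeID},%{phone},%{url}"
    }
  }
}
```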
