Hi All,
I have a Logstash config that reads from a JDBC connection, and I want to output to a Kafka cluster. I am able to connect fine, and I am seeing messages, except they look like the line below rather than my actual message:
2017-11-30T19:23:54.639Z %{host} %{message}
However, if I configure it to read from a CSV file instead, the example messages show correctly. See example:
2017-11-30T19:49:35.827Z MB-C02TF1QUGTFM 4,5
2017-11-30T19:49:35.827Z MB-C02TF1QUGTFM 3,4
2017-11-30T19:49:35.826Z MB-C02TF1QUGTFM 2,3
2017-11-30T19:49:35.826Z MB-C02TF1QUGTFM 1,2
2017-11-30T19:49:35.828Z MB-C02TF1QUGTFM 7,8
2017-11-30T19:49:35.828Z MB-C02TF1QUGTFM 5,6
Below is my config for CSV:
input {
  file {
    path => "example.csv"
  }
}

filter {
  csv {
    autodetect_column_names => "true"
  }
}

output {
  stdout { codec => rubydebug }
  kafka {
    topic_id => "media"
    bootstrap_servers => "172.20.85.174:9092"
  }
}
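For what it's worth, my understanding (and I may be wrong here) is that the kafka output defaults to the plain codec, and with no format set that codec just prints the event's timestamp, host, and message fields. That would explain the output above: the file input sets host and message on every event, while the jdbc input does not. As far as I can tell, the default behaves roughly like this explicit form (same topic and broker as in my config):

output {
  kafka {
    topic_id => "media"
    bootstrap_servers => "172.20.85.174:9092"
    # roughly what I believe the plain codec does by default when no format is given
    codec => plain { format => "%{@timestamp} %{host} %{message}" }
  }
}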
Below is my JDBC Configuration. Any ideas?
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://db:3306/db"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/opt/logstash-5.3.0/bin/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "select * from table"
  }
}

filter {
  csv {
    autodetect_column_names => "true"
  }
}

output {
  stdout { codec => rubydebug }
  kafka {
    topic_id => "media"
    bootstrap_servers => "172.20.85.174:9092"
  }
}
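Since the jdbc input emits one event per row, with the column values as individual fields and (as far as I can tell) no message field for the csv filter to parse, I'm wondering whether I just need an explicit codec on the kafka output. Something like the sketch below is what I have in mind (json is only an example, and the topic/broker are the same placeholders as above):

output {
  stdout { codec => rubydebug }
  kafka {
    topic_id => "media"
    bootstrap_servers => "172.20.85.174:9092"
    # serialize the whole event as JSON, since jdbc rows have no message field
    codec => json
  }
}

Is that the right approach, or am I missing something in the jdbc input or filter setup itself?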
Thank you!