Hey guys,
I'm using Logstash with the jdbc input plugin to import a few MySQL tables, but every table runs into the Elasticsearch total-fields limit (more than 1000 fields per index).
I tried importing several other tables with a few hundred columns each (each still well under 1000), but the fields seem to add up across tables.
The table below has only 7 columns and no foreign keys.
logstash_sample.conf: |
  # input comes straight from MySQL via JDBC (no Filebeat or local logs involved)
  input {
    jdbc {
      jdbc_connection_string => "jdbc:mysql://URL"
      jdbc_user => "USER"
      jdbc_password => "PW"
      jdbc_driver_library => "/usr/share/mysql-connector-java-8.0.21.jar"
      # Connector/J 8.x driver class; com.mysql.jdbc.Driver is the deprecated 5.x name
      jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
      # run every 20 minutes
      schedule => "*/20 * * * *"
      statement => "SELECT * FROM 7columns"
    }
  }
  output {
    elasticsearch {
      hosts => [ "elasticsearch-client:9200" ]
      index => "sample"
    }
  }
Running this somehow results in an index with >1000 fields.
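For what it's worth, this is roughly how I'm counting the mapped fields (an approximate count, since it also picks up multi-fields, and it assumes the cluster is reachable under elasticsearch-client:9200):

curl -s 'http://elasticsearch-client:9200/sample/_mapping' | grep -o '"type" *:' | wc -l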
Every .conf has its own output index.
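For example, a second pipeline file in my setup follows the same pattern (other_table and other_sample are placeholder names):

other_sample.conf: |
  input {
    jdbc {
      jdbc_connection_string => "jdbc:mysql://URL"
      jdbc_user => "USER"
      jdbc_password => "PW"
      jdbc_driver_library => "/usr/share/mysql-connector-java-8.0.21.jar"
      jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
      schedule => "*/20 * * * *"
      statement => "SELECT * FROM other_table"
    }
  }
  output {
    elasticsearch {
      hosts => [ "elasticsearch-client:9200" ]
      # separate index per pipeline file
      index => "other_sample"
    }
  }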
What's the logic behind this behavior?
Thanks in advance!