Indexes get assigned new fields from other indexes?!

Hello,

I am currently pushing MySQL data to Elasticsearch via the Logstash JDBC input plugin.
The index gets created with the proper field names (the column names) and data types.

But as soon as I create another .conf file with another MySQL source and Elasticsearch index, the fields seem to get appended to all of the MySQL indexes...

Can you see what's wrong with my config?

input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/mysql/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://...:3306/xyz"
    jdbc_user => "user"
    jdbc_password => "something"
    jdbc_fetch_size => 1000
    schedule => "* * * * *"
    tracking_column => "id"
    use_column_value => true
    last_run_metadata_path => "/var/lib/logstash/jdbc_last_run_discussionmsg"
    #clean_run => true
    #sql_log_level => "debug"
    statement => "
        select
         id,
         source,
         dateEntry
        from
         discussionmsg
        where
         id > 180000000
         and id > :sql_last_value
    "
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "discussionmsg-%{+YYYY.MM.dd}"
    document_type => "mysql"
  }
}

Thanks.

I moved the post to #logstash.
(Please format your code)

Logstash has a single event pipeline. Events from all inputs in all configuration files are fed into that pipeline, all filters from all configuration files process those events, and the results are then sent to all outputs from all configuration files. If you don't want this behavior you need to use conditionals to choose which filters and outputs particular events are sent to. See https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html.
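As a minimal sketch of that pattern (the type names source_a/source_b and the "..." placeholders are hypothetical, not from your config): tag each input with a distinct type, then route on it in the output section:

    input {
      jdbc { type => "source_a" ... }
      jdbc { type => "source_b" ... }
    }
    output {
      if [type] == "source_a" {
        elasticsearch { index => "index-a-%{+YYYY.MM.dd}" ... }
      } else if [type] == "source_b" {
        elasticsearch { index => "index-b-%{+YYYY.MM.dd}" ... }
      }
    }

With this in place, events from each jdbc input only ever reach their own elasticsearch output, even though both configuration files share one pipeline.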

Thanks magnusbaeck,

I just did not get that about Logstash! All events are fed to all configurations!

I have been learning so many moving parts in this Elastic Stack adventure that I missed that.
So to fix my field issue I set the jdbc type to a unique name and added a conditional in the output section.

input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/mysql/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://...:3306/db"
    jdbc_user => "user"
    jdbc_password => "*****"
    jdbc_fetch_size => 1000
    schedule => "* * * * *"
    type => "mysql_discmsg" #<<< here
    tracking_column => "id"
    use_column_value => true
    last_run_metadata_path => "/var/lib/logstash/jdbc_last_run_discussionmsg"
    #clean_run => true
    #sql_log_level => "debug"
    statement => "
        select
         id,
         source,
         dateEntry
        from
         discussionmsg
        where
         id > 180000000
         and id > :sql_last_value "
  }
}
output {
  if [type] == "mysql_discmsg" { #<<< here
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "discussionmsg-%{+YYYY.MM.dd}"
        document_type => "mysql"
        #manage_template => true
    }
    #stdout { codec => json_lines }
  }
}

Thanks again.
Chris