MySQL server to Elasticsearch


I am trying to load data from my MySQL server into Elasticsearch. I am running ELK on one server and MySQL on another; both are on the same network. I installed the MySQL JDBC connector on the server where I am running ELK. I am not sure what changes I have to make to the logstash.yml file and to the config file.

It seems you are very new to Elasticsearch. Using Logstash, you can load data from a MySQL server into Elasticsearch.
Try to explore Logstash.
We can load data based on a query [Data mismatch in elasticsearch when loading data from database using logstash]
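For MySQL specifically, a minimal pipeline config might look like the sketch below. This is only an illustration: the host, database, table, credentials, and jar path are placeholders you must adapt, and the driver class depends on your Connector/J version (`com.mysql.jdbc.Driver` for 5.x, `com.mysql.cj.jdbc.Driver` for 8.x).

```conf
input {
  jdbc {
    # Placeholders: replace host, port, database, user, password
    jdbc_connection_string => "jdbc:mysql://mysql-host:3306/mydb"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    # Path to the MySQL Connector/J jar you installed
    jdbc_driver_library => "/usr/share/java/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM mytable"
  }
}

output {
  # Host and index name are assumptions; adjust to your setup
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mytable"
  }
}
```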

Did you make any changes to the logstash.yml file, e.g. did you change the Logstash bind address? @balumurari1

Like balumurari said, you seem new to this. The same happened to me.
The key is to read a few things and test. Here is an example. I am reading Oracle data, but it should be pretty much the same.

[root@localhost logstash]# cat logstash.yml | grep -v '#'
node.name: mytest
path.data: /data/logstash
pipeline.workers: 4
pipeline.batch.size: 256
config.reload.automatic: false
config.debug: false
http.host: "localhost"
path.logs: /log/logstash

[root@localhost conf.d]# cat p1-jobs.conf | grep -v '#'
input {
  jdbc {
    jdbc_validate_connection => true
    jdbc_connection_string => "jdbc:oracle:thin:@ora01:1521/<db_name>"
    jdbc_user => "user"
    jdbc_password => "user1"
    jdbc_driver_library => "/usr/lib/oracle/12.2/client64/lib/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    statement => "select JOB, PROJECT from JOBS where STATUS='running'"
    last_run_metadata_path => "/tmp/logstash-db1.lastrun"
    record_last_run => true
    schedule => "*/2 * * * *"
  }
}

filter {
}

output {
  stdout { codec => rubydebug }
}
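Note that the stdout output with the rubydebug codec only prints each document to the console, which is useful for testing. To actually index the rows into Elasticsearch, you would also add an elasticsearch output; a sketch, where the host and index name are assumptions to adjust:

```conf
output {
  # Keep stdout while testing, remove it later
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "jobs"
  }
}
```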

Now run this config file from the command line whenever you want to test it:

/usr/share/logstash/bin/logstash -f p1-jobs.conf
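You can also check the config file for syntax errors without starting a pipeline, using Logstash's --config.test_and_exit flag:

```shell
/usr/share/logstash/bin/logstash -f p1-jobs.conf --config.test_and_exit
```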

Yeah, I am new to this. Thank you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.