Migrating MySQL data to Elasticsearch using Logstash

Hi,
I'm new to Kibana. I need a brief explanation of how to load MySQL data into Elasticsearch. I have installed all the requirements and tried to import the data using the jdbc input plugin, but I'm unable to do it. I don't even understand how the process works; I just copy-pasted the code from a website. Can anyone explain, step by step, how to do the conversion using Logstash?

TIA,
padmaja

Have you read https://www.elastic.co/blog/logstash-jdbc-input-plugin?

It's highly likely that nobody is going to write a general and detailed explanation of this just for you. If you ask a more specific question you'll have a bigger chance of getting useful responses.

Comment out your elasticsearch output for now and verify that the stdout outputs are producing data, indicating that the jdbc input is working. Introduce complexity gradually.
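
For example, a stripped-down first version might look like this (all connection details are placeholders for your own environment):

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/yourdb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM yourtable"
  }
}
output {
  # stdout only for now; add elasticsearch back once events appear here
  stdout { codec => rubydebug }
}

If events print to the console, the jdbc side works and you can re-enable the elasticsearch output.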

Here is the logstash.conf file where I specified the input and output:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/kibana"
    jdbc_user => "xxx"
    jdbc_password => "xxxxx"
    jdbc_driver_library => "/root/mysql-connector-java-5.1.30/mysql-connector-java-5.1.30-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM datalog"
  }
}
output {
  elasticsearch {
    "hosts" => "localhost:9200"
  }
  stdout { codec => rubydebug }
}
After running the above file with ./logstash -f logstash.conf, we get the warning and output below:

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
{
          "name" => "abc",
            "id" => 1,
    "@timestamp" => 2018-09-24T08:48:56.067Z,
      "@version" => "1",
        "testid" => 10
}
{
          "name" => "xyz",
            "id" => 2,
    "@timestamp" => 2018-09-24T08:48:56.068Z,
      "@version" => "1",
        "testid" => 10
}
{
          "name" => "pqr",
            "id" => 3,
    "@timestamp" => 2018-09-24T08:48:56.068Z,
      "@version" => "1",
        "testid" => 30
}

But no index is generated for the above file. Please suggest any modifications needed in the above file.

In your elasticsearch output there shouldn't be any double quotes around hosts. That might not be why it doesn't work for you, but start by fixing it anyway.
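
That is, the output section should look something like this (the explicit index setting is optional; I'm adding it here only so the resulting index has a predictable name):

output {
  elasticsearch {
    hosts => "localhost:9200"
    # "datalog" is just an example name, not required
    index => "datalog"
  }
  stdout { codec => rubydebug }
}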

After removing the double quotes we are no longer getting the errors, but the index pattern is still not getting generated.

Output:

OpenJDK 64-Bit Server VM warning: If the number of processors is expected to increase from one, then you should configure the number of parallel GC threads appropriately using -XX:ParallelGCThreads=N
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
{
            "id" => 1,
          "name" => "abc",
        "testid" => 10,
    "@timestamp" => 2018-09-25T04:44:41.239Z,
      "@version" => "1"
}
{
            "id" => 2,
          "name" => "xyz",
        "testid" => 10,
    "@timestamp" => 2018-09-25T04:44:41.239Z,
      "@version" => "1"
}
{
            "id" => 3,
          "name" => "pqr",
        "testid" => 30,
    "@timestamp" => 2018-09-25T04:44:41.240Z,
      "@version" => "1"
}

The indices generated:
[root@localhost bin]# curl localhost:9200/_cat/indices?v

health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open .kibana sIUptrmVQaWxrA3b8NYWoA 1 1 154 1 168.6kb 168.6kb

You can also refer to the tutorial below if you like.
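
As a quick check, you can also ask Elasticsearch directly whether Logstash wrote anything. With no index option set, the elasticsearch output writes to daily logstash-YYYY.MM.dd indices, so a wildcard search should show the documents if they arrived:

curl 'localhost:9200/logstash-*/_search?pretty&size=1'

If that returns no hits, the documents never reached Elasticsearch and the problem is on the Logstash side.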

Thanks for the response.
I have provided the inputs in my file as mentioned in your tutorial. However, the index is still not getting generated.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.