MySQL database shows only latest record


(Mamta ) #1

Hi,
I am using ELK 5.6.8 version. My ELK is running on RedHat machine. When I first run logstash.conf for MySQL database then I was able to see all the records of one table but now I am not able to see all records of one able. Kibana shows only last/latest row data. When I run logstash.conf file from the terminal on terminal I am able to see all the records of one table but on kibana only last records. When I used curl command as shown as below for elasticsearch to see MySQL index data then It shows only that last records which show on kibana.

curl -XGET 'http://localhost:9200/mysql/_search?pretty'

How can I identify whether Elasticsearch is storing all the records or only the last one?


(Eduardo González de la Herrán) #2

Hi @mamta,

By default, the _search API will show you only 10 hits if you don't specify otherwise. But in the response you should also be able to see the total number of hits that the search returned.

Take a look at the search URI and body request documentation:
https://www.elastic.co/guide/en/elasticsearch/reference/current/search-uri-request.html
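For example, you can raise the page size and check the total (this assumes your index really is named `mysql`):

```shell
# Return up to 100 hits instead of the default 10.
# The "total" field under "hits" in the response tells you how many
# documents matched overall, regardless of how many are returned.
curl -XGET 'http://localhost:9200/mysql/_search?pretty&size=100'
```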

And you can always check the stats API for general information about the index:
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-stats.html

In your case it would be something like :9200/mysql/_stats
There you should be able to see, for example, the number of docs in your index, and you can compare it with what you expect from logstash.
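An even quicker check is the `_count` API, which returns just the document count for the index (again assuming the index is named `mysql`):

```shell
# Returns a small JSON document whose "count" field is the number of
# documents in the index; compare it against the row count in MySQL.
curl -XGET 'http://localhost:9200/mysql/_count?pretty'
```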

Another possible explanation for not having all the documents in Elasticsearch could be this:

  • If you have ever deleted your index in Elasticsearch, Logstash may not have re-sent all the information (since it considers it already sent). In that case I would recommend forcing Logstash to send all the data again.

Regards!
Eduardo


(Mamta ) #3

Okay, let me rephrase the question. I have a config file that connects to a MySQL database and retrieves all the records from a table. The table contains only 15 records. However, only the latest record (a single record) is being pulled and displayed from the table.


(Christian Dahlqvist) #4

What does your config file look like? Are you by any chance setting the document ID based on a parameter that does not exist, causing repeated updates of the same document? (You can check this by looking at the ID of the document you see in Kibana.)


(Mamta ) #5

I am uploading my logstash.conf file. I am not able to understand the problem. When I ran the conf file the first time, Kibana showed all 15 records, but now when I run it, it shows only the latest record.
Thank you.


(Christian Dahlqvist) #6

When you look at the output to stdout, are the fields in the document upper-case?


(Mamta ) #7

I am not able to understand your question. Could you please clarify?


(Christian Dahlqvist) #8

Can you show me what the record you see looks like in Kibana?


(Mamta ) #9

Sure!
May I ask you for one favor?


(Mamta ) #10

Is it possible for you to join me on Zoom?
That would help me explain the problem and give you more clarification.
Thank you


(Christian Dahlqvist) #11

No, that is not possible. I prefer discussions to be public so other users can benefit.


(Mamta ) #12

Ok.

I am uploading the Kibana output of the MySQL database.


(Christian Dahlqvist) #13

As you can see there, all field names in the document are lower-cased, which is why document_id => "%{LOG_ID}" is not working: the sprintf reference never resolves, so the same document is updated repeatedly. Change it to document_id => "%{log_id}" and it should work.
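For reference, the relevant part of the elasticsearch output section would look something like this (the host and index name are placeholders, since the actual config file isn't shown here):

```
output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "mysql"
    document_id => "%{log_id}"   # must match the lower-cased field name
  }
}
```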


(Mamta ) #14

Let me try now. I will let you know.
Thank you.


(Mamta ) #15

Hey,

That worked for me. Thank you so much.

I have one more question. I have 4-5 conf files for different logs, so every time I have to run the Logstash conf file for the current log. I want the conf files to run continuously in the background. I tried running bin#./logstash -f logstash.conf and also running from the /etc/logstash/conf.d directory, but without success. I also tried nohup and crontab to run the conf files in the background, but only one file runs continuously at a time.
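To show what I mean, this is roughly what I tried (paths approximate):

```shell
# Run one pipeline in the background; repeating this for a second
# conf file is where it stops working for me.
nohup bin/logstash -f /etc/logstash/conf.d/logstash.conf &
```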

Thank you so much.


(system) #16

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.