When using Logstash to index data from MySQL to Elasticsearch, only the first row is being displayed

This is my logstash configuration file:

    input {
      jdbc {
        jdbc_driver_library => "/home/vatsa/logstash/mysql-connector-java-5.1.38-bin.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/stud"
        jdbc_user => "root"
        jdbc_password => ""
        statement => "select * from det"
      }
    }

    output {
      elasticsearch {
        index => "det"
        document_type => "contact"
        document_id => "%{uid}"
        hosts => "localhost:9200"
      }
      stdout { codec => json_lines }
    }

Logstash reports that startup and shutdown completed. But when I do a GET request using

    curl -XGET 'localhost:9200/det/_search?pretty&q=*'

this is the output:

"hits" : { "total" : 1, "max_score" : 1.0, "hits" : [ { "_index" : "det", "_type" : "contact", "_id" : "%{uid}", "_score" : 1.0, "_source":{"ID":5,"NAME":"SHUBH","USN":"099","@version":"1","@timestamp":"2016-01-26T05:42:08.362Z"} } ] }

But there are 5 rows in the table. What is the reason for this, and how can it be solved?

Your database records do not seem to contain a field called 'uid', so the literal string '%{uid}' is used as the document ID for every record, causing the same document to be overwritten over and over. Use the stdout output plugin with a rubydebug codec to see the exact structure of your events and the fields available in Logstash.
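
For example, a sketch of the output section, assuming the column really arrives as uppercase ID exactly as the _source above suggests (field references in %{...} are case-sensitive):

    output {
      stdout { codec => rubydebug }    # dump each event with all of its fields to the console
      elasticsearch {
        index => "det"
        document_type => "contact"
        document_id => "%{ID}"         # must reference a field that actually exists on the event
        hosts => "localhost:9200"
      }
    }

Once you can see the real field names in the rubydebug output, point document_id at one of them and each row will be indexed as its own document.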

Hey, thanks a lot, sir. It worked.

But my MySQL table has 191000 rows. When I try "logstash/bin/logstash -f /home/vatsa/logs/conts-out.conf", it runs for a while and then shows this error:

java.lang.OutOfMemoryError: Java heap space
Dumping heap to /home/vatsa/logstash/heapdump.hprof
Exception flushing buffer at interval! {:error=>"Java heap space", :class=>"Java::JavaLang::OutOfMemoryError", :level=>:warn}
Error: Your application used more memory than the safety cap of 1G.
Specify -J-Xmx####m to increase it (#### = cap size in MB).
Specify -w for full OutOfMemoryError stack trace

I didn't understand the error. How do I fix this?
Thanks in advance.

Does it make a difference if you set the jdbc_fetch_size parameter, e.g. to 10000?
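
A minimal sketch, assuming the same jdbc input as above:

    input {
      jdbc {
        jdbc_driver_library => "/home/vatsa/logstash/mysql-connector-java-5.1.38-bin.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/stud"
        jdbc_user => "root"
        jdbc_password => ""
        statement => "select * from det"
        jdbc_fetch_size => 10000    # hint the driver to fetch rows in batches instead of buffering the full result set
      }
    }

Note that the fetch size is only a hint; MySQL Connector/J in particular tends to buffer the entire result set by default, so it may also need useCursorFetch=true appended to the connection string before the setting takes effect. Failing that, the -J-Xmx option mentioned in the error message raises the heap cap.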

No, sir. Even with jdbc_fetch_size set, the same Java heap space error is thrown, and I still can't make sense of it.
