How do I search text in a table containing 100M rows?

I have a database that contains over 100 million rows in one table.

We started with MySQL, but searching from the application (PHP, CakePHP) has been difficult: it takes about an hour to search for words in a text field.
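For reference, the usual culprit is a leading-wildcard `LIKE '%word%'`, which cannot use a B-tree index and forces a full table scan. A minimal sketch of this with SQLite (the same principle applies to MySQL/InnoDB; the table and column names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("CREATE INDEX idx_body ON docs (body)")
conn.executemany(
    "INSERT INTO docs (body) VALUES (?)",
    [("some long text about cats",), ("another row mentioning dogs",)],
)

# A leading wildcard defeats the index: the planner must scan every row.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM docs WHERE body LIKE '%dogs%'"
).fetchall()
print(plan[0][3])  # a full table scan, e.g. "SCAN docs"
```

On 100 million rows that scan is exactly the hour-long query described above; an inverted-index engine (or MySQL's own FULLTEXT indexes) exists to avoid it.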

We then migrated to MongoDB, but it did not solve the search problem: it still takes 40 minutes to find a word in a text field.

We are currently thinking about moving to Elasticsearch for better search times. Is this a good approach, or is there a better tool that can search in under 3 minutes? My dedicated server is an Intel Xeon W3520 (4 cores / 8 threads, 2.66 GHz) with 32 GB ECC RAM, running Debian with ISPConfig 3.

I want to know whether this is the right solution for my server and whether it will handle the 100 million rows well.

It's very different from WHERE ... LIKE in MySQL, but you can absolutely search
100 million rows in tens of milliseconds, as long as you don't try to return
a huge number of them. You should spend a while reading the getting-started
guide so you understand what is going on before you make a decision, though.
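The speed difference comes from the inverted index: instead of scanning every document for a word, the engine keeps a map from each term to the documents containing it, so a term lookup is a single probe whose cost doesn't grow with the number of rows scanned. A toy sketch of the idea in Python (Lucene/Elasticsearch do this with far more sophistication):

```python
from collections import defaultdict

docs = {
    1: "the quick brown fox",
    2: "lazy dogs sleep all day",
    3: "quick thinking saves the day",
}

# Build the inverted index once, at write time.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

# A term query is now a single dictionary lookup, not a scan of every document.
def term_query(term):
    return sorted(index.get(term.lower(), set()))

print(term_query("quick"))  # -> [1, 3]
print(term_query("day"))    # -> [2, 3]
```

The write-time cost of building the index is what buys the millisecond reads, which is why indexing 100M rows into Elasticsearch takes a while but querying them afterwards does not.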


Thank you for your answer. In your opinion, what is the best configuration for my server to get responses within 3 minutes at most?

Read the docs like I said, and stick to term queries where you can - work on
making sure your analysis spits out the terms you want to search for.
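"Analysis" here is the tokenizing/normalizing step that turns a field into stored terms at index time; a term query is an exact match against those stored terms, with no analysis applied to the query string itself. A rough illustration of why that matters, using a simple lowercase word-splitting analyzer as a stand-in:

```python
import re

def analyze(text):
    # Roughly what a simple analyzer does: split on non-word characters,
    # then lowercase each token. Real analyzers can also strip stopwords,
    # stem words, etc.
    return [t.lower() for t in re.findall(r"\w+", text)]

stored_terms = analyze("The QUICK Brown-Fox!")
print(stored_terms)  # ['the', 'quick', 'brown', 'fox']

# A term query compares the raw query string against stored terms:
print("QUICK" in stored_terms)  # False -- the unanalyzed string never matches
print("quick" in stored_terms)  # True  -- matches the analyzed term
```

So if a term query comes back empty, the first thing to check is what terms the analyzer actually produced for the field, and whether your query string matches them exactly.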