Can I use ES for 100 GB of Data?

We have around 100 GB of data. Can I use Elasticsearch to search across this amount of data, and will it respond in milliseconds? If so, what machine configuration would be needed to keep response times within 200 ms?

100 GB isn't a lot of data. The machine configuration you need for reasonable query performance depends on various factors, including how the data is mapped, what kind of queries you make, and how many concurrent queries you expect. To give you some kind of idea, I have ~300 GB of data per node on VMs with 64 GB of RAM and four cores each. The Kibana query performance is adequate, but on the slow side.
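Whether a given setup is enough usually comes down to heap pressure and how much RAM is left over for the filesystem cache. As a rough sketch (assuming the official Python client and a node reachable at `localhost:9200`, both of which are just placeholder assumptions), the cat nodes API gives a quick read on memory pressure:

```python
from elasticsearch import Elasticsearch

# Placeholder connection details; point this at one of your nodes.
es = Elasticsearch(["http://localhost:9200"])

# Heap usage and available RAM for the filesystem cache are usually the
# first things to check when query latency is on the slow side. Column
# names can vary slightly between Elasticsearch versions.
print(es.cat.nodes(v=True, h="name,heap.percent,ram.percent,load"))
```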

Sorry, to clarify: our data is actually 300 GB. As far as queries are concerned, we are not using aggregations or prefix queries, but we do use ngram tokenizers. We have a 5-node cluster: 2 client nodes (acting as load balancers) and 3 master+data nodes.
As far as the machines are concerned, each has 4 cores and 4 GB of RAM. Will that suffice, or do we need to upgrade?
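For reference, here is a stripped-down sketch of the kind of ngram setup we use, via the Python client (the index, type, and field names are placeholders, not our real ones). The min_gram/max_gram range matters a lot, since ngrams multiply the number of indexed terms and therefore disk and memory usage:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])  # placeholder address

# Placeholder index with an ngram analyzer; keeping the gram range narrow
# limits the token explosion that drives up disk and heap usage.
es.indices.create(
    index="my_index",
    body={
        "settings": {
            "analysis": {
                "tokenizer": {
                    "my_ngram_tokenizer": {
                        "type": "nGram",
                        "min_gram": 3,
                        "max_gram": 4,
                    }
                },
                "analyzer": {
                    "my_ngram_analyzer": {
                        "type": "custom",
                        "tokenizer": "my_ngram_tokenizer",
                        "filter": ["lowercase"],
                    }
                },
            }
        },
        "mappings": {
            "my_type": {
                "properties": {
                    "title": {
                        "type": "string",
                        "analyzer": "my_ngram_analyzer",
                    }
                }
            }
        },
    },
)
```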

That's very hard to say, but my hunch is that it'll be enough. If you have non-analyzed fields, you can reduce the RAM requirement by enabling doc values on them.
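As a minimal sketch of what that mapping change looks like (Python client, placeholder index/type/field names; this is the older syntax where doc values must be enabled explicitly, whereas recent versions enable them by default for not_analyzed fields). Note that doc_values can only be set when a field is first created, so changing it on an existing field requires reindexing:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])  # placeholder address

# Placeholder not_analyzed field with doc values enabled: sorting and
# faceting on it then read from disk-backed columnar storage instead of
# loading fielddata into the JVM heap.
es.indices.put_mapping(
    index="my_index",
    doc_type="my_type",
    body={
        "my_type": {
            "properties": {
                "status": {
                    "type": "string",
                    "index": "not_analyzed",
                    "doc_values": True,
                }
            }
        }
    },
)
```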

Cool then, thanks!