To get a useful answer, you need to describe your use case and give an indication of how much data you intend to index every day. If you have a logging and/or metrics use case with time-based data, the total volume will typically depend on the expected retention period.
I would recommend you have a look at the following resources:
I have gone through the above links and the Elastic use cases, and our use case will be log monitoring and analysis using dashboards with full-text search.
In our case it would be log analysis, where we will ingest data on a daily basis (approximately 1 TB per day) and run searches on it. This will continue for the next 2 years, and I think we should go with a hot-warm architecture.
Can you suggest what the hardware requirements would be to set up an Elasticsearch cluster along with Kibana?
If you want to ingest 1 TB per day and keep it searchable for 2 years, that will be a large cluster (or even multiple clusters). Given the scale, I would recommend running some tests to determine how much space your type of data takes up on disk and how much data your nodes can hold while still meeting your search latency requirements. This will let you estimate the size of the cluster(s) required far more accurately than anyone here can. The links I provided earlier should guide you on how to do this.
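To make the scale concrete, here is a rough back-of-envelope calculation in Python. Every ratio in it is an assumption (the index-to-raw ratio in particular varies widely with mappings, compression, and enrichment), so treat it as a starting point and replace the numbers with values measured from the tests described above:

```python
# Back-of-envelope storage sizing for a hot-warm cluster.
# ASSUMPTIONS (replace with measured values from your own tests):
raw_per_day_tb = 1.0        # stated ingest volume, 1 TB/day
retention_days = 730        # 2-year retention
index_to_raw_ratio = 1.1    # assumed on-disk index size vs raw size
replicas = 1                # one replica copy for resilience
hot_days = 7                # assumed days of data kept on hot nodes

total_primary_tb = raw_per_day_tb * retention_days * index_to_raw_ratio
total_with_replicas_tb = total_primary_tb * (1 + replicas)

hot_tb = raw_per_day_tb * hot_days * index_to_raw_ratio * (1 + replicas)
warm_tb = total_with_replicas_tb - hot_tb

print(f"Total storage incl. replicas: {total_with_replicas_tb:.0f} TB")
print(f"Hot tier:  {hot_tb:.1f} TB")
print(f"Warm tier: {warm_tb:.1f} TB")
```

With these assumed ratios the total comes to roughly 1,600 TB including replicas, which is why testing how much each node can actually hold (and whether warm nodes can use denser, cheaper storage) matters so much before buying hardware.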