Tuning Elasticsearch for a high volume of data

I have around eight fields in my ELK queries: bank name, transaction ID, timestamp, response code, transaction type, notes, transaction channel, and transaction amount. In Kibana I have used the following filters in different dashboards:
exists:bankname
exists:transactionid
Besides these, I have different pie charts based on transaction channel (e.g. mobile, internet, ATM), transaction type, and response code (success/failure).
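For reference, the pie charts above aggregate on those fields, and aggregations in Elasticsearch work best on keyword-mapped fields. Below is a minimal index-template sketch; the field names and index pattern are my assumptions, not the actual mapping:

```shell
# Hypothetical template for the transaction indices (legacy _template API).
# Field names ("bankname", "transaction_channel", etc.) are assumptions.
curl -X PUT "localhost:9200/_template/transactions" \
  -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["transactions-*"],
  "mappings": {
    "properties": {
      "bankname":            { "type": "keyword" },
      "transactionid":       { "type": "keyword" },
      "timestamp":           { "type": "date" },
      "response_code":       { "type": "keyword" },
      "transaction_type":    { "type": "keyword" },
      "transaction_channel": { "type": "keyword" },
      "transaction_amount":  { "type": "double" },
      "notes":               { "type": "text" }
    }
  }
}'
```

Keyword fields keep the terms aggregations behind the pie charts cheap; only `notes`, which is presumably free text and not aggregated on, is mapped as `text`.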

Can you please suggest whether I need to tune anything in Elasticsearch so that the dashboard performs well with a large amount of data?

Are you not getting good performance now?
What does your setup consist of?

Hi Mark,
The performance is not bad in our development environment, which receives about 1,000 transactions every 10 minutes. In production we are expecting 5,000 transactions per second, and the dashboard will display data for the last two weeks. Hence I wanted to check whether I need to do some tuning to maintain the performance of the dashboard.
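At 5,000 transactions per second with a two-week retention window, time-based (e.g. daily) indices plus a less aggressive refresh interval are the usual starting points, so old days can be dropped cheaply and indexing does less refresh work. A hedged sketch of such settings; the shard counts and interval here are placeholders to benchmark, not recommendations:

```shell
# Assumed index pattern "transactions-*"; values are starting points only.
curl -X PUT "localhost:9200/_template/transactions-settings" \
  -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["transactions-*"],
  "settings": {
    "number_of_shards": 3,
    "number_of_replicas": 1,
    "refresh_interval": "30s"
  }
}'

# With daily indices, expiring the two-week window is a cheap index delete
# rather than a costly delete-by-query (date below is illustrative):
curl -X DELETE "localhost:9200/transactions-2024.01.01"
```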

The setup is as follows: the Java application appends an entry to a log file for each transaction. These log files are fed to Logstash; each Logstash instance reads three such input files. Logstash ships the events to Elasticsearch, and Kibana renders the dashboard from Elasticsearch.
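The pipeline described above could look roughly like the Logstash configuration below. The file paths, index name, and host are assumptions for illustration; the actual parsing (grok/dissect) of the log lines is omitted:

```shell
# Sketch of a Logstash pipeline config matching the described setup
# (one instance, three input log files, daily indices). Paths are hypothetical.
cat > /etc/logstash/conf.d/transactions.conf <<'EOF'
input {
  file {
    path => [
      "/var/log/app/transactions-a.log",
      "/var/log/app/transactions-b.log",
      "/var/log/app/transactions-c.log"
    ]
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # Daily indices so the two-week dashboard window maps to 14 indices
    index => "transactions-%{+YYYY.MM.dd}"
  }
}
EOF
```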