That depends on several factors, including the performance of the host, the filters used, and so on.
Is there a per-second limit on how many events the elasticsearch output can stream?
If many log events arrive in the same second, can that cause performance problems in Logstash or in Kibana searches? Can it cause lost logs?
If you enable the persistent queue you shouldn't lose any events. Even without the persistent queue you'll be fine if you only have inputs that handle backpressure well. The file input, for example, will just stop reading from the input file and in that sense has its own queue system while the udp input needs to deal with whatever gets sent to it.
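For reference, the persistent queue mentioned above is enabled in `logstash.yml`; a minimal sketch (the size value here is illustrative, tune it for your own disk budget):

```
# logstash.yml -- switch from the default in-memory queue to a disk-backed one
queue.type: persisted
# Illustrative cap; once the queue fills, backpressure is applied to inputs
queue.max_bytes: 4gb
```

With this enabled, events are persisted to disk before being acknowledged to the input, so a Logstash restart or a downstream stall does not drop them.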
Does using a 50-line grok match filter slow down Logstash?
Large grok filters will of course be detrimental to performance, but whether that actually matters in your case is another story.
This blog post provides some very good guidelines on how to best use grok. In addition to this I would add that it helps being aware of other types of filters, so you do not try to use grok to parse content where better and more efficient options exist, e.g. lists of key-value pairs and JSON content.
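To illustrate that last point, key-value pairs and JSON payloads can be parsed with the dedicated `kv` and `json` filters instead of grok; a minimal sketch (the field names are made up for the example):

```
filter {
  # For messages like "user=alice action=login status=ok"
  kv {
    source      => "message"
    field_split => " "
  }

  # For messages whose body is a JSON object
  json {
    source => "message"
    target => "payload"
  }
}
```

Both filters are typically much cheaper than an equivalent regular-expression-based grok pattern, since they do not have to backtrack through the whole message.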