There's a binary version of JSON called BSON that MongoDB uses. There's an add-on project for Jackson that adds BSON support, which lets you easily parse and serialize BSON: https://github.com/michel-kraemer/bson4jackson. I've actually integrated it recently into my jsonj library (which uses Jackson). But I don't recommend using it unless you really need it (e.g. to integrate with MongoDB).
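For reference, wiring bson4jackson into Jackson is just a matter of swapping the factory. This is a sketch only; it needs jackson-databind and bson4jackson on the classpath, so it won't run standalone:

```java
import java.io.ByteArrayOutputStream;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;
import de.undercouch.bson4jackson.BsonFactory;

// Sketch: requires jackson-databind and bson4jackson dependencies.
public class BsonMapperSketch {
    public static void main(String[] args) throws Exception {
        // Plugging BsonFactory into ObjectMapper makes the usual
        // readValue/writeValue calls produce and consume BSON bytes
        // instead of JSON text.
        ObjectMapper mapper = new ObjectMapper(new BsonFactory());

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        mapper.writeValue(out, Map.of("message", "hello"));

        Map<?, ?> parsed = mapper.readValue(out.toByteArray(), Map.class);
        System.out.println(parsed);
    }
}
```

Everything else (data binding, tree model, annotations) works as with plain JSON, since only the low-level factory changes.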
There are a few counterintuitive things about BSON. If size is your main concern: don't use it. You'll typically use more space, because BSON adds type bytes and length prefixes on top of the field names. If parsing overhead is your main concern, the biggest penalty is mapping to in-memory object structures, and that doesn't change much between BSON and JSON. Jackson includes a streaming JSON parser which is pretty much as fast as it gets (for JSON). As far as I can measure, BSON support in Jackson is more verbose and not a whole lot faster. Which is why I recommend not using it.
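You can see the size overhead by hand-encoding a tiny document per the BSON spec. The sketch below (stdlib only, document and field names chosen arbitrarily) encodes `{"a":"b"}` as BSON: a 4-byte document length, one string element (type byte, NUL-terminated field name, 4-byte value length, NUL-terminated value), and a trailing document terminator:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

// Hand-encodes {"a":"b"} per the BSON spec and compares its size to the
// JSON text. The type bytes and length prefixes make small BSON
// documents *larger* than their JSON equivalents.
public class BsonSizeDemo {

    static byte[] encodeBson() {
        ByteArrayOutputStream body = new ByteArrayOutputStream();
        body.write(0x02);                                       // element type: UTF-8 string
        body.writeBytes("a".getBytes(StandardCharsets.UTF_8));  // field name...
        body.write(0x00);                                       // ...as NUL-terminated cstring
        byte[] value = "b".getBytes(StandardCharsets.UTF_8);
        body.writeBytes(int32le(value.length + 1));             // value length incl. NUL
        body.writeBytes(value);
        body.write(0x00);                                       // value terminator
        body.write(0x00);                                       // document terminator
        byte[] elements = body.toByteArray();

        ByteArrayOutputStream doc = new ByteArrayOutputStream();
        doc.writeBytes(int32le(elements.length + 4));           // total document length
        doc.writeBytes(elements);
        return doc.toByteArray();
    }

    // BSON integers are little-endian int32.
    static byte[] int32le(int v) {
        return ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN).putInt(v).array();
    }

    public static void main(String[] args) {
        String json = "{\"a\":\"b\"}";
        System.out.println("JSON: " + json.length() + " bytes, BSON: "
                + encodeBson().length + " bytes");
        // 9 bytes of JSON become 14 bytes of BSON.
    }
}
```

BSON pays this overhead to buy seekability (a reader can skip fields by length without parsing them), which matters for a database like MongoDB but not for a wire format you parse front to back anyway.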
As far as I understand the Elasticsearch architecture, it uses a streaming JSON parser. I don't think performance would differ hugely if they were to use BSON (it might actually degrade a little). Much of the benefit of dedicated binary protocols comes from their reduced size, and you can probably get similar improvements by simply enabling gzip compression (which Elasticsearch supports). Binary protocols, on the other hand, are difficult to deal with in terms of evolving the API, adding new features, and using the API over HTTP.
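To illustrate where the wire-size win actually comes from, here's a small stdlib-only sketch that gzips a repetitive JSON payload (the fake "search hits" structure and field names are made up for the example, but bulk responses really do repeat the same field names per document, which is exactly the redundancy gzip removes):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

// Compresses a repetitive JSON array to show that gzip recovers most of
// the size advantage a binary protocol would offer.
public class GzipJsonDemo {

    static byte[] gzip(byte[] input) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(input);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Fake search hits: the same field names repeated once per
        // document, as in a typical bulk or search response.
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < 1000; i++) {
            if (i > 0) sb.append(',');
            sb.append("{\"_index\":\"logs\",\"_id\":\"").append(i)
              .append("\",\"message\":\"hello world\"}");
        }
        sb.append(']');
        byte[] raw = sb.toString().getBytes(StandardCharsets.UTF_8);
        byte[] zipped = gzip(raw);
        System.out.println(raw.length + " bytes raw, "
                + zipped.length + " bytes gzipped");
    }
}
```

Because HTTP content negotiation handles this transparently (`Accept-Encoding: gzip`), you keep a debuggable text API while shipping compressed bytes on the wire.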
Elasticsearch already avoids a lot of parsing and serializing overhead. For example, it doesn't reconstruct in-memory trees for your documents in most cases, and the way it fetches and stores document JSON is pretty efficient already. Also, given that it is actually a JSON document store, it would be kind of weird for it not to have a JSON-based API.
It is true that parser overhead is typically lower for binary protocols. However, parsing overhead is unlikely to be a big factor in overall Elasticsearch performance for most setups. This is why, e.g., Logstash recommends using the HTTP protocol as of 2.0: the advantage of using an embedded node and its internal binary protocol is just not enough to justify the complexity of that solution.