As @Christian_Dahlqvist said, you could send your logs directly to Elasticsearch via the HTTP API, but using an intermediate log file can be useful as a persistent buffer and queue.
If you choose to write a log file and ingest it via Filebeat, the Filebeat configuration documentation can get you started. In particular, the inputs documentation, the Elasticsearch output documentation, and the JSON decoding documentation could be of use.
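As a rough starting point, a minimal `filebeat.yml` might look like the sketch below. The input id, log path, and Elasticsearch host are placeholders you would replace with your own values, and the exact JSON decoding options vary by Filebeat version (older versions use the `log` input with `json.*` settings instead of the `filestream` input with an `ndjson` parser):

```yaml
filebeat.inputs:
  - type: filestream
    id: app-json-logs              # hypothetical id, must be unique per input
    paths:
      - /var/log/myapp/*.json      # hypothetical path to your JSON log files
    parsers:
      - ndjson:
          target: ""               # merge decoded JSON fields at the event root
          add_error_key: true      # mark events whose JSON fails to decode

output.elasticsearch:
  hosts: ["https://localhost:9200"]   # hypothetical host; add auth as needed
```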
For the new fields added by decoding your JSON data, I would strongly recommend adding them to the index template using the `setup.template.append_fields` setting. This ensures Elasticsearch maps your fields to the correct data types instead of guessing them via dynamic mapping.
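For example, if your JSON logs carried a numeric duration and a user identifier (both hypothetical field names here), the template additions could look like this:

```yaml
setup.template.append_fields:
  - name: response_time_ms   # hypothetical numeric field from your JSON logs
    type: long
  - name: user_id            # hypothetical identifier; keyword avoids full-text analysis
    type: keyword
```

Note that appended fields take effect when the template is (re)loaded, so you may need to re-run template setup after changing them.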
If you run into any roadblocks along the way, please don't hesitate to ask for specific advice.