I want to use Filebeat to send logs to Elasticsearch with a simple structure (a date in a custom format, an HTTP code, processing time in ms, and the query text).
So each log line should be parsed, and those values should go into separate fields in the index.
I also want to add another field (the length of the query text in characters, assuming it is UTF-8 encoded), and to truncate the actual text so it fits into 32 KB (because of an ES limitation).
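To make the two transformations concrete, here is a minimal Python sketch of what I mean: compute the length in characters (not bytes), and truncate the UTF-8 text at a byte budget without splitting a multibyte character. The 32766-byte figure is Lucene's maximum term length for keyword fields; the function name is just for illustration.

```python
MAX_BYTES = 32766  # Lucene's maximum keyword term length, the "32 KB" ES limit

def query_stats(query: str, max_bytes: int = MAX_BYTES):
    """Return (char_length, possibly_truncated_query) for a UTF-8 string."""
    char_length = len(query)            # length in symbols, not bytes
    encoded = query.encode("utf-8")
    if len(encoded) <= max_bytes:
        return char_length, query
    # Cut at the byte limit, then drop any trailing partial multibyte sequence
    truncated = encoded[:max_bytes].decode("utf-8", errors="ignore")
    return char_length, truncated

length, text = query_stats("SELECT * FROM users WHERE name = 'Іван'")
```

The same logic would ultimately have to run somewhere in the pipeline, whether that is Logstash, a Filebeat processor, or an ES ingest script.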
As far as I understand, I can do all of this in Logstash (even add a custom handler written in Ruby).
The question is: is it possible to avoid Logstash entirely and achieve these transformations with Filebeat alone (possibly together with the ES ingest API, pipelines, etc.)?
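For context, this is roughly the kind of ingest pipeline I imagine on the ES side. It is only a sketch under assumptions: I am assuming a pipe-delimited log line like `2024-01-15 10:00:00.123|200|37|SELECT * FROM users`, and the pipeline name and field names (`ts`, `http_code`, `time_ms`, `query`, `query_length`) are made up for illustration. Note the Painless `length()` counts UTF-16 code units and the truncation is by characters, so it only approximates a 32 KB byte limit.

```shell
curl -X PUT "localhost:9200/_ingest/pipeline/app-logs" \
  -H 'Content-Type: application/json' -d'
{
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{ts}|%{http_code}|%{time_ms}|%{query}"
      }
    },
    { "date":    { "field": "ts", "formats": ["yyyy-MM-dd HH:mm:ss.SSS"] } },
    { "convert": { "field": "http_code", "type": "integer" } },
    { "convert": { "field": "time_ms",   "type": "long" } },
    {
      "script": {
        "lang": "painless",
        "source": "ctx.query_length = ctx.query.length(); if (ctx.query.length() > 8000) { ctx.query = ctx.query.substring(0, 8000); }"
      }
    }
  ]
}'
```

Filebeat would then be pointed at it with `pipeline: app-logs` under `output.elasticsearch` in `filebeat.yml`. Is this the right direction, or is Logstash still needed for something here?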