Analyze in Logstash or Index in Elasticsearch

I have a "best practices" question.
Should I analyze/parse in Logstash and send the results of a successful analysis as extra fields to ES,
OR
should I send the data to ES as-is and analyze it on the ES side using a mapping template with an associated analyzer?

Let's take a particular example. Suppose my web server logs have a field called request.
The value of this field is the middle part of the first line of the HTTP request.
In other words, my request field is
"a/b/c?id=1&tag=2"
for a first line of "GET /a/b/c?id=1&tag=2 HTTP/1.1".

I wish to be able to use an ES search query to find requests with a path of "a/b/c" where the value of id is 1.
I could parse the request on the Logstash side and send the path and each query param as extra fields to ES, or I could leave the request as a plain string and write search queries that match "id == 1" against it.
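For the first option, here is a sketch of what that parsing could look like on the Logstash side, assuming the field is already named request; the target field names (request_path, request_query, request_params) are illustrative, not from any real config:

```
filter {
  # Split "a/b/c?id=1&tag=2" into the part before "?" and the part after it.
  grok {
    match => { "request" => "^(?<request_path>[^?]+)(?:\?(?<request_query>.*))?$" }
  }
  # Expand "id=1&tag=2" into request_params.id = "1", request_params.tag = "2".
  kv {
    source      => "request_query"
    field_split => "&"
    value_split => "="
    target      => "request_params"
  }
}
```

With documents indexed this way, "path is a/b/c and id is 1" becomes an ordinary bool query on request_path and request_params.id, with no analyzer tricks needed at search time.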

Note, that while I chose request as my example, I am interested in the meta question of

  • analyze in logstash and add fields OR
  • analyze using a mapping template on insert into ES OR
  • do no analysis for log fields, and write regular expressions for searching
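To make the second option concrete, a minimal sketch of a mapping template that analyzes the raw request at index time; the template name, index pattern, and analyzer name are all hypothetical, and the pattern analyzer shown splits on "/", "?", and "&" so that "a/b/c?id=1&tag=2" is tokenized as [a, b, c, id=1, tag=2], keeping each key=value pair as a single searchable term:

```
PUT _index_template/weblogs
{
  "index_patterns": ["weblogs-*"],
  "template": {
    "settings": {
      "analysis": {
        "analyzer": {
          "request_analyzer": {
            "type": "pattern",
            "pattern": "[/?&]"
          }
        }
      }
    },
    "mappings": {
      "properties": {
        "request": { "type": "text", "analyzer": "request_analyzer" }
      }
    }
  }
}
```

With this mapping, a simple match query for "id=1" on the request field would hit the id=1 token directly, without any extra fields having been added in Logstash.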

As I am new to Logstash and ES, I'm not sure about the pros and cons of these three approaches.
Any and all feedback is much appreciated.

I'd do it in LS myself, because I reckon most of the data should be enriched and structured before sending it to ES.

Mark Walcom: Thank you for replying.

In my use case, searching by the value of a query param will be rare, but still required.
So, again, my inclination is to avoid the verbosity of expanding query params and paths in LS,
and when I need to search by query param, I can use the approach outlined in:
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html
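For example, with the request field left as text under the default standard analyzer, "a/b/c?id=1&tag=2" is tokenized as [a, b, c, id, 1, tag, 2], so a query_string query with quoted phrases can approximate the search; the index name here is hypothetical:

```
GET weblogs-*/_search
{
  "query": {
    "query_string": {
      "default_field": "request",
      "query": "\"a/b/c\" AND \"id=1\""
    }
  }
}
```

Note this is only an approximation: the quoted strings become phrase matches on the analyzed tokens ([a, b, c] and [id, 1] in adjacent positions), so it can return false positives, e.g. a request whose path merely contains a/b/c as a sub-path.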