Dumping requests

I'm reverse-engineering and optimizing an ES instance.
One of the things that would help me enormously is to understand what operations (updates and queries) are coming in.
What's the best way to dump & analyze the operations initiated by clients?

Additionally, I'd like to audit client calls to make sure some of my prescriptions are followed: I want to catch "rogue" requests that don't follow the rules we set. For various reasons, code reviews won't do.

There isn't really a way to do this directly within Elasticsearch itself. You can get some information from GET _nodes/stats and GET <index>/_stats, but that won't give enough detail to answer your questions. The simplest way to get hold of the requests themselves is probably to put a proxy in front of ES and capture the requests there.
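If you do go the proxy route, even a tiny one will do. Here's a minimal sketch in Python's standard library (assuming ES listens on localhost:9200 with no TLS or auth; the listen port 8200 and the function names are made up for illustration, and this is not production code):

```python
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://localhost:9200"  # assumed ES address; adjust as needed

def format_request(method, path, body):
    # Render one captured request (method, path, optional body) as a log line.
    text = body.decode("utf-8", "replace") if body else ""
    return f"{method} {path} {text}".rstrip()

class LoggingProxy(BaseHTTPRequestHandler):
    def _proxy(self):
        length = int(self.headers.get("Content-Length") or 0)
        body = self.rfile.read(length) if length else b""
        print(format_request(self.command, self.path, body))  # the capture point

        # Forward the request to ES and relay the response back unchanged.
        req = urllib.request.Request(UPSTREAM + self.path, data=body or None,
                                     method=self.command)
        if self.headers.get("Content-Type"):
            req.add_header("Content-Type", self.headers["Content-Type"])
        try:
            resp = urllib.request.urlopen(req)
        except urllib.error.HTTPError as e:  # pass 4xx/5xx through as-is
            resp = e
        data = resp.read()
        self.send_response(resp.code)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    do_GET = do_POST = do_PUT = do_DELETE = _proxy

if __name__ == "__main__":
    # Point clients at localhost:8200 instead of 9200.
    HTTPServer(("localhost", 8200), LoggingProxy).serve_forever()
```

Swapping print() for proper logging, or for a rule-checker that flags "rogue" requests, is the obvious next step.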

Actually, there is. It's a two-step process:

PUT /<index>/_settings
{
  "index.search.slowlog.threshold.query.trace": "0s",
  "index.indexing.slowlog.threshold.index.trace": "0s",
  "index.indexing.slowlog.source": true
}
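Note that at a "0s" threshold every single operation is logged to the slow log, which can get large fast. When you're done, set the thresholds back to "-1" to disable them again:

PUT /<index>/_settings
{
  "index.search.slowlog.threshold.query.trace": "-1",
  "index.indexing.slowlog.threshold.index.trace": "-1"
}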

plus

PUT _cluster/settings
{
  "persistent": {
    "logger.org.elasticsearch.http.HttpTracer": "trace",
    "http.tracer.include": "*"
  },
  "transient": {
    "logger.org.elasticsearch.http.HttpTracer": "trace",
    "http.tracer.include": [
      "*"
    ]
  }
}
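To switch the tracer off afterwards, reset the same settings by nulling them out:

PUT _cluster/settings
{
  "persistent": {
    "logger.org.elasticsearch.http.HttpTracer": null,
    "http.tracer.include": null
  },
  "transient": {
    "logger.org.elasticsearch.http.HttpTracer": null,
    "http.tracer.include": null
  }
}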

If that's enough detail for you, then great :+1: It won't show you all operations, though, nor can you see the body of HTTP requests using that tracer.

The slow log shows the body (the parsed one, I guess).

Installing a reverse proxy just to get an idea of what's going on with my own server is ridiculous. I see you're a team member; maybe instead of telling me what won't work, you could file and prioritise an issue that would fix it? I doubt I'm the only one who needs this.