Is querying Elasticsearch directly from Next.js server-side a bad pattern?

Hi everyone,

I’m building a Next.js app (App Router) and trying to decide on the right way to integrate Elasticsearch.

I know Next.js allows direct server-side data access without needing an API layer, which makes this approach convenient. However, I’ve also seen recommendations to place Elasticsearch behind a separate backend/API.

My main questions:

  • Is querying Elasticsearch directly from Next.js server-side considered a bad practice?

  • Are there concerns around security, scaling, or maintainability with this approach?

  • At what point does it make sense to introduce a dedicated API layer instead?

For context:

  • This is a production-scale app

  • Queries include text-heavy searches with some aggregations and ranking for sponsored results

Would appreciate any guidance or real-world experiences.

Thanks!

Moderator: Likely AI Generated.

Not a bad pattern in principle, but there are specific gotchas worth thinking through at production scale.

Security: Querying from Next.js server-side (Server Components, Route Handlers, or getServerSideProps in the Pages Router) is fine; your ES credentials never leave the server. What you want to watch is any path where raw ES query structure can be derived from unsanitised client input. If you pass user search terms directly into the query DSL without validation, you can end up with query injection scenarios. A thin validation layer helps even if it lives in the same Next.js codebase.
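As a concrete illustration (function and field names here are hypothetical, not from any real codebase): route user input into an analyzed text query such as multi_match, which treats the string as plain text, rather than query_string, which parses operators and is the usual injection vector.

```javascript
// Hypothetical validation/composition helper: user input only ever lands
// in a multi_match text field (analyzed, never parsed as query syntax),
// and is length-capped to bound query cost.
function buildSearchQuery(rawInput) {
  const text = String(rawInput ?? '').trim().slice(0, 256);
  if (!text) {
    throw new Error('empty search term');
  }
  return {
    query: {
      multi_match: {
        query: text, // treated as analyzed text by ES, not as query syntax
        fields: ['title^2', 'body'], // illustrative field names
      },
    },
    size: 20,
  };
}
```

By contrast, passing the same string into a query_string query would let users smuggle in wildcards, boosts, and field selectors.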

Connections and connection pooling: This is the most practical concern at production scale. The official @elastic/elasticsearch client maintains a connection pool internally, but whether you benefit from it depends on your deployment model. On Vercel or similar serverless runtimes, each function invocation may spawn a fresh client instance, meaning no persistent pool and cold-start overhead per invocation. With a long-lived Node process (self-hosted, containers, Render, Railway), the pool persists and works well. A module-level singleton helps:

import { Client } from '@elastic/elasticsearch';

// Module-level singleton: reused across requests in a long-lived process.
let client = null;

export function getESClient() {
  if (!client) {
    client = new Client({ node: process.env.ES_URL });
  }
  return client;
}

Under heavy serverless load the per-request connection cost becomes measurable.
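One common mitigation in Next.js codebases (a pattern also seen with database clients such as Prisma) is to stash the instance on globalThis, so warm invocations and dev hot reloads reuse one client instead of constructing a new one each time the module is re-evaluated. The helper below is a generic sketch with hypothetical names:

```javascript
// Generic sketch: cache any expensive singleton on globalThis so that
// module re-evaluation (dev hot reload, some serverless warm starts)
// does not construct a fresh instance each time. Names are illustrative.
function getCachedSingleton(key, factory) {
  const store = (globalThis.__singletons ??= new Map());
  if (!store.has(key)) {
    store.set(key, factory());
  }
  return store.get(key);
}
```

Usage would look like `const client = getCachedSingleton('es', () => new Client({ node: process.env.ES_URL }));`.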

When to add a dedicated API layer: For your specific use case (text search plus aggregations plus sponsored ranking), the logic for composing queries, applying ranking rules, and injecting sponsored results gets complex enough that isolation is worth it. If you also have other consumers (mobile app, dashboard, third-party integrations) needing the same search, a service pays for itself immediately. For a single Next.js app with stable requirements, co-located queries are simpler and entirely workable.
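To make the "query composition gets complex" point concrete, here is one illustrative way (by no means the only one) to fold sponsored ranking on top of an organic query; the function name and default boost are hypothetical:

```javascript
// Wrap an organic relevance query so documents in a sponsored set get a
// score bump via a boosted ids clause, while matching is still gated by
// the organic query. Illustrative sketch, not a drop-in ranking policy.
function withSponsoredBoost(organicQuery, sponsoredIds, boost = 5) {
  return {
    bool: {
      must: [organicQuery], // organic relevance still decides what matches
      should: [
        { ids: { values: sponsoredIds, boost } }, // sponsored docs score higher
      ],
    },
  };
}
```

Once you add per-campaign boosts, frequency caps, or fairness rules on top of this, the composition logic is exactly the kind of thing that benefits from living behind its own tested module or service boundary.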

Practical approach: start with direct queries in Route Handlers rather than Server Components to keep response times predictable, use a singleton client, and extract to a separate service when you hit (1) multiple consumers, (2) need to scale the search backend independently, or (3) query composition logic grows past a few hundred lines and becomes hard to test in isolation.