SQL statement too large on big data indices

Hi, dear community. I hope there's a solution for this specific problem. I just connected my Elastic Cloud production environment to ODBC, and the connection tested successfully. Now, when I try to load my indices into Power BI, it shows this error:

DataSource.Error: ODBC: ERROR [HY000] [Elastic][EsODBC 6.7.0(1b41af1+,u,r) Driver]parsing_exception: line 201:6: SQL statement too large; halt parsing to prevent memory errors (stopped at depth 200)

Details:
DataSourceKind=Odbc
DataSourcePath=dsn=Elastic Production Environment
OdbcErrors=Table

I'm not sure why this happens; indices with smaller amounts of data work perfectly. I tried to look around for a solution, but the complexity overwhelmed me. It may have something to do with the size of the data. Is there a parameter I need to set on my indices in the advanced settings?


Hi @Turnover_Dev,

That error comes from a protection mechanism we introduced following this report where a large-ish query was leading to StackOverflowErrors.

What exactly does "try connect my indices into power bi" mean? Is it a query you are running?
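
If it is, one quick way to check whether the statement itself trips the parser, independently of Power BI and the ODBC driver, is to send the same statement straight to the SQL REST endpoint. Here's a rough sketch, assuming Python with the requests library; the URL, credentials, index and query below are placeholders:

    # Sketch: send the same SQL statement to the SQL REST endpoint to see whether
    # the server-side parser rejects it, independently of Power BI / ODBC.
    # URL, credentials, index and query are placeholders -- adjust to your deployment.
    import requests

    ES_URL = "https://your-deployment.es.example.com:9243"  # placeholder
    AUTH = ("elastic", "your-password")                      # placeholder

    resp = requests.post(
        f"{ES_URL}/_xpack/sql?format=txt",  # on 7.x+ the endpoint is /_sql
        json={"query": 'SELECT some_field FROM "your-index" LIMIT 10'},
        auth=AUTH,
    )
    print(resp.status_code)
    print(resp.text)  # a parsing_exception here points at the statement itself, not the driver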

Hi @Turnover_Dev.
As @Andrei_Stefan mentioned, that's triggered by a protection mechanism. But to understand if there's some workaround we could suggest, what were the steps leading to the error? Does it occur as soon as you point Power BI to your index, or at a later step, when you try to generate a report based on the data?
Also, although it might not be directly related: how many fields does the index that fails to load have?
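
If it's easier than eyeballing the mapping by hand, here's a rough way to count the leaf fields in an index mapping. This is just a sketch, assuming Python with the requests library; the URL, credentials and index name are placeholders:

    # Sketch: count leaf fields in an index mapping.
    # URL, credentials and index name are placeholders -- adjust to your deployment.
    import requests

    ES_URL = "https://your-deployment.es.example.com:9243"  # placeholder
    AUTH = ("elastic", "your-password")                      # placeholder
    INDEX = "your-index"                                     # placeholder

    mapping = requests.get(f"{ES_URL}/{INDEX}/_mapping", auth=AUTH).json()

    def count_leaf_fields(properties):
        # Count fields that declare a "type", and recurse into object/nested
        # sub-properties as well as multi-fields.
        total = 0
        for field_def in properties.values():
            if "type" in field_def:
                total += 1
            total += count_leaf_fields(field_def.get("properties", {}))
            total += count_leaf_fields(field_def.get("fields", {}))
        return total

    for index_name, body in mapping.items():
        mappings = body.get("mappings", {})
        # On 6.x the properties sit under a document type; on 7.x+ they sit
        # directly under "mappings". Handle both shapes.
        props = mappings.get("properties") or next(iter(mappings.values()), {}).get("properties", {})
        print(index_name, count_leaf_fields(props))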

I connected through the ODBC driver to my Elastic Cloud production environment. The problem comes when I try to connect an index that has a lot of information; for those, I cannot get any further than just connecting Elastic to Power BI. Some indices load correctly and I can create visualizations and generate reports with that data, but for other indices, with around 1.3 GB of data, the error appears. I don't know exactly how many fields I have in the specific index, but here is part of its settings:

    {
      "settings": {
        "index": {
          "mapping": {
            "total_fields": {
              "limit": "3000"
            }
          },
          ...
        }
      }
    }

Maybe there's an issue with Power BI trying to load large amounts of data?
Thanks for your replies, guys.

@Turnover_Dev any chance you could share the mapping of the index that cannot be loaded? GET [index_name]/_mapping. If it's too large for a forum post, you can share it via gist.

Here is the index mapping:

Thank you, Andrei.

@Turnover_Dev We have decided to replace this protection mechanism, as it's error-prone and often gives false positives. This is tracked here.
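
In the meantime, one way to stay under the limit is to not let the client expand every mapped column, and instead run an explicit statement that selects only the fields you actually need (in Power BI, the equivalent is supplying your own SQL statement in the ODBC connector's advanced options instead of picking the whole table). Here's a minimal sketch of the idea over the same DSN, assuming Python with pyodbc; the index and column names are placeholders:

    # Sketch: query through the ODBC DSN with an explicit, narrow column list
    # so the generated SQL stays small. Index and column names are placeholders.
    import pyodbc

    conn = pyodbc.connect("DSN=Elastic Production Environment", autocommit=True)
    cursor = conn.cursor()

    # Select only the handful of columns the report needs instead of letting
    # the client expand all of the (up to 3000) mapped fields.
    cursor.execute('SELECT "field_a", "field_b", "field_c" FROM "your-index" LIMIT 1000')

    for row in cursor.fetchall():
        print(row)

    conn.close()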

Still, I have the issue. Large datasets are error-prone, as you said, and we cannot connect to external software like Power BI directly without hitting that protection-mechanism error. I found another way out, but it's not convenient in my case and has poor performance. I don't know what's happening under the hood. Thank you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.