Elastic Integrations fail to install and lead to broken Dashboards when used with multiple Kibana Spaces

Production Environment:

  • Elastic-Stack 8.8.2
  • Debian 12
  • 7 Elasticsearch Nodes
  • 3 Kibana Nodes
  • 2 Fleet Agents
  • 5 Kibana spaces
  • 70 Elastic-Agents

Since roughly Elastic-Stack 8.8.0 we have had issues installing or reinstalling Elastic Integrations into our Kibana Spaces. It seems there is a problem with duplicated or pre-existing IDs of saved objects used by the integrations. The result is dashboards with broken links to their data views.

We have already tried to completely remove all integrations of a given type, including removing them from all agent policies, but the issue reappears as soon as we reinstall the integration. Affected integrations include Cisco Duo, Cisco ISE, System, and Windows. We suspect that probably all integrations are affected and that the problem lies not in the integrations themselves but in how their assets are handled during installation when multiple Kibana spaces exist.
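One way to confirm the duplicated-ID suspicion is to list the integration assets across all spaces (e.g. via GET /api/saved_objects/_find?namespaces=*&type=dashboard) and group the hits by ID. Below is a minimal sketch of just the grouping step; the function name is ours, the hits are fabricated, and the IDs are shortened for readability:

```python
from collections import defaultdict

def find_cross_space_duplicates(objects):
    """Group saved objects by (type, id) and return those present in more
    than one space.  `objects` is a list of dicts shaped like the hits
    returned by the saved objects _find API (each hit carries `type`,
    `id`, and a `namespaces` list)."""
    seen = defaultdict(set)
    for obj in objects:
        for space in obj.get("namespaces", ["default"]):
            seen[(obj["type"], obj["id"])].add(space)
    return {key: sorted(spaces) for key, spaces in seen.items() if len(spaces) > 1}

# Fabricated example hits (IDs shortened for readability):
hits = [
    {"type": "dashboard", "id": "cisco_ise-44afda90", "namespaces": ["default"]},
    {"type": "dashboard", "id": "cisco_ise-44afda90", "namespaces": ["team-a"]},
    {"type": "search", "id": "cisco_ise-2c7c0eb0", "namespaces": ["default"]},
]
print(find_cross_space_duplicates(hits))
# → {('dashboard', 'cisco_ise-44afda90'): ['default', 'team-a']}
```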

Kibana log when reinstalling an integration via the Integrations page (each log line has been pretty-printed for readability):

{
  "service": {
    "node": {
      "roles": [
        "background_tasks",
        "ui"
      ]
    }
  },
  "ecs": {
    "version": "8.6.1"
  },
  "@timestamp": "2023-06-30T00:08:25.054+02:00",
  "message": "Saved object [epm-packages-assets/1cabe36a-0068-547d-aaa3-886518b6e60a] not found",
  "log": {
    "level": "WARN",
    "logger": "plugins.fleet"
  },
  "process": {
    "pid": 972
  },
  "trace": {
    "id": "e2ad42192a95a17c128335f50ffbf6be"
  },
  "transaction": {
    "id": "ceb5620d0d3d0303"
  }
}

{
  "service": {
    "node": {
      "roles": [
        "background_tasks",
        "ui"
      ]
    }
  },
  "ecs": {
    "version": "8.6.1"
  },
  "@timestamp": "2023-06-30T00:09:10.385+02:00",
  "message": "Failure to install package [cisco_ise]: [Error: Non-unique import objects detected: [dashboard:cisco_ise-44afda90-3991-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-506e8200-39a5-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-6b611af0-39a0-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-a42bdb60-39a8-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-db12cdf0-3a2f-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-e0ca7fa0-398e-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-e2146a20-39a1-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-fcb9bc40-3a32-11ed-a2b2-1d4b9d412e28,search:cisco_ise-2c7c0eb0-a505-11ec-ab9d-4b8e737a22d9,search:cisco_ise-39e47010-a09b-11ec-a0a2-1598702abf83,search:cisco_ise-47c77dc0-a065-11ec-a0a2-1598702abf83,search:cisco_ise-5f739b70-a0a6-11ec-a0a2-1598702abf83,search:cisco_ise-ac5b9ba0-a02d-11ec-a0a2-1598702abf83,search:cisco_ise-d1ba7b80-a075-11ec-a0a2-1598702abf83,search:cisco_ise-eecf4510-a058-11ec-a0a2-1598702abf83,search:cisco_ise-f681d1f0-a09f-11ec-a0a2-1598702abf83]]",
  "log": {
    "level": "WARN",
    "logger": "plugins.fleet"
  },
  "process": {
    "pid": 972
  },
  "trace": {
    "id": "45544d845cdb1212ff1b330385f5703e"
  },
  "transaction": {
    "id": "735b97bae9a8ad25"
  }
}
{
  "service": {
    "node": {
      "roles": [
        "background_tasks",
        "ui"
      ]
    }
  },
  "ecs": {
    "version": "8.6.1"
  },
  "@timestamp": "2023-06-30T00:09:10.385+02:00",
  "message": "uninstalling cisco_ise-1.9.0 after error installing: [Error: Non-unique import objects detected: [dashboard:cisco_ise-44afda90-3991-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-506e8200-39a5-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-6b611af0-39a0-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-a42bdb60-39a8-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-db12cdf0-3a2f-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-e0ca7fa0-398e-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-e2146a20-39a1-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-fcb9bc40-3a32-11ed-a2b2-1d4b9d412e28,search:cisco_ise-2c7c0eb0-a505-11ec-ab9d-4b8e737a22d9,search:cisco_ise-39e47010-a09b-11ec-a0a2-1598702abf83,search:cisco_ise-47c77dc0-a065-11ec-a0a2-1598702abf83,search:cisco_ise-5f739b70-a0a6-11ec-a0a2-1598702abf83,search:cisco_ise-ac5b9ba0-a02d-11ec-a0a2-1598702abf83,search:cisco_ise-d1ba7b80-a075-11ec-a0a2-1598702abf83,search:cisco_ise-eecf4510-a058-11ec-a0a2-1598702abf83,search:cisco_ise-f681d1f0-a09f-11ec-a0a2-1598702abf83]]",
  "log": {
    "level": "ERROR",
    "logger": "plugins.fleet"
  },
  "process": {
    "pid": 972
  },
  "trace": {
    "id": "45544d845cdb1212ff1b330385f5703e"
  },
  "transaction": {
    "id": "735b97bae9a8ad25"
  }
}
{
  "service": {
    "node": {
      "roles": [
        "background_tasks",
        "ui"
      ]
    }
  },
  "ecs": {
    "version": "8.6.1"
  },
  "@timestamp": "2023-06-30T00:09:10.388+02:00",
  "message": "failed to uninstall or rollback package after installation error Error: unable to remove package with existing package policy(s) in use by agent(s)",
  "log": {
    "level": "ERROR",
    "logger": "plugins.fleet"
  },
  "process": {
    "pid": 972
  },
  "trace": {
    "id": "45544d845cdb1212ff1b330385f5703e"
  },
  "transaction": {
    "id": "735b97bae9a8ad25"
  }
}
{
  "service": {
    "node": {
      "roles": [
        "background_tasks",
        "ui"
      ]
    }
  },
  "ecs": {
    "version": "8.6.1"
  },
  "@timestamp": "2023-06-30T00:09:10.389+02:00",
  "message": "Non-unique import objects detected: [dashboard:cisco_ise-44afda90-3991-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-506e8200-39a5-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-6b611af0-39a0-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-a42bdb60-39a8-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-db12cdf0-3a2f-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-e0ca7fa0-398e-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-e2146a20-39a1-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-fcb9bc40-3a32-11ed-a2b2-1d4b9d412e28,search:cisco_ise-2c7c0eb0-a505-11ec-ab9d-4b8e737a22d9,search:cisco_ise-39e47010-a09b-11ec-a0a2-1598702abf83,search:cisco_ise-47c77dc0-a065-11ec-a0a2-1598702abf83,search:cisco_ise-5f739b70-a0a6-11ec-a0a2-1598702abf83,search:cisco_ise-ac5b9ba0-a02d-11ec-a0a2-1598702abf83,search:cisco_ise-d1ba7b80-a075-11ec-a0a2-1598702abf83,search:cisco_ise-eecf4510-a058-11ec-a0a2-1598702abf83,search:cisco_ise-f681d1f0-a09f-11ec-a0a2-1598702abf83]",
  "error": {
    "message": "Non-unique import objects detected: [dashboard:cisco_ise-44afda90-3991-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-506e8200-39a5-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-6b611af0-39a0-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-a42bdb60-39a8-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-db12cdf0-3a2f-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-e0ca7fa0-398e-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-e2146a20-39a1-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-fcb9bc40-3a32-11ed-a2b2-1d4b9d412e28,search:cisco_ise-2c7c0eb0-a505-11ec-ab9d-4b8e737a22d9,search:cisco_ise-39e47010-a09b-11ec-a0a2-1598702abf83,search:cisco_ise-47c77dc0-a065-11ec-a0a2-1598702abf83,search:cisco_ise-5f739b70-a0a6-11ec-a0a2-1598702abf83,search:cisco_ise-ac5b9ba0-a02d-11ec-a0a2-1598702abf83,search:cisco_ise-d1ba7b80-a075-11ec-a0a2-1598702abf83,search:cisco_ise-eecf4510-a058-11ec-a0a2-1598702abf83,search:cisco_ise-f681d1f0-a09f-11ec-a0a2-1598702abf83]",
    "type": "Error",
    "stack_trace": "Error: Non-unique import objects detected: [dashboard:cisco_ise-44afda90-3991-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-506e8200-39a5-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-6b611af0-39a0-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-a42bdb60-39a8-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-db12cdf0-3a2f-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-e0ca7fa0-398e-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-e2146a20-39a1-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-fcb9bc40-3a32-11ed-a2b2-1d4b9d412e28,search:cisco_ise-2c7c0eb0-a505-11ec-ab9d-4b8e737a22d9,search:cisco_ise-39e47010-a09b-11ec-a0a2-1598702abf83,search:cisco_ise-47c77dc0-a065-11ec-a0a2-1598702abf83,search:cisco_ise-5f739b70-a0a6-11ec-a0a2-1598702abf83,search:cisco_ise-ac5b9ba0-a02d-11ec-a0a2-1598702abf83,search:cisco_ise-d1ba7b80-a075-11ec-a0a2-1598702abf83,search:cisco_ise-eecf4510-a058-11ec-a0a2-1598702abf83,search:cisco_ise-f681d1f0-a09f-11ec-a0a2-1598702abf83]\n    at Function.nonUniqueImportObjects (/usr/share/kibana/node_modules/@kbn/core-saved-objects-import-export-server-internal/src/import/errors.js:32:12)\n    at collectSavedObjects (/usr/share/kibana/node_modules/@kbn/core-saved-objects-import-export-server-internal/src/import/lib/collect_saved_objects.js:73:43)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (node:internal/process/task_queues:96:5)\n    at importSavedObjectsFromStream (/usr/share/kibana/node_modules/@kbn/core-saved-objects-import-export-server-internal/src/import/import_saved_objects.js:38:37)\n    at retryImportOnConflictError (/usr/share/kibana/node_modules/@kbn/fleet-plugin/server/services/epm/kibana/assets/install.js:201:18)\n    at installKibanaSavedObjects (/usr/share/kibana/node_modules/@kbn/fleet-plugin/server/services/epm/kibana/assets/install.js:232:9)\n    at installKibanaAssets (/usr/share/kibana/node_modules/@kbn/fleet-plugin/server/services/epm/kibana/assets/install.js:109:27)\n    at installKibanaAssetsAndReferences 
(/usr/share/kibana/node_modules/@kbn/fleet-plugin/server/services/epm/kibana/assets/install.js:135:26)"
  },
  "log": {
    "level": "ERROR",
    "logger": "plugins.fleet"
  },
  "process": {
    "pid": 972
  },
  "trace": {
    "id": "45544d845cdb1212ff1b330385f5703e"
  },
  "transaction": {
    "id": "735b97bae9a8ad25"
  }
}

When I try to force a reinstall of an integration that is in use (agents have policies referencing it) but is NOT shown as installed on the Fleet Integrations page, I get the responses below. Restarting the services did not help either.

Trying to force an install:

POST kbn:/api/fleet/epm/packages/system/1.34.0
{
  "force": true,
  "ignore_constraints": true
}

I get the following response:

{
  "statusCode": 409,
  "error": "Conflict",
  "message": "Concurrent installation or upgrade of system-1.34.0 detected, aborting. Original error: Saved object [tag/fleet-managed-default] conflict"
}

Trying to force-install another integration that is stuck:

POST kbn:/api/fleet/epm/packages/cisco_ise/1.9.0
{
  "force": true,
  "ignore_constraints": true
}

I get the following response:

{
  "statusCode": 500,
  "error": "Internal Server Error",
  "message": "Non-unique import objects detected: [dashboard:cisco_ise-44afda90-3991-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-506e8200-39a5-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-6b611af0-39a0-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-a42bdb60-39a8-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-db12cdf0-3a2f-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-e0ca7fa0-398e-11ed-bb16-3b8b6259e7b8,dashboard:cisco_ise-e2146a20-39a1-11ed-a2b2-1d4b9d412e28,dashboard:cisco_ise-fcb9bc40-3a32-11ed-a2b2-1d4b9d412e28,search:cisco_ise-2c7c0eb0-a505-11ec-ab9d-4b8e737a22d9,search:cisco_ise-39e47010-a09b-11ec-a0a2-1598702abf83,search:cisco_ise-47c77dc0-a065-11ec-a0a2-1598702abf83,search:cisco_ise-5f739b70-a0a6-11ec-a0a2-1598702abf83,search:cisco_ise-ac5b9ba0-a02d-11ec-a0a2-1598702abf83,search:cisco_ise-d1ba7b80-a075-11ec-a0a2-1598702abf83,search:cisco_ise-eecf4510-a058-11ec-a0a2-1598702abf83,search:cisco_ise-f681d1f0-a09f-11ec-a0a2-1598702abf83]"
}
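Since the error message itself lists every conflicting object, a small helper can turn it into (type, id) pairs for triage in each space. This is a hypothetical helper of our own, not part of any Elastic tooling; the shortened IDs in the example are illustrative:

```python
import re

def parse_non_unique_objects(message):
    """Extract (type, id) pairs from a Fleet 'Non-unique import objects
    detected' error message so the conflicting saved objects can be
    located (and, if appropriate, cleaned up) per space."""
    match = re.search(r"Non-unique import objects detected: \[([^\]]+)\]", message)
    if not match:
        return []
    # Each entry looks like "dashboard:cisco_ise-44afda90..."
    return [tuple(item.split(":", 1)) for item in match.group(1).split(",")]

msg = ("Non-unique import objects detected: "
       "[dashboard:cisco_ise-44afda90,search:cisco_ise-2c7c0eb0]")
print(parse_non_unique_objects(msg))
# → [('dashboard', 'cisco_ise-44afda90'), ('search', 'cisco_ise-2c7c0eb0')]
```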

Furthermore, we see that some of the broken integrations end up installed in seemingly random spaces: judging by the created timestamps, the dashboards suddenly appear in Kibana Spaces that otherwise have nothing to do with them.

Furthermore, many redundant data views are generated whenever a routine tries to reinstall the integration. In the screenshot you can see from the timestamps and names that data views are continuously being added.

I suspect this is either a bug introduced somewhere around Elastic-Stack 8.8.0, or the result of a broken upgrade, although I have no direct evidence for the second guess. It also seems that Elastic integrations are not really space-aware (see Multiple issues - Fleet integrations across multiple spaces confuses Kibana/Fleet · Issue #3434 · elastic/integrations (github.com)). Space awareness is crucial for adoption: in our enterprise the platform is shared internally, so everyone can work in their own space and benefit from the integrations' predefined dashboards. We have been running this Elastic Stack since about 7.14.0 and have upgraded roughly every 1-2 minor versions along the way, so the issue may have been introduced by a particular upgrade; but again, that is a guess.

I'm happy for any assistance and I'm able to provide more logs/troubleshooting information if needed.

We have the same problem: duplicated data views, and dashboards that cannot locate them. The problems started with version 8.8.0.

Fixed it (Kibana 8.9.0, same problems): remove all logs-* and metrics-* data views in the default space, then add logs-* and metrics-* data views to my integration space (Advanced Settings --> Custom data view ID logs-) with "Allow hidden and system indices" enabled. All integrations now work. I removed the Windows integration; if I add the Windows integration again, I get logs-* and metrics-* data views in the default space (after a Kibana reboot). Tags and saved objects must also be deleted in the default space. If some assets are missing from a space, reinstall the integration.
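For reference, the data-view part of this workaround can be sketched as an API call, assuming the Kibana 8.x data views endpoint (POST /s/&lt;space&gt;/api/data_views/data_view). The space id my-space is a placeholder, and allowHidden corresponds to the "Allow hidden and system indices" toggle; this only builds the request, it does not send it:

```python
import json

def data_view_request(space_id, title):
    """Build the (path, body) for creating a data view in a given space.
    Field names follow the Kibana 8.x data views API; space_id and title
    are placeholders to be filled in for your environment."""
    path = f"/s/{space_id}/api/data_views/data_view"
    body = {
        "data_view": {
            "title": title,               # e.g. "logs-*" or "metrics-*"
            "name": title,
            "timeFieldName": "@timestamp",
            "allowHidden": True,          # "Allow hidden and system indices"
        },
        "override": False,
    }
    return path, body

path, body = data_view_request("my-space", "logs-*")
print(path)
print(json.dumps(body, indent=2))
```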

So the Windows integration triggers this problem!

It seems the problem is gaining traction and will be handled in the next sprint: [Fleet] [Kibana] Integrations fail to upgrade or (un)/(re)-install when using multiple Kibana Spaces · Issue #161804 · elastic/kibana (github.com)

Your solution is close to mine; I needed to do exactly the same steps to fix this. I had the Windows integration too. I'm not sure whether it is really related, but the logs-* and metrics-* data views needed to be deleted.

Same problem in 8.9.1; not fixed.

If you update or reinstall some integrations and then restart Kibana, you get logs-* and metrics-* data views in the default space, and logs-* and metrics-* are deleted in the other spaces.

Thanks for reporting the issue; we are investigating and will post updates to the public Kibana issue here: Not fixed in 8.9.1 / Integrations in multiple Kibana Spaces · Issue #164243 · elastic/kibana · GitHub
