Kibana: Error SavedObject view: { [Error] res: {}, body: null } when modifying the visState section of a saved object

Hello team,

I am encountering an issue when trying to modify the JSON in the visState section of a saved object; I get the following error: SavedObject view: { [Error] res: {}, body: null }

What I'm trying to do is create a visualisation where the aggregation is Count and the bucket type is Filters, and these filters are coordinates like the following: "(1,1)", ..., "(245,188)".
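For reference, the part of the visState I end up editing looks roughly like this (trimmed down, and the exact structure may differ between Kibana versions):

```json
{
  "title": "RFID errors per cell",
  "type": "table",
  "aggs": [
    { "id": "1", "enabled": true, "type": "count", "schema": "metric", "params": {} },
    {
      "id": "2",
      "enabled": true,
      "type": "filters",
      "schema": "bucket",
      "params": {
        "filters": [
          { "input": { "query": { "query_string": { "query": "message:\"(1,1)\"" } } }, "label": "(1,1)" },
          { "input": { "query": { "query_string": { "query": "message:\"(1,2)\"" } } }, "label": "(1,2)" }
        ]
      }
    }
  ]
}
```

I then duplicate that filter entry for every coordinate on the grid, which is where the huge number of filters comes from.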

Could you please help me understand this error and what I could do to fix it?
Is it due to the large number of filters?

Thanks and regards

We really don't support direct modification of the JSON. Can you update the visualization through the UI and inspect the object?

What I do is add a couple of filters in the visualisation's UI, save the visualisation, then go to the associated saved object to modify the JSON in visState.
Adding all the filters through the UI would be far too time-consuming.
By the way, is there a limit to the number of filters one can add?

It seems like if you're having to fall back to editing the JSON directly, something is wrong. Would you mind giving some information as to what you're trying to accomplish that results in needing to define so many filters on a visualization?

OK, so for example: here I have a grid with robots moving on it, and each cell of the grid has an RFID tag that the bot reads to get its coordinates.
When the bot can't read the tag, it raises an error that is logged with the bot ID and the coordinates that should have been read.
But this is all logged under the same message field in Kibana, so I end up having to manually create a filter for each coordinate to extract, in a table, the number of errors per coordinate.
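A typical error document looks something like this (the field names here are illustrative, not my exact mapping):

```json
{
  "@timestamp": "2017-06-12T09:41:27Z",
  "bot_id": "bot-17",
  "message": "RFID read error at (245,188)"
}
```

So the coordinate only exists as text inside the message field.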

One idea would be to use a custom tile map. An example of this with a hockey rink is here: https://www.elastic.co/blog/kibana-and-a-custom-tile-server-for-nhl-data

The other option would be to change it at index time. The easiest way to do this would be to use an ingest pipeline: data being ingested goes through the pipeline, giving you the ability to manipulate it before it's indexed. There's a rough sketch after the links below.

A New Way To Ingest - Part 1
Ingest node documentation
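To make that concrete, here's a sketch of what such a pipeline could look like, assuming your messages embed the coordinates as "(x,y)" text; the pipeline name, field names, and message format are all assumptions based on your description:

```
PUT _ingest/pipeline/robot_coords
{
  "description": "Pull grid coordinates out of the robot error message (message format assumed)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["\\((?<coord_x>[0-9]+),(?<coord_y>[0-9]+)\\)"]
      }
    },
    { "convert": { "field": "coord_x", "type": "integer" } },
    { "convert": { "field": "coord_y", "type": "integer" } }
  ]
}

POST _ingest/pipeline/robot_coords/_simulate
{
  "docs": [
    { "_source": { "message": "RFID read error at (245,188)" } }
  ]
}
```

Once coord_x and coord_y exist as real fields, a terms aggregation on them in a data table gives you the error count per coordinate, with no hand-built filters at all. You'd apply the pipeline by adding ?pipeline=robot_coords to your index requests, or by configuring it in whatever shipper you use.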

Thank you, I will study all this and see if I can implement it when I get back to work.
