Unable to bulk insert JSON object with no error message

Hi! I've set up a small local Docker instance where I'm running a POC for my project. Setting it up and getting started with inserting data into the indices has been intuitive and easy, but I still have one issue.
The issue is with a single JSON object. The file is quite large and has a good deal of nested objects, which I suspect is the cause.
When I insert the data, I don't get any real error message; it just tells me that 1 document has failed. In other threads where people were struggling with this, an error message was attached to the failed insert. I've tried increasing the nested-object settings to ridiculous sizes, e.g. index.mapping.nested_objects.limit, but no luck.
Please keep in mind this is not my actual setup, but just a setup that reproduces the error on that single object. Here's the error message that I get:

Traceback (most recent call last):
  File "C:\repo\..\elastic\test.py", line 41, in <module>
    success, failed = helpers.bulk(client, actions)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\username\.pyenv\pyenv-win\versions\3.11.9\Lib\site-packages\elasticsearch\helpers\actions.py", line 531, in bulk
    for ok, item in streaming_bulk(
  File "C:\Users\username\.pyenv\pyenv-win\versions\3.11.9\Lib\site-packages\elasticsearch\helpers\actions.py", line 445, in streaming_bulk
    for data, (ok, info) in zip(
  File "C:\Users\username\.pyenv\pyenv-win\versions\3.11.9\Lib\site-packages\elasticsearch\helpers\actions.py", line 359, in _process_bulk_chunk
    yield from gen
  File "C:\Users\username\.pyenv\pyenv-win\versions\3.11.9\Lib\site-packages\elasticsearch\helpers\actions.py", line 276, in _process_bulk_chunk_success
    raise BulkIndexError(f"{len(errors)} document(s) failed to index.", errors)
elasticsearch.helpers.BulkIndexError: 1 document(s) failed to index

Here's the very simple code that I use to reproduce the error.

from elasticsearch import Elasticsearch, helpers
import warnings
import json
warnings.filterwarnings("ignore")


client = Elasticsearch(
  "https://localhost:9200/",
  api_key="API_KEY",
  verify_certs=False
)

with open("./large_file.json", "r") as file:
    data = json.load(file) # Single object, not a list

actions = [
    {
        "_index": "test_index",
        "_source": [data]
    }
]

success, failed = helpers.bulk(client, actions)


Welcome!

Could you share some of the first lines of your json file?

Thanks! So I've actually found the cause, though I don't know how to fix it yet...
I needed to wrap my code in a try-except and catch BulkIndexError specifically, like so.
Edit: I think the failed items should be printed alongside the exception message; the message alone doesn't really help in this case.

from elasticsearch.helpers import BulkIndexError

try:
    success, failed = helpers.bulk(client, actions)
except BulkIndexError as e:
    print(json.dumps(e.errors))
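In case it helps anyone else: the items in e.errors follow the bulk API's response shape, keyed by the op type ("index", "create", ...). A small helper (the function name is my own) can surface just the status and reason instead of dumping the whole structure:

```python
def summarize_bulk_errors(errors):
    """Pull the op type, status and reason out of each bulk error item.

    Each item looks like {"index": {"status": ..., "error": {"type": ..., "reason": ...}}}.
    """
    summaries = []
    for item in errors:
        for op, detail in item.items():  # the single key is the op type
            err = detail.get("error", {})
            summaries.append(
                f"{op} [{detail.get('status')}]: {err.get('type')}: {err.get('reason')}"
            )
    return summaries

# Usage inside the except block above:
#     for line in summarize_bulk_errors(e.errors):
#         print(line)
```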

So I actually have the correct error now, and it seems to be a mapping error:

"error": {
                "type": "illegal_argument_exception",
                "reason": "mapper [data.Model.Nodes.Network.Ports.Statistics.Be.Tx.Fps.LastMinute] cannot be changed from type [float] to [long]"
            }

It's because Nodes is a list of objects, and for some reason the LastMinute property sometimes has decimals and sometimes doesn't. So I need to pre-specify that it's a floating-point number.
And this is where I'm at. Any suggestions on how to predefine this? I've tried "data": { "Model": { .... "LastMinute": {"type": "float"}} ... }}, but this results in a mapper_parsing_error.

Is your file structured exactly as outlined in the documentation, e.g. one action and metadata line followed by a JSON document on a single line?

I'm not using client.bulk directly; I'm using the helper function helpers.bulk. So no, not quite as shown in the documentation, but this is the actual function I use when there's more than a single document to insert.

    def insert_data(self, index, data):
        actions = [
            {
                "_index": index,
                "_source": doc
            } for doc in data  # avoid shadowing the data argument
        ]
        success, failed = helpers.bulk(self.client, actions)

        return success, failed

Edit: And my data varies quite a lot, so it's semi-structured, in the sense that I can't pin down a predefined pattern for it. And since there are multiple hundreds if not thousands of different patterns, I'd rather not specify them all fully.
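Given that constraint, a dynamic template might avoid spelling out every pattern: it tells Elasticsearch to map any field it would otherwise detect as long to float instead, so mixed integer/decimal values no longer conflict. A sketch (the template name longs_as_floats is my own; apply it at index creation as before):

```python
# Dynamic template: whenever dynamic mapping would pick "long" for a
# new numeric field, use "float" instead, anywhere in the document.
mappings = {
    "dynamic_templates": [
        {
            "longs_as_floats": {
                "match_mapping_type": "long",
                "mapping": {"type": "float"},
            }
        }
    ]
}
# client.indices.create(index="test_index", mappings=mappings)
```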