I push documents to Elasticsearch successfully but they are not visualized in Kibana's Discover

Hi!
I have a 3-node installation (1 master, 2 data) , installed into the OpenStack VMs. I want to push data (logs) from an application to the stack and find them in Discover section of Kibana. (I have installed Kibana on server where master node lives.) The application may lives in another VM under the same network of the ELK installation, or in the outside world (which is my task case)
I use elasticsearch python module, and particularly es.index() to just push some data to ELK (create or update - if they already exist - 10 documents)
The issue: When I push the docs from other VMs in the network they are visualized correctly. But when I send the same logs from my laptop, the updated documents are being vanished. In case that the documents are not existed, they are created successfully into ELK, but (once again) they are not visualized.
I cannot understand why this happens since Kibana just reads and visualize data from Elastic, doesn't it?
Once again, the script is the same across all machines. The only thing I noticed is that the issue happens only when I push data from my laptop (and not from VMs in the network).
But how is that possible since all documents are created or updated successfully (in my elastic DB) and Kibana does not see the ones created or updated from my laptop?!?!?! Any ideas???
ps: I have modified my security group and opened the corresponding ports of Elastic and Kibana to ingest data from outside world (in my case: my laptop)
Thanks a lot in advance!

Hello @orfeas2021

What version of the Elastic stack are you running?

How are you visualizing your documents?

Are you able to share the script that is creating or updating documents?

Thanks,
Matt

Hi @mattkime !

My Elasticsearch version is 7.11.2 (same for Kibana: 7.11).

I am visualizing my docs in Kibana > Discover. Here are the docs I push from a VM. When I push the same docs from my laptop (without changing the script at all), they are not visualized.

Here is my script to push logs:

from elasticsearch import Elasticsearch
from datetime import datetime
from time import time
import socket

# More portable than shelling out to read /proc/sys/kernel/hostname
hostname = socket.gethostname()
# ELK

nodes = dict(internal={'master': 'INTERNAL_IP_MASTER', 'data_1': 'INTERNAL_IP_DATA1', 'data_2': 'INTERNAL_IP_DATA2'},
             external={'master': 'EXTERNAL_IP_MASTER', 'data_2': 'INTERNAL_IP_DATA2',
                       'data_1': 'INTERNAL_IP_DATA1'})
elk_port = 9200


def check_document_exists(index_name: str, doc_id: str) -> bool:
    """Return True if a document with the given id exists in the index.

    Uses the module-level `es` client defined below.
    """
    try:
        res = es.get(index=index_name, id=doc_id)
        if res['_id'] == doc_id:
            print('OK - Document exists')
            return True
        return False
    except Exception as e:
        print('check_document_exists - Exception: {}'.format(e))
        return False


def my_timer(period: int, stop_secs: int):
    """Yield an incrementing counter (as a string) every `period` seconds,
    stopping after `stop_secs` ticks (i.e. after roughly
    `period * stop_secs` seconds in total).

    :param period: loop period in seconds
    :param stop_secs: number of ticks before stopping
    """
    t0 = time()
    counter = 0
    while True:
        if time() - t0 >= period:
            counter += 1
            t0 = time()
            yield str(counter)
        if counter == stop_secs:
            break


# connect to Stack
try:
    es = Elasticsearch([
        {'host': nodes['external']['master'], 'port': elk_port},
        # {'host': nodes['external']['data_1'], 'port': elk_port},
        # {'host': nodes['external']['data_2'], 'port': elk_port}
    ])

except Exception as error:
    print("Elasticsearch Client Error: ", error)
    raise

indexing_timer = my_timer(period=1, stop_secs=10)
for i in indexing_timer:
    # do sth
    print(i)
    document = {
        "hostname": hostname,
        "description": "document {}".format(i),
        "timestamp": datetime.now()
    }

    res = es.index(index="test-index-orfeas-newtests", id=i + '-aidono-test', body=document)
    print('es.index result: {}'.format(res['result']))

# validating
check_document_exists(index_name="test-index-orfeas-newtests", doc_id='1-aidono-test')
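One detail worth double-checking in the script above: `datetime.now()` produces a naive local-time timestamp with no offset information. Elasticsearch treats date strings without a timezone offset as UTC, while Kibana's Discover filters on a time range rendered in the browser's timezone, so documents indexed from a machine whose local timezone differs (e.g. a laptop outside the VMs' network) can land outside the currently selected time window and be invisible in Discover even though they exist in the index. A minimal standard-library sketch of the difference:

```python
from datetime import datetime, timezone

# Naive timestamp: no offset attached. Elasticsearch will assume it is
# UTC, even if the machine's clock is set to a different timezone.
naive = datetime.now()
print(naive.isoformat())

# Timezone-aware timestamp: the UTC offset is part of the value, so it
# is unambiguous regardless of where the script runs.
aware = datetime.now(timezone.utc)
print(aware.isoformat())

assert naive.tzinfo is None
assert aware.tzinfo is not None
```

If this is the cause, indexing with `datetime.now(timezone.utc)` (or temporarily widening the time range in Discover) should make the laptop-pushed documents show up.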

I suspect the request isn't making it to Elasticsearch somehow. Perhaps turning on Audit logging would help you see when requests are hitting ES. Enable audit logging | Elasticsearch Guide [7.12] | Elastic
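For reference, a minimal sketch of turning audit logging on for a 7.x node (this assumes X-Pack security is already enabled; add the setting to `elasticsearch.yml` on each node and restart it):

```yaml
# elasticsearch.yml
xpack.security.audit.enabled: true
```

The audit events are then written to a dedicated log file on the node, which should show whether the laptop's requests are reaching the cluster at all.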

I think in both cases the request to the Elasticsearch nodes does work. I can see that the documents are created or updated through the Elasticsearch API, but not in Kibana when I push data from my laptop.
OK, I'll also check the audit logs.
Thanks for your time!