How to index log content in order to filter on a specific field?

Hi guys,

I have a Python application that emits logs in JSON format; see the example below:

{
    "levelname": "INFO",
    "message": "REQUEST STARTED",
    "name": "requests_middleware.middlewares"
}
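For context, logs in that shape can be produced with a custom formatter on the standard logging module. This is only a sketch of one way to do it (the JsonFormatter class here is hypothetical, not part of my app):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Hypothetical formatter that renders each record as one JSON object."""
    def format(self, record):
        return json.dumps({
            "levelname": record.levelname,
            "message": record.getMessage(),
            "name": record.name,
        })

logger = logging.getLogger("requests_middleware.middlewares")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Emits: {"levelname": "INFO", "message": "REQUEST STARTED", "name": "requests_middleware.middlewares"}
logger.info("REQUEST STARTED")
```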

I was wondering if it is possible to map those JSON fields from my logs to Kibana indices, so that I can select specific fields for visualization.

For example, in the image below, I can select message as a field of my log. I would like to filter on the information inside my log.

Install the elasticsearch library (pip install elasticsearch).
Send logs directly:

from elasticsearch import Elasticsearch

es = Elasticsearch([{'host': 'localhost', 'port': 9200}])  # update host/port if needed

log_data = {"levelname": "INFO", "message": "REQUEST STARTED", "name": "middleware"}
es.index(index="logs", body=log_data)  # index named "logs"
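If you want each field to be reliably filterable in Kibana, you can also define an explicit mapping before indexing, instead of relying on dynamic mapping. A rough sketch (the field types are assumptions: "keyword" for exact-match filters and Terms aggregations, "text" for full-text search):

```python
import json

# Hypothetical explicit mapping for the log fields above.
# "keyword" fields support exact-match filtering and Terms aggregations;
# "text" fields are analyzed for full-text search.
mapping = {
    "mappings": {
        "properties": {
            "levelname": {"type": "keyword"},
            "message": {"type": "text"},
            "name": {"type": "keyword"},
        }
    }
}

# With a running cluster you would create the index before sending logs:
# es.indices.create(index="logs", body=mapping)
print(json.dumps(mapping, indent=2))
```

Note that newer versions of the Python client (8.x) expect a URL like Elasticsearch("http://localhost:9200") and a mappings= keyword argument instead of body=, so adjust to whichever client version you have.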

Create an index pattern (optional but recommended): logs-*
Go to the Discover tab.
Select the index pattern (if created).
Visualize/filter using the Kibana interface:
Visualize fields (Terms, Histogram, etc.)
Filter logs by fields (search bar)
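The same field filter you type in the Kibana search bar (e.g. levelname : "INFO" in KQL) can also be expressed as an Elasticsearch term query from Python, which is handy for checking that the field was mapped as expected. A minimal sketch, assuming the "logs" index from above:

```python
import json

# Hypothetical term query equivalent to the Kibana filter levelname : "INFO"
query = {"query": {"term": {"levelname": "INFO"}}}

# Against a running cluster you would run:
# hits = es.search(index="logs", body=query)["hits"]["hits"]
print(json.dumps(query))
```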