Dec 22nd, 2023: [EN] Santa Claus Meets GenAI: Deciphering Handwritten Christmas Letters with LLM, LangChain and Elasticsearch


This post is also available in Portuguese.

In the heart of the North Pole, Santa's team of elves faced a formidable logistical challenge: how to handle millions of letters from children around the world. With a determined look, Santa Claus decided it was time to incorporate artificial intelligence into the Christmas operation.

Sitting at his computer, equipped with the latest in AI technology, Santa Claus began working on a Python script in a Jupyter Notebook. The goal was simple yet ambitious: to use the power of Generative AI and LLMs to interpret the handwritten letters, extract the necessary data, and organize it in Elasticsearch.

!pip install python-dotenv elasticsearch langchain openai

The first step was to set up the environment variables that would serve as credentials for accessing the OpenAI and Elasticsearch APIs.

import os

from dotenv import load_dotenv

# Replace 'path/to/your/.env' with the correct path to your .env file on Google Drive
env_path = '/content/drive/MyDrive/@Blogs/04-Advent-2023/env_advent'
load_dotenv(env_path)

# OpenAI API Key
OPENAI_API_KEY = os.getenv('OPENAI_API_KEY')
OPENAI_API_URL = "https://api.openai.com/v1/chat/completions"

# Elastic cloud credentials
es_cloud_id = os.getenv('cloud_id')
es_user = os.getenv('cloud_user')
es_pass = os.getenv('cloud_pass')
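
For reference, the env_advent file defines the same keys the script reads (OPENAI_API_KEY, cloud_id, cloud_user and cloud_pass). An optional sanity check like the sketch below, which is not part of the original notebook, catches a missing credential early:

# Optional sanity check: fail fast if any credential was not loaded
required = {
    "OPENAI_API_KEY": OPENAI_API_KEY,
    "cloud_id": es_cloud_id,
    "cloud_user": es_user,
    "cloud_pass": es_pass,
}
missing = [name for name, value in required.items() if not value]
if missing:
    raise ValueError(f"Missing environment variables: {', '.join(missing)}")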

Next, with a digitized image of a Christmas letter, Santa Claus wrote a script to extract the text using "gpt-4-vision-preview". This crucial step transformed the handwriting into digital text. "gpt-4-vision-preview" is a preview version of OpenAI's GPT-4 model, extended with image processing and analysis capabilities.

from PIL import Image
import requests
import numpy as np

from langchain.chat_models import ChatOpenAI
from langchain.schema.messages import HumanMessage, SystemMessage

image_path = 'https://i.imgur.com/IxC9lgd.png'

chat = ChatOpenAI(model="gpt-4-vision-preview", max_tokens=512)
result = chat.invoke(
    [
        HumanMessage(
            content=[
                {"type": "text", "text": "What is in the picture? Please provide a detailed introduction."},
                {
                    "type": "image_url",
                    "image_url": {
                        "url": image_path,
                        "detail": "auto",
                    },
                },
            ]
        )
    ]
)


print(result.content)
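
If a scanned letter is stored as a file in Google Drive rather than at a public URL, the same call also accepts the image as a base64-encoded data URL. The sketch below is an addition on top of the original example, and the letter.png file name is hypothetical:

import base64

# Hypothetical path to a scanned letter stored in Google Drive
local_image = '/content/drive/MyDrive/@Blogs/04-Advent-2023/letter.png'

with open(local_image, "rb") as f:
    b64_image = base64.b64encode(f.read()).decode("utf-8")

scanned_result = chat.invoke(
    [
        HumanMessage(
            content=[
                {"type": "text", "text": "Transcribe the handwritten letter in this image."},
                {
                    "type": "image_url",
                    # Vision models also accept images embedded as base64 data URLs
                    "image_url": {"url": f"data:image/png;base64,{b64_image}", "detail": "auto"},
                },
            ]
        )
    ]
)
print(scanned_result.content)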

Then, LangChain came into action, analyzing the text and identifying key elements like the child's name and wishlist.


from langchain.prompts import PromptTemplate
from langchain.chat_models import ChatOpenAI
from langchain.schema import StrOutputParser

chain = ChatOpenAI(model="gpt-3.5-turbo", max_tokens=1024)

prompt = PromptTemplate.from_template(
"""
Extract the child's name and the wishlist from the text below and return the data in JSON format using the following field names:
- "child_name", "wishlist".

{santalist}

"""
)

letter = result.content

# Compose the prompt, model, and output parser into a runnable chain
runnable = prompt | chain | StrOutputParser()

wishlist = runnable.invoke({"santalist": letter})
print(wishlist)

{
   "child_name": "Maria",
   "wishlist": [
      "Barbie Dreamhouse Adventures",
      "My Little Pony"
  ]
}
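
Because the model returns this JSON as plain text, an optional defensive check (not in the original script) can confirm it parses cleanly before moving on:

import json

try:
    parsed = json.loads(wishlist)  # wishlist is the raw string from StrOutputParser
    print(parsed["child_name"], parsed["wishlist"])
except (json.JSONDecodeError, KeyError) as err:
    # LLMs occasionally return malformed or unexpected JSON; surface it explicitly
    print(f"Could not parse the model output: {err}")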

Santa Claus decided to enrich the data a bit and asked the AI to also estimate the weight of these gifts. This way, he could generate a list in Kibana with the children's gifts divided into bags that fit within the sleigh's space. What organization!

chain = ChatOpenAI(model="gpt-3.5-turbo", max_tokens=1024)

prompt = PromptTemplate.from_template(
"""

{santalist_json}

From the JSON above, include a new attribute in the JSON called 'weight',
which will hold the total estimated weight, in kilograms, of the items in the list.

First estimate the weight of each item individually,
then sum these values to obtain the total weight.
Return only the numerical value.

"""
)

runnable = prompt | chain | StrOutputParser()

new_wishlist = runnable.invoke({"santalist_json": wishlist})
print(new_wishlist)

{
    "wishlist": [
        "Barbie Dreamhouse Adventures",
        "My Little Pony"
    ],
    "child_name": "Maria",
    "weight": 2.5
}

Now, with the data structured, it was time to move it into Elasticsearch.

from elasticsearch import Elasticsearch
import json

es = Elasticsearch(cloud_id=es_cloud_id,
                  basic_auth=(es_user, es_pass)
                  )
es.info() # should return cluster info

# Parse the JSON string
json_string = new_wishlist
data = json.loads(json_string)

# Index name
index_name = "santa_claus_list"

# Index the document
response = es.index(index=index_name, document=data)

# Print the response from Elasticsearch
print(response)
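
Since the story involves millions of letters, indexing them one by one would not scale. Once many letters have been processed, the Python client's bulk helper is a natural fit; in the sketch below, the letters list is a hypothetical collection of documents shaped like data above:

from elasticsearch.helpers import bulk

# Hypothetical: one dict per processed letter, each shaped like `data`
letters = [data]

actions = (
    {"_index": index_name, "_source": letter_doc}
    for letter_doc in letters
)

# helpers.bulk returns (number of successfully indexed docs, list of errors)
success, errors = bulk(es, actions)
print(f"Indexed {success} letters")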
        

Using the Dev Console, a tool integrated into Kibana, Santa Claus and the elves could then easily search and analyze the data. This gave them a clear view of this year's gift trends and the most frequent locations of the letters, helped identify letters expressing particular or urgent wishes, and, of course, showed the weight of the gifts, as in this ES|QL query:

POST /_query?format=txt
{
  "query": """
FROM santa_claus_list
| STATS  sum_toy = SUM(weight) BY child_name
| LIMIT 100
  """
}

# result
    sum_toy    |  child_name   
---------------+---------------
30.5           |Maria
1.5            |Mike
3.0            |Theo
2.5            |Isabella
40.0           |William
30.0           |Olivia       
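
As for splitting the gifts into bags that fit the sleigh, a purely illustrative first-fit packing in Python might look like the sketch below, using the per-child totals from the query above and a made-up bag capacity:

# Illustrative only: BAG_CAPACITY_KG is a hypothetical value
BAG_CAPACITY_KG = 50.0

def pack_into_bags(gifts, capacity=BAG_CAPACITY_KG):
    # Greedy first-fit: place each (child, weight) pair into the first bag with room left
    bags = []  # each bag: {"children": [...], "total_kg": float}
    for child, weight in gifts:
        for bag in bags:
            if bag["total_kg"] + weight <= capacity:
                bag["children"].append(child)
                bag["total_kg"] += weight
                break
        else:
            bags.append({"children": [child], "total_kg": weight})
    return bags

gifts = [("Maria", 30.5), ("Mike", 1.5), ("Theo", 3.0),
         ("Isabella", 2.5), ("William", 40.0), ("Olivia", 30.0)]

for i, bag in enumerate(pack_into_bags(gifts), start=1):
    print(f"Bag {i}: {bag['children']} ({bag['total_kg']} kg)")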

With this innovative solution, Santa Claus was not only able to fulfill requests more efficiently but also gained valuable insights into the joys and hopes of children around the world, all thanks to the power of AI, LangChain, and Elasticsearch. This Christmas promised to be the most magical and well-organized of all!

If you want to follow along and execute the code while reading, access the Python code running in a Jupyter Notebook (Google Colab).

Merry Christmas and Happy Holidays!
