Build a Basketball SMS Chatbot with LangChain Prompt Templates in Python

Lizzie Siegle - May 31 '23 - Dev Community

This blog post was written for Twilio and originally published on the Twilio blog.

As I've played with the OpenAI API (and DALL·E and fine-tuning models) and gotten more into ML, developers I meet (as well as my wonderful coworker Craig Dennis who advised me on this tutorial) keep telling me to use LangChain, a powerful and flexible framework for developing applications powered by language models.

Read on to learn how to build an SMS chatbot using LangChain prompt templates, OpenAI, Twilio Programmable Messaging, and Python.
[Image: SMS exchange asking for a haiku about summer, who the greatest NBA Finals winners are, and how many times Steph Curry has been to the NBA Finals, with the chatbot's answers]

Large Language Models

Large Language Models are trained on large quantities of textual data (in ChatGPT's case, a huge swath of the internet up to 2021, which means you can supply context or data the model is missing via prompt engineering and Prompt Templates–more on those later) to produce human-like responses to dialogue and other natural language inputs.

To yield these natural language responses, LLMs use deep learning (DL) models, which use multi-layered neural networks to process, analyze, and make predictions with complex data.

LangChain

[Image: LangChain logo]
LangChain came out late last year and already has over 43,000 stars on GitHub and a thriving community of contributors.

It is many things, but ultimately at its core is an open-source framework that simplifies the development of applications using large language models (LLMs), like OpenAI or Hugging Face. Developers can use it for chatbots, Generative Question-Answering (GQA), summarization, and more.

With LangChain, developers can “chain” together different LLM components to create more advanced use cases around LLMs. Chains can consist of multiple components from several modules:

  • Prompt Templates: Templates for different types of prompts, like “chatbot”-style templates, ELI5 question-answering, and so on
  • LLMs: Large language models like GPT-3, BLOOM, and the models available on Hugging Face
  • Agents: Agents use LLMs to decide what actions should be taken. Tools like web search or calculators can be used, and all are packaged into a logical loop of operations.
  • Memory: Short-term and long-term memory.

LangChain recognizes the power of prompts and has built an entire set of objects for them. It also provides a wrapper around different LLMs so you can easily swap models in and out while reusing the same templates: the chat model may change, but running and calling it stays the same–a very Java-like concept!

LangChain Twilio Tool

Recently, LangChain came out with a Twilio tool so your LangChain Agents can send text messages. Your LLM understands input in natural language, while Agents let you complete different tasks like calling an API.

You'll need your own Twilio credentials, and you'll need to install the Twilio Python helper library with pip install twilio. The code for the tool would look something like this:

from langchain.utilities.twilio import TwilioAPIWrapper

twilio = TwilioAPIWrapper(
    account_sid="YOUR-ACCOUNT-SID",
    auth_token="YOUR-TWILIO-AUTH-TOKEN",
    from_number="YOUR-TWILIO-NUMBER",
)
twilio.run("hello world", "NUMBER-TO-TEXT")

This tutorial will, however, show you how to use LangChain Prompt Templates with Twilio to make an SMS chatbot.

LangChain Prompt Templates

"Prompts” refer to the input to the model and are usually not hard-coded, but are more often constructed from multiple components. A Prompt Template helps construct this input. LangChain provides several classes and functions to make constructing and working with prompts easy.

Prompts fed to LLMs are often structured in different ways so that we can get different results. For Q&A, you could take a user's question and reformat it for different Q&A styles: conventional Q&A, a bulleted list of answers, or even a summary of problems relevant to the given question. You can read more about prompts in the LangChain documentation.
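
For instance, here is a quick sketch (with hypothetical wording) of the same user question framed two ways, one for a conventional answer and one for a bulleted list:

# Hypothetical framings of the same question for different answer styles
question = "Which NBA team won the finals in 1996?"

conventional = f"Answer the question.\n\nQuestion: {question}\nAnswer:"
bulleted = f"Answer the question as a bulleted list of key facts.\n\nQuestion: {question}\nAnswer:\n-"

print(conventional)
print(bulleted)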

Prompt templates offer a reproducible way to generate a prompt. Like a reusable HTML template, you can share, test, reuse, and iterate on a prompt template, and when you update it, the change carries through to everyone else using it and to every app built on it.

Prompt templates contain a text string (AKA “the template”) that can take in a set of parameters from the user and generate a prompt.

Importing and initializing a LangChain PromptTemplate class would look like so:

from langchain import PromptTemplate

template = """Question: {question}

Answer: """
prompt = PromptTemplate(
    template=template,
    input_variables=["question"]
)

# user question
question = "Which NBA team won the finals in 1996?"
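
To turn the template into a prompt the model can answer, you format it with the user's question and pass the result to an LLM. Here's a minimal sketch, assuming the same OpenAI wrapper and text-davinci-003 model used later in this tutorial, and that your OpenAI API key is set in the OPENAI_API_KEY environment variable:

from langchain.llms import OpenAI

# Fill the {question} slot to build the final prompt string
filled_prompt = prompt.format(question=question)
print(filled_prompt)

# Send the formatted prompt to the model and print its answer
llm = OpenAI(model_name="text-davinci-003")
print(llm(filled_prompt))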

Since OpenAI's LLMs were trained on data that stops in September 2021, they can't answer questions about anything that happened afterwards without additional context. Prompt templates help provide that context, but they differ from fine-tuning: fine-tuning is like coaching the model with new training data to get a certain kind of output, while prompt engineering with Prompt Templates hands the model specific data at query time to help it produce the output you want.

Let's use this template to build the chatbot.

Prerequisites

  1. A Twilio account - sign up for a free one here
  2. A Twilio phone number with SMS capabilities - learn how to buy a Twilio Phone Number here
  3. OpenAI Account – make an OpenAI Account here
  4. Python installed - download Python here
  5. ngrok, a handy utility to connect the development version of our Python application running on your machine to a public URL that Twilio can access.

⚠️ ngrok is needed for the development version of the application because your computer is likely behind a router or firewall, so it isn’t directly reachable on the Internet. You can also choose to automate ngrok as shown in this article.

Configuration

Since you will be installing some Python packages for this project, you will need to make a new project directory and a virtual environment.

If you're using a Unix or macOS system, open a terminal and enter the following commands:

mkdir lc-sms 
cd lc-sms 
python3 -m venv venv 
source venv/bin/activate 
pip install langchain
pip install openai
pip install Flask
pip install twilio
pip install python-dotenv

If you're following this tutorial on Windows, enter the following commands in a command prompt window:

mkdir lc-sms 
cd lc-sms 
python -m venv venv 
venv\Scripts\activate 
pip install langchain
pip install openai
pip install Flask
pip install twilio
pip install python-dotenv

The pip commands use pip, the Python package installer, to install the five packages you are going to use in this project: LangChain, the OpenAI Python library, Flask, the Twilio Python helper library, and python-dotenv.

Create a .env file in your project’s root directory and enter the following line of text, making sure to replace <YOUR-OPENAI-KEY> with your actual key:

OPENAI_API_KEY=<YOUR-OPENAI-KEY>

Make sure that your OPENAI_API_KEY stays private and that you don't expose your .env file in a public location such as GitHub (for example, list .env in your project's .gitignore).
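
As a quick sanity check (a minimal sketch, assuming python-dotenv is installed as above), you can confirm the key loads from the .env file before wiring up the app:

from dotenv import load_dotenv
import os

load_dotenv()  # reads the .env file in the current directory
if os.environ.get("OPENAI_API_KEY"):
    print("OpenAI key loaded")
else:
    print("OPENAI_API_KEY not found - check your .env file")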

Now, your Flask app will need to be visible from the web, so Twilio can send requests to it. ngrok lets you do this: with ngrok installed, run ngrok http 5000 in a new terminal tab in the directory your code is in.
[Image: ngrok terminal output showing the Forwarding URL]
You should see a screen like the one above. Grab that ngrok Forwarding URL to configure your Twilio number: select your Twilio number under Active Numbers in your Twilio Console, scroll to the Messaging section, and then modify the phone number’s routing by pasting the ngrok Forwarding URL (followed by /sms, the route the Flask app will expose) into the textbox corresponding to when A Message Comes In, as shown below:
[Image: Twilio phone number configuration with the ngrok webhook URL set for “A Message Comes In”]
Click Save and now your Twilio phone number is configured so that it maps to your web application server running locally on your computer. Let's build that application now.

Generate text through LangChain with OpenAI in Python via SMS

Inside your lc-sms directory, make a new file called app.py.

Copy and paste the following code into app.py to start off the ChatGPT-like SMS app by importing the required libraries.

from flask import Flask, request
from dotenv import load_dotenv

from twilio.twiml.messaging_response import MessagingResponse

from langchain.llms import OpenAI
from langchain import PromptTemplate

import os

Then, make the OpenAI LLM object (which could be another LLM–this is where LangChain can make reusability easier!), passing it the model name and the API key from the .env file, and create a Flask application.

load_dotenv()


llm = OpenAI(
    model_name="text-davinci-003",
    openai_api_key=os.environ.get('OPENAI_API_KEY')
)


app = Flask(__name__)

Next, start the /sms webhook, which builds a TwiML response to reply to inbound SMS messages with.

@app.route("/sms", methods=['GET', 'POST'])
def sms_reply():
    # Start our TwiML response
    resp = MessagingResponse()

Add the following template code (inside sms_reply) to help shape the queries and answers from the LLM. In this tutorial, it's Warriors basketball-themed. (Yes, I know the 2023 NBA Finals are the Heat versus the Nuggets. A girl can dream.)

 template = """Answer the question based on the context below. If the question cannot be answered using the information provided, answer with "I don't know, but the Warriors are the best team in the NBA".
    Context: Steph Curry has won 4 NBA Finals series. His Golden State Warriors defeated the Cleveland Cavaliers three times and the Boston Celtics once. 

In 2015 Steph Curry and the Warriors defeated the Cleveland Cavaliers. The Cavs featured LeBron James, Kyrie Irving and not much else! 
In 2017 Steph Curry, Kevin Durant and the Warriors defeated the Cavs again. The Cavs still had Lebron and Kyrie.
In 2018 the Warriors, featuring Steph and KD again, defeated the Cavs for the third time in four years. The Cavs still had Kyrie and Lebron. 
In 2022 Steph and the Warriors defeated the Boston Celtics for his fourth title. The Celtics featured Jayson Tatum and Jaylen Brown. Steph Curry and the Golden State Warriors lost one NBA Finals series to the Cleveland Cavaliers and one to the Toronto Raptors. 

In 2016 Steph, Klay, Draymond and the rest of the Warriors lost to the Cleveland Cavaliers.  The Cavs starred LeBron James and Kyrie Irving. 
In 2019 the Steph and the Warriors, missing an injured KD, lost to the Toronto Raptors. The Raptors featured Kawhi Leonard in his only season in Canada alongside Kyle Lowry and Pascal Siakham. Steph Curry has only won one NBA Finals MVP up to this point in his career. . 

In 2022 Steph Curry averaged 31 points, 6 rebounds and 5 assists per game to win the MVP award in the Warriors 6-game defeat of the Boston Celtics.
In 2015 Andre Iguodala won the MVP in the Warriors defeat of the Cavs.
In both 2017 & 2018 Kevin Durant was Finals MVP in the Warriors victories over the Cavs. Steph Curry's 4-2 NBA Finals record puts him ahead of many NBA greats including Larry Bird (3-2) and LeBron James (4-6). Steph still comes up short of the greatest NBA Finals winners including Bill Russell (11-1) and Michael Jordan (6-0).  
    Question: {query}
    Answer: """
    basketball_query_template = PromptTemplate(
        input_variables=["query"],
        template=template
    )

Grab the user's question from the inbound text message, format the prompt template with it, pass the formatted prompt to the LLM, print the answer, and then send it back to the user as the reply text message.

    question = request.form['Body'].lower().strip()
    # Format the prompt with the user's question and ask the LLM
    answer = llm(
        basketball_query_template.format(
            query=question
        )
    )
    print(answer)
    resp.message(answer)

    return str(resp)

if __name__ == "__main__":
    app.run(debug=True)

In a new terminal tab (while the other tab is still running ngrok http 5000), run python app.py. You can now text your configured Twilio number questions about the Warriors (since that's the context we provided the model).
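
If you'd like to sanity-check the webhook before texting it, you could POST the same Body form field Twilio sends to the local Flask server. A minimal sketch, assuming the requests library is installed (pip install requests) and the app is running on port 5000:

import requests

# Simulate an inbound SMS by posting the Body form field Twilio would send
resp = requests.post(
    "http://localhost:5000/sms",
    data={"Body": "Who did the Warriors beat in 2015?"}
)
print(resp.text)  # TwiML response containing the chatbot's answer
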
[Image: SMS exchange asking for a limerick about the Warriors beating the Celtics and who the Warriors beat in 2015 (the Cavs), with the chatbot's answers]
The complete code can be found here on GitHub.

What's Next for Twilio and LangChain

LangChain can be used for chatbots (not just Warriors/basketball-themed), chaining different tasks, Generative Question-Answering (GQA), summarization, and so much more. Stay tuned to the Twilio blog for more LangChain and Twilio content, and let me know online what you're building with AI!
