Build Generative AI Chatbot

JhonnyARM · Jul 9 · Dev Community

Introduction

In this article, you will learn how to build a generative AI chatbot that leverages OpenAI's GPT-3.5 to provide personalized and accurate responses to user queries. By integrating a pretrained LLM with a dynamic SQLite database, you'll create an intelligent chatbot capable of handling domain-specific questions. This tutorial will cover setting up the environment, configuring the AI model, and building an interactive user interface using Gradio.

Why Use Generative AI in Chatbots?

Using Generative AI in chatbots offers several key advantages, including:

Personalized Responses:

Generative AI enables chatbots to generate responses that are tailored and relevant to each user's specific queries, leading to a more engaging and satisfying user experience.

Natural Language Understanding:

Generative AI models like GPT-3.5 have advanced natural language understanding capabilities, allowing the chatbot to comprehend complex queries and provide accurate answers.

Contextual Awareness:

These AI models can maintain context across interactions, which helps in providing coherent and contextually appropriate responses even in multi-turn conversations.

Requirements

To successfully build and deploy this generative AI chatbot, you should have the following:

  • Basic understanding of Large Language Models (LLMs) and how they can be applied to natural language processing tasks.
  • Basic knowledge of Python programming, including working with libraries and writing scripts.
  • Familiarity with how to use APIs, including making requests and handling responses.

Tools and Technologies

  • OpenAI GPT-3.5: Used for interpreting natural language queries and generating SQL queries.
  • LangChain: Facilitates integration with OpenAI and helps create SQL agents.
  • SQLAlchemy: Manages database interactions and operations.
  • Pandas: Handles reading and processing of Excel files.
  • Gradio: Creates interactive user interfaces.
  • SQLite: Acts as the database to store and query data.
  • Dotenv: Loads environment variables from a .env file.

Flow Diagram

The following diagram illustrates how this chatbot operates: the user uploads an Excel file and enters a question, the loader converts the spreadsheet into a SQLite table, the LangChain SQL agent translates the question into a SQL query against that table, and the answer is returned through the Gradio interface.

[Flow diagram]

Project Setup

  • Create requirements.txt and add the following dependencies. The code in this tutorial imports from the langchain-community package, and pandas needs openpyxl to read .xlsx files, so both are included:
openai
langchain
langchain-community
sqlalchemy
pandas
openpyxl
gradio
python-dotenv

  • Install the dependencies from requirements.txt:
pip install -r requirements.txt
  • Create the .env File:

OPENAI_API_KEY=OPENAI-TOKEN-HERE
OPENAI_API_BASE=https://api.openai.com/v1

Replace OPENAI-TOKEN-HERE with your actual API key obtained from OpenAI.
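To confirm the key is being picked up, a quick sanity check like the one below can be run as a separate throwaway script before starting the app (it is not part of the project files):

import os
from dotenv import load_dotenv

# Load variables from .env and verify the API key is present
load_dotenv()
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY is not set in .env"
print("OpenAI key loaded.")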

  • Create a file called "app.py" and copy the following code:
import os
import gradio as gr
from langchain_community.chat_models import ChatOpenAI
from langchain_community.agent_toolkits.sql.base import create_sql_agent
from langchain_community.utilities import SQLDatabase
from langchain_core.tools import Tool
from langchain.memory import ConversationBufferMemory
from dotenv import load_dotenv
from sqlalchemy import create_engine, inspect, text
from loader import create_demand_table

# Load environment variables
load_dotenv()

# Configure the language model with OpenAI directly
llm = ChatOpenAI(
    api_key=os.getenv("OPENAI_API_KEY"),
    model_name="gpt-3.5-turbo-0125",
    temperature=0.1,
)

# Initialize the database engine
engine = create_engine('sqlite:///db.sqlite3', echo=True)

# Function to process the uploaded file and create a table
def process_file(file):
    db = create_demand_table(engine, 'dynamic_table', file.name)
    return db

# Define a function to execute SQL queries
def run_sql_query(query):
    with engine.connect() as connection:
        result = connection.execute(text(query))
        # Row objects expose their columns through ._mapping in SQLAlchemy 1.4+
        return [dict(row._mapping) for row in result]

# Create a tool that executes SQL queries
sql_tool = Tool(
    name="SQLDatabaseTool",
    func=run_sql_query,
    description="Tool for executing SQL queries on the database."
)

# Configure the conversational agent's memory
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# Function to handle user queries and process them with the SQL tool
def query_fn(input_text, file):
    db = process_file(file)
    conversational_agent = create_sql_agent(
        llm=llm,
        db=db,
        tools=[sql_tool],
        memory=memory,
        verbose=True,
        dialect='ansi',
        early_stopping_method="generate",
        handle_parsing_errors=True,
    )
    response = conversational_agent.run(input=input_text)

    # Check if the response is a list and contains valid data
    if isinstance(response, list) and len(response) > 0:
        # Convert each row of the result into a string
        result_text = ', '.join([str(row) for row in response])
    elif isinstance(response, str):
        # If the response is a string, use it directly
        result_text = response
    else:
        result_text = "No data found for the query."

    return {"result": result_text}

# Set up the user interface with Gradio
iface = gr.Interface(
    fn=query_fn,
    inputs=[gr.Textbox(label="Enter your query"), gr.File(label="Upload Excel file")],
    outputs=gr.JSON(label="Query Result"),
    title="Domain-specific chatbot"
)

# Launch the application
iface.launch(share=False, server_port=8080)

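Before wiring everything through the agent, you can sanity-check that the kind of SQL run_sql_query executes works against the database. The snippet below assumes a previous upload has already created the dynamic_table table in db.sqlite3:

from sqlalchemy import create_engine, text

engine = create_engine('sqlite:///db.sqlite3')

# Fetch a few rows directly, bypassing the LLM agent
with engine.connect() as connection:
    result = connection.execute(text("SELECT * FROM dynamic_table LIMIT 5"))
    for row in result:
        print(dict(row._mapping))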
  • Create a file called "loader.py" and copy the following code:
import pandas as pd
from sqlalchemy import create_engine, Table, Column, Integer, Float, String, Date, MetaData
from langchain_community.utilities import SQLDatabase

def create_demand_table(engine, table_name, excel_file):
    # Read the Excel file
    df = pd.read_excel(excel_file)

    # Metadata object used to define the table dynamically
    metadata = MetaData()

    # Map pandas data types to SQLAlchemy data types
    dtype_mapping = {
        'object': String,
        'int64': Integer,
        'float64': Float,
        'datetime64[ns]': Date,
    }

    # Dynamically define the table structure
    columns = []
    if 'ID' not in df.columns:
        columns.append(Column('id', Integer, primary_key=True, autoincrement=True))

    for col_name, dtype in df.dtypes.items():
        col_type = dtype_mapping.get(str(dtype), String)
        columns.append(Column(col_name, col_type))

    # Dynamically create the table
    demand_table = Table(table_name, metadata, *columns)

    # Drop the existing table in the SQLite database, if it exists
    metadata.drop_all(engine, [demand_table])

    # Create the table in the SQLite database
    metadata.create_all(engine)

    # Insert the DataFrame rows into the table using the same engine
    df.to_sql(table_name, con=engine, if_exists='append', index=False)

    db = SQLDatabase(engine)
    return db
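The loader can also be exercised on its own to verify that a spreadsheet is converted into a queryable table. The file name below is only a placeholder for whatever Excel file you plan to upload:

from sqlalchemy import create_engine
from loader import create_demand_table

# Build the SQLite table from a spreadsheet and wrap it for LangChain
engine = create_engine('sqlite:///db.sqlite3')
db = create_demand_table(engine, 'dynamic_table', 'demand_data.xlsx')  # hypothetical file name
print(db.get_usable_table_names())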

Running the Application

To run the application, use the following command.

python app.py

This command starts the server, and you can interact with the chatbot through the Gradio interface. The console prints the URL where the app is available; with server_port=8080 set in app.py, it will typically be http://127.0.0.1:8080.

[Terminal output showing the local Gradio URL]

Gradio Interface

The generated interface consists of three parts:

  • Query Input Box: This is where you enter your queries in natural language.
  • File Upload Box: This is where you upload your Excel files.
  • Result Display Area: This is where the results of your queries will be displayed.
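Because query_fn returns a Python dictionary with a single result key, the Result Display Area always shows a JSON object of this shape (the value here is only illustrative):

{"result": "<answer text produced by the agent>"}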

Conclusion

Generative AI in chatbots enhances user interactions with personalized responses and advanced natural language understanding, enabling more engaging and contextually relevant conversations.
