Welcome back for the fourth article in the Pieces Copilot Series, Building Your Own Copilot with Pieces OS Client, where we’ll dive into adding Microsoft's Phi-2 model to our project. This is a shorter example specific to Phi-2, with code snippets you can copy and paste into your project. Check out the other articles in this series to learn about using different models and creating your initial copilot.
Prerequisites
We recommend getting familiar with our Overview on Copilots.
If you want to build along with this tutorial, you will need Pieces OS. If you are interested in a different language implementation, you can check out the other resources found on our open-source repo.
Let’s get into the ins and outs of adding a new model to a simple project based on TypeScript, Node, and ESBuild. Remember that you can apply this same structure to your own project with the snippets included here and the proper inputs.
Why Use Microsoft’s Phi-2?
Phi-2 is a 2.7 billion-parameter language model that demonstrates outstanding reasoning and language understanding capabilities, showcasing state-of-the-art performance among base language models with fewer than 13 billion parameters.
Due to its size, Phi-2 is a great tool for researchers exploring interpretability, fine-tuning experiments, and safety improvements they can implement in their respective environments. As more models are introduced, the need for smaller (and still powerful) models grows, especially on devices with fewer resources. Most users are not operating on a top-of-the-line processor and do not have access to an external GPU.
Lightweight models are taking the copilot ecosystem by storm, and provide even further benefits in an offline environment.
Get Started with the Copilot
Let’s get into the code and start adding this new model to our Copilot Starter Project. If you want to follow along and start your own project, you can use this guide; just be sure to have Pieces OS installed and the SDK added to your repo.
In previous articles, we covered several topics, and this one follows a similar path. The only difference is that once we download the model itself, we will turn off our Wi-Fi to show the power of Phi-2’s offline functionality.
Catching Back Up
So far in this series, we have:
- Created our own copilot that can swap between cloud LLMs
- Added in local model downloading that monitors the progress of the download
- Used a specific model once and then changed to another model to compare answers
- Added local context to leverage our local data in copilot chats
- Added the downloadable LLMs Mistral and Llama 2 7B (in both GPU and CPU versions)
In this article, we will download the new Phi-2 model from Microsoft, then use it offline and compare its results with those of other models to show the power of these smaller models.
Downloading Phi-2
Before we turn off our internet, we should use the Pieces Client to download our models once the initial logic is added. Let’s start in the `index.html` file, where we have the radio buttons that control model selection. Each radio corresponds to a model and allows any available model to be used with the text prompt.
Let’s add the new radio buttons inside the `<form>` element, each with its label:
```html
<input type="radio" name="models" id="phi2-cpu-radio">
<label for="phi2-cpu-radio">Phi-2 (CPU)</label>
<input type="radio" name="models" id="phi2-gpu-radio">
<label for="phi2-gpu-radio">Phi-2 (GPU)</label>
```
These will let you select a model just by clicking the radio. Now that we have created these, we can move deeper into the project and get the model information we’ll use to control the download or show its status. In the `/src/index.ts` file, add your model variables underneath the ones used for Mistral and Llama2:
```typescript
// Phi-2 Local Models
const phi2Cpu = models.iterable.find(
  (model) => model.foundation === Pieces.ModelFoundationEnum.Phi2 && model.cpu
)!;
const phi2Gpu = models.iterable.find(
  (model) => model.foundation === Pieces.ModelFoundationEnum.Phi2 && !model.cpu
)!;
```
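One caveat: `find` returns `undefined` when no model matches, so the non-null assertions above assume Pieces OS always reports Phi-2. Here is a slightly more defensive sketch; the guard is our addition, not part of the starter project:

```typescript
// Hedged sketch: surface a clear error if this Pieces OS version does not
// report a Phi-2 (CPU) foundation model, rather than hiding it behind `!`.
const phi2CpuModel = models.iterable.find(
  (model) => model.foundation === Pieces.ModelFoundationEnum.Phi2 && model.cpu
);
if (!phi2CpuModel) {
  throw new Error("Phi-2 (CPU) was not found; check your Pieces OS version.");
}
```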
The snippet includes both of the model selectors for GPU and CPU. Let's go down the CPU route and use that as our example. Remember: the CPU and GPU logic and steps to set this up are identical except for the naming.
Create & Enable Phi-2 Download Buttons
Now that the radios have been added for model selection, we need to make sure something happens when a radio is clicked. Before the model is downloaded, its radio also needs to be disabled so it cannot be set as the active model by clicking it. Then we can add the appropriate model and a few elements to the UI to assist with the conditionals. Start by creating the `phi2CpuRadio` variable, which uses `document.getElementById` to target the radio button we created before:
```typescript
const phi2CpuRadio: HTMLInputElement | null = document.getElementById("phi2-cpu-radio") as HTMLInputElement | null;
```
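While we’re here, it’s worth sketching how selecting this radio could switch the active model. This is a hedged sketch that assumes the `CopilotStreamController.selectedModelId` property we’ll use later in this article:

```typescript
// Hedged sketch: when the Phi-2 (CPU) radio is checked, record the model's
// id so subsequent questions are routed to Phi-2 by the stream controller.
phi2CpuRadio?.addEventListener('change', () => {
  if (phi2CpuRadio.checked) {
    CopilotStreamController.selectedModelId = phi2Cpu.id;
  }
});
```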
Then we can create the full check to see:

- Whether or not the model is downloaded, with `if (!phi2Cpu?.downloaded)`
- What buttons should be added to the DOM, with `document.createElement`
- How to start the download, with `new Pieces.ModelApi().modelSpecificModelDownload()`
- How to create the progress element's `id` from the model's `id` value so we can reference it uniquely
- How to delete a model that is already downloaded, using `new Pieces.ModelsApi().modelsDeleteSpecificModelCache()`
Creating Model Download Buttons
Since we created the radio buttons with a few helpful qualities, like `id`s and the ability to use `appendChild` to attach elements later on, we can create a powerful flow that either allows for a download of the Phi-2 model or the deletion of that same model.
Take a look at this snippet and its comments:
```typescript
// Phi-2 CPU
const phi2CpuRadio: HTMLInputElement | null = document.getElementById("phi2-cpu-radio") as HTMLInputElement | null;

if (!phi2Cpu?.downloaded) {
  phi2CpuRadio?.setAttribute('disabled', 'true');

  // download container where we store the button for downloading.
  const downloadPhi2CpuContainer = document.createElement("div");
  modelDownloadsContainer.appendChild(downloadPhi2CpuContainer);

  // the button that performs the download action itself.
  const downloadPhi2CpuButton = document.createElement("button");
  downloadPhi2CpuButton.innerText = "Download Phi-2 CPU";
  downloadPhi2CpuContainer.appendChild(downloadPhi2CpuButton);

  // download action.
  downloadPhi2CpuButton.onclick = () => {
    new Pieces.ModelApi().modelSpecificModelDownload({ model: phi2Cpu.id });
  };

  // append the container where we show the status of the download.
  const phi2CpuDownloadProgress = document.createElement("div");
  downloadPhi2CpuContainer.appendChild(phi2CpuDownloadProgress);

  // this id is how we identify the correct download progress box and get
  // updates from the webhook.
  phi2CpuDownloadProgress.id = `download-progress-${phi2Cpu.id}`;
} else {
  // if the model is already downloaded, the delete button is added instead,
  // so the model files can be deleted to free up space or to re-download.
  const deletePhi2CpuButton = document.createElement("button");
  modelDownloadsContainer.appendChild(deletePhi2CpuButton);
  deletePhi2CpuButton.innerText = 'Delete Phi-2 (CPU)';

  // the delete button takes the model's id and removes the model files
  // through this Pieces.ModelsApi usage.
  deletePhi2CpuButton.onclick = () => {
    new Pieces.ModelsApi().modelsDeleteSpecificModelCache({
      model: phi2Cpu.id,
      modelDeleteCacheInput: {},
    }).then(() => {
      window.location.reload();
    });
  };
}
```
Now that the buttons are added, refresh your page and you’ll see the "Download Phi-2 CPU" button. Go ahead and let that download finish; you can watch its progress with the percentage updates coming from the `ModelProgressController`, which is covered in depth in the article How to Build a Copilot Using Local Large Language Models with Pieces Client.
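For reference, the update handler ends up doing something along these lines. This is a simplified sketch rather than the controller’s actual code, and the `percentage` field on the event is our assumption here; see the linked article for the real implementation:

```typescript
// Hedged sketch of a download-progress callback: look up the progress <div>
// we created with the model's id, and write the latest update into it.
function onDownloadProgress(modelId: string, event: { percentage?: number }) {
  const progressEl = document.getElementById(`download-progress-${modelId}`);
  if (progressEl && event.percentage != null) {
    progressEl.innerText = `Downloading: ${event.percentage}%`;
  }
}
```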
At this point, you can turn your Wi-Fi off if you would like, as I am doing for the remainder of this article, to show the offline power and capability of a lightweight model in combination with the Pieces Client.
Using Phi-2 With Your Queries
We can take the model that was just downloaded and append it to the data we send when we ask the copilot a question, by passing the value of our selected model into the stream controller we set up. If you are building your own project, you can get the entire stream controller file; just add the `CopilotStreamController.tsx` file to the same folder as your `index.ts`.
Let's take a look at the specifics of asking a question with the Phi-2 models in `askQGPT`:
```typescript
const input: Pieces.QGPTStreamInput = {
  question: {
    query,
    // this is where the model itself is passed in.
    model: CopilotStreamController.selectedModelId
  },
};
```
The `CopilotStreamController.selectedModelId` value is updated when the download finishes by simply reloading the page, which allows the model to be detected when the DOM loads:
```typescript
// when a model completes downloading, we refresh the page so we can
// detect which models are now downloaded.
if (event.status === Pieces.ModelDownloadProgressStatusEnum.Completed) window.location.reload();
```
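If you would rather not force a full reload, a hedged alternative (our variation, not the example project’s code) is to flip the UI state in place when the Completed status arrives:

```typescript
// Sketch: on completion, enable the Phi-2 radio and swap in final progress
// text, instead of reloading the whole page.
if (event.status === Pieces.ModelDownloadProgressStatusEnum.Completed) {
  phi2CpuRadio?.removeAttribute('disabled');
  const progressEl = document.getElementById(`download-progress-${phi2Cpu.id}`);
  if (progressEl) progressEl.innerText = 'Downloaded';
}
```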
Alternatively, if you wanted to pass in your model ID in the same file as the stream input, you could pass your ID directly to the model parameter:
```typescript
const input: Pieces.QGPTStreamInput = {
  question: {
    query,
    // this is where the model itself is passed in.
    model: phi2Cpu.id
  },
};
```
Now the model can be accessed and utilized by the `Pieces.QGPTStreamInput` when a question is asked, as long as the radio button is selected. Then you can start asking different questions and seeing the Phi-2 model's responses in this offline environment.
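If you’re curious what happens after `askQGPT` builds that input, here is a rough sketch of the send-and-render loop. The WebSocket URL and the output shape are assumptions based on the vanilla example project, not a guaranteed API surface; the `CopilotStreamController` file linked above is the canonical version:

```typescript
// Hedged sketch: open the QGPT stream, send the input, and append each
// answer fragment to an output element as it arrives.
const ws = new WebSocket('ws://localhost:1000/qgpt/stream'); // port assumed from the example project
const answerEl = document.getElementById('answer-output');   // hypothetical output <div>

ws.onopen = () => ws.send(JSON.stringify(input));
ws.onmessage = (msg) => {
  const result: Pieces.QGPTStreamOutput = JSON.parse(msg.data);
  // each message may carry a partial answer; append it as it streams in.
  const fragment = result.question?.answers?.iterable?.[0]?.text;
  if (fragment && answerEl) answerEl.innerText += fragment;
};
```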
Conclusion
Now that you have seen some of the basic functionality of the Pieces Client, downloading a model and then using it specifically in a chat question, you can start to understand its capabilities in an offline environment. To take this one step further, check out Adding Context to Copilot Conversations and reference your local data to craft more specific copilot responses. All of the above examples are available inside of the Pieces Copilot Vanilla Example Project.
Thanks for reading along, and be sure to stay up to date with the newest blogs on using the Pieces Client and communicating with your local copilot after you check out the rest of the series and join our open source program!