BYOM(odel)

Pieces 🌟 · Dec 19 '23 · Dev Community

You’re underutilizing your OpenAI Key.

It’s clear that LLMs are the next frontier for nearly every industry, and copilots and other tools built on major LLMs are everywhere. We’ve found that these copilots are still limited in their functionality: they are great for auto-complete, generating basic boilerplate, pointing you in the right direction, or giving a general sense of the information you’re asking for. So yes, they can definitely save you time. But there are two key things they lack:

  1. Integration across your toolchain
  2. Context into what you are working on, also known as grounding

This is where Pieces for Developers comes in. We’ve talked a lot lately about how the Pieces Copilot is integrated across your whole workflow, running at the operating system level and using retrieval augmented generation to stay aware of everything you’re working on and make contextualized suggestions based on all your developer materials, such as files, repos, saved snippets, and websites. But recently, we started thinking about enterprise development, and how important context is to using AI productively inside a company’s development process.
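
To make the idea concrete, here is a minimal sketch of how retrieval augmented generation grounds a model’s answer in your own materials. It uses the OpenAI Python SDK purely for illustration; the `workflow_materials` list, the helper names, and the model choices are assumptions for the sketch, not Pieces’ actual implementation.

```python
# Minimal retrieval augmented generation sketch (illustrative only, not
# Pieces' internal implementation). Assumes the `openai` Python SDK v1+
# and an OPENAI_API_KEY in the environment.
from openai import OpenAI
import numpy as np

client = OpenAI()

# Hypothetical developer materials a copilot might index: files, snippets, notes.
workflow_materials = [
    "README: the billing service exposes POST /invoices and GET /invoices/{id}",
    "Snippet: retry_with_backoff wraps requests with exponential backoff",
    "Note: staging database credentials rotate every 30 days",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vectors = embed(workflow_materials)

def answer(question: str) -> str:
    # Retrieve the most relevant material by cosine similarity...
    q = embed([question])[0]
    scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    context = workflow_materials[int(scores.argmax())]
    # ...then ground the model's answer in that retrieved context.
    chat = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": f"Answer using this workplace context: {context}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

print(answer("How do I fetch a single invoice?"))
```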

We know that major organizations are utilizing OpenAI to customize models for their specific needs and use cases, particularly around privacy and scalability. They might be fine-tuning these models by incorporating specific knowledge and biases toward their unique data, feeding the model everything from code to emails. This can create a personalized and highly effective tool for your company, if the company’s developers are able to fully leverage it, which they likely aren’t.
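
For illustration, that kind of customization with OpenAI boils down to uploading a JSONL file of chat-formatted examples and starting a fine-tuning job. The file name and training examples below are hypothetical; this is a sketch of the general flow, not any particular enterprise’s setup.

```python
# Rough sketch of fine-tuning an OpenAI model on company data.
# Assumes the `openai` Python SDK v1+ and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Training data is JSONL, one chat-formatted example per line, e.g.:
# {"messages": [{"role": "user", "content": "How do we log errors?"},
#               {"role": "assistant", "content": "Use the internal obs.log_error helper."}]}
training_file = client.files.create(
    file=open("company_knowledge.jsonl", "rb"),  # hypothetical file
    purpose="fine-tune",
)

# Kick off a fine-tuning job; the resulting model id (something like
# "ft:gpt-3.5-turbo-1106:acme-corp::abc123") can then be used anywhere
# a base model id is accepted.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo-1106",
)
print(job.id, job.status)
```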

We also know developer workflows, and that there is a never-ending need for more, better, and faster when it comes to enterprise development. Our goal is to enable companies to fully leverage LLMs by bringing them as close to these workflows as possible, integrated deeply with the tools they are already using: their IDE, their browser, the Pieces app, etc. With this integration comes deep context passed back to the enterprise’s LLM, making its communication more up-to-date and relevant.

With this in mind, we are excited to release the first version of Bring Your Own Model support for OpenAI.

Accessing BYOM in the Pieces Desktop App.

Fully Utilized Enterprise LLMs. Integrated and Contextualized.

When using Pieces Copilot today, you can choose which LLM you would like to use for processing. We currently offer several choices, both on-device (Llama 2, with Mistral coming soon) and in the cloud (OpenAI, PaLM 2). With BYOM, we aim to make our platform compatible with your specific LLM, whether it is fine-tuned or off-the-shelf, on-device or cloud-based. It goes without saying that all of this is supported by our privacy-focused implementation model.

With this first version, you can add your personal or enterprise API key from OpenAI, allowing you to use your own quota and rate limits. In our next release, the platform will accommodate various BYOM scenarios, with a significant emphasis on supporting those who have fine-tuned GPT models based on their organizational or team-specific requirements. This will all be done automatically through SSO.
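
Under the hood, bringing your own key simply means requests run through your credentials and your quota, and can target your organization’s fine-tuned model. A minimal sketch with the OpenAI Python SDK is below; the fine-tuned model id and the `COPILOT_MODEL` environment variable are hypothetical stand-ins, not part of the Pieces configuration.

```python
# What "bring your own key" boils down to: requests are billed against your
# own OpenAI quota and can target your own fine-tuned model.
# Assumes the `openai` Python SDK v1+.
import os
from openai import OpenAI

# Your personal or enterprise key, e.g. loaded from a secrets manager.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    # Either an off-the-shelf model or your organization's fine-tuned one
    # (the fallback id below is a hypothetical example).
    model=os.environ.get("COPILOT_MODEL", "ft:gpt-3.5-turbo-1106:acme-corp::abc123"),
    messages=[{"role": "user", "content": "Summarize our retry policy for HTTP clients."}],
)
print(response.choices[0].message.content)
```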

When you or your company selects a custom LLM, the Pieces Copilot will use this refined model when providing contextual workflow assistance. This all-important context makes everything you do more effective for the problem you are trying to solve.

There are a few more features coming up in Pieces that we think enterprise customers are going to love. Our “related people” feature will point you to subject matter experts for all of your snippets. Curated snippet databases will speed up new hires’ paths to productivity immensely and help set standards for code quality. Snippet Discovery will help developers get up to speed with any project they walk into. The Feed will get to know your workflow and start to pull in relevant materials and standards from your team to assist. We are continually growing our workflow and collaboration tools and can’t wait to see development teams try them out.

So far, Pieces for Developers has largely been a tool for the everyday developer, and it continues to be one; everything we’ve discussed today can be used on an individual basis as well. But we are starting to think bigger: what can Pieces do to accelerate large enterprise development teams, just like we’ve done for thousands of developers across the globe?
