local meeting summaries using screenpipe + ollama

Louis Beaumont - Sep 19

hey devs,

i've been working on a tool called screenpipe that lets you generate meeting summaries locally, even for whatsapp calls. thought i'd share my setup in case it's useful to anyone else.

the problem

taking notes during meetings is a pain, and most transcription services send your data to the cloud. i wanted something that could run entirely on my macbook.

the setup

here's what i'm using:

  1. screenpipe: open-source tool for continuous screen/audio capture, written in rust, cross-platform
  2. ollama: runs llms locally
  3. solar pro 22b: the language model i'm using via ollama (quick sanity check right after this list)
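
before wiring anything up, it helps to sanity check that ollama is actually running and the model is pulled. this is just a quick sketch: it assumes ollama's default local port (11434) and that the model shows up under a tag containing "solar-pro" (that's the name in the ollama library, if memory serves); adjust to whatever `ollama list` shows on your machine.

// quick check: is ollama up, and is the solar pro model pulled?
// assumes ollama's default port 11434 and a model tag containing "solar-pro"
const res = await fetch("http://localhost:11434/api/tags")
const { models } = await res.json()
const hasSolar = models.some((m: { name: string }) => m.name.includes("solar-pro"))
console.log(hasSolar ? "solar pro is ready" : "run `ollama pull solar-pro` first")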

how it works

  1. screenpipe captures my screen and audio 24/7
  2. after a meeting (zoom, whatsapp, whatever), i query the last hour of data
  3. send that data to solar pro 22b running locally via ollama
  4. get back a concise summary

here's some pseudo code to illustrate:

// roughly 1h ago
const startDate = new Date(Date.now() - 60 * 60 * 1000).toISOString()
// roughly 10m ago
const endDate = new Date(Date.now() - 10 * 60 * 1000).toISOString()

// get all the screen & mic data from roughly the last hour
const results = await fetchScreenpipe(startDate, endDate)

// send it to an LLM and ask for a summary
const summary = await fetchOllama(`${results} create a summary from these transcriptions`)

// add the meeting summary to your notes
await addToNotion(summary)
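
and if you want something a bit more concrete, here's roughly what those helpers could look like. fair warning: the screenpipe endpoint and parameter names below (localhost:3030/search with content_type, start_time, end_time) are from memory of the docs, so treat them as assumptions and double-check them before copying; the ollama part uses its standard /api/generate endpoint. i've left addToNotion out since that's just the notion api (or whatever notes tool you use).

// rough sketch: assumes screenpipe's local server on :3030 and ollama on :11434
// (endpoint/param names are best-effort, verify against the docs before using)

async function fetchScreenpipe(startDate: string, endDate: string): Promise<string> {
  const params = new URLSearchParams({
    content_type: "audio",   // we only want mic/speaker transcriptions here
    start_time: startDate,
    end_time: endDate,
    limit: "1000",
  })
  const res = await fetch(`http://localhost:3030/search?${params}`)
  const json = await res.json()
  // concatenate whatever transcription text comes back into one blob
  return json.data.map((item: any) => item.content?.transcription ?? "").join("\n")
}

async function fetchOllama(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "solar-pro", prompt, stream: false }),
  })
  const json = await res.json()
  return json.response
}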

why it's cool

  1. privacy: all data stays on your machine
  2. works for any app: zoom, teams, whatsapp, in-person meetings, etc.
  3. customizable: you can create plugins in TS to connect LLMs to your 24/7 screen & mic recordings (rough sketch of the idea right after this list)
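
to be clear, the snippet below isn't screenpipe's actual plugin api, just a rough illustration of the kind of loop a plugin could run, reusing the (hypothetical) helpers sketched above:

// rough illustration only: re-run the summarizer once an hour
async function summarizeLastHour() {
  const start = new Date(Date.now() - 60 * 60 * 1000).toISOString()
  const end = new Date(Date.now() - 10 * 60 * 1000).toISOString()
  const transcripts = await fetchScreenpipe(start, end)
  if (!transcripts.trim()) return // nothing captured, skip this round
  const summary = await fetchOllama(`${transcripts}\n\ncreate a concise meeting summary from these transcriptions`)
  console.log(summary) // or push it to notion, obsidian, wherever your notes live
}

setInterval(summarizeLastHour, 60 * 60 * 1000)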

try it out

if you want to experiment:

  1. install screenpipe: github.com/mediar-ai/screenpipe
  2. set up ollama with solar pro 22b
  3. use the api to query your data and generate summaries

curious if anyone else is working on similar local setups for meeting summaries. what has your experience been?
