Build and deploy a Whisper chat app to Hugging Face in 15 minutes - Launch Week 1 - Day 4
Welcome to the fourth day of Livebook Launch Week! 🎉
Today we will build a Whisper chat app with Livebook and deploy it to Hugging Face.
In this chat app, users can communicate only by sending audio messages, which are then automatically converted to text by the Whisper Machine Learning model:
This demo covers a lot of cool features from Livebook, Nx, and Bumblebee, like:
Deploy Livebook notebooks as apps
On day 1 of our Launch Week, we saw how to deploy your notebook as an interactive web app. This time we’ll go further and deploy a Machine Learning-based app to Hugging Face.
Realtime and multiplayer apps
Livebook’s modern approach to code notebooks brings interactivity and collaboration. So, apps built with Livebook are also interactive and multiplayer by default! The Whisper chat app we’ll build is an excellent example of that.
Concurrent Machine Learning
On day 2 of our Launch Week, we talked about the new Machine Learning models you can use with just a few clicks through Livebook’s Neural Network Smart Cell. In today’s video, you’ll see how to customize the code generated by the Smart cell and build an app on top of it.
We’ll add concurrency and batching to improve the app’s scalability and performance, allowing multiple users to submit audio messages simultaneously and efficiently process them using the Whisper model.
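To give a feel for what that batching looks like, here's a minimal sketch using Bumblebee and Nx.Serving. The model name, batch size, and timeout are illustrative assumptions, not the notebook's exact code:

```elixir
# Load Whisper from Hugging Face (model name is illustrative).
{:ok, model_info} = Bumblebee.load_model({:hf, "openai/whisper-tiny"})
{:ok, featurizer} = Bumblebee.load_featurizer({:hf, "openai/whisper-tiny"})
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "openai/whisper-tiny"})
{:ok, generation_config} = Bumblebee.load_generation_config({:hf, "openai/whisper-tiny"})

# Build a serving that compiles the model for a fixed batch size,
# so several audio messages can be transcribed in one inference pass.
serving =
  Bumblebee.Audio.speech_to_text_whisper(
    model_info,
    featurizer,
    tokenizer,
    generation_config,
    compile: [batch_size: 4],
    defn_options: [compiler: EXLA]
  )

# A single named serving process collects concurrent requests and
# groups those that arrive within the batch window into one batch.
Nx.Serving.start_link(serving: serving, name: WhisperServing, batch_timeout: 100)

# Each chat message then calls batched_run/2 with the user's audio file;
# callers block only for their own result.
Nx.Serving.batched_run(WhisperServing, {:file, path_to_audio})
```

The key idea is that the chat app's many users share one serving process, so GPU/CPU work is amortized across simultaneous messages instead of running one inference per user.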
Want to run the Whisper chat app by yourself?
First, ensure you’ve installed the latest version of Livebook.
Then, click the button below to run the notebook yourself:
If you have any comments or want to share what you’ve built using Livebook, you can tweet using the #LivebookLaunchWeek hashtag.
Stay tuned for the final announcement of the Livebook Launch Week!