
The Livebook Blog


announcements
a year ago

Welcoming Tigris as a new sponsor of Livebook!

We're thrilled to announce Tigris as the latest Livebook sponsor!

Tigris is a globally distributed S3-compatible object storage service that provides low latency anywhere in the world.

We've prepared a short demo showcasing the Tigris API through Livebook (of course 😉) to highlight its capabilities.

Quick intro to Tigris

You'll find Tigris intuitive if you're familiar with other object storage services like AWS S3. Here's an example of saving an object to a Tigris bucket:

ExAws.S3.put_object("my_tigris_bucket_name", "cat.jpeg", image)
|> ExAws.request!()

Notice we're using an AWS S3 client. Thanks to Tigris's S3-API compatibility, transitioning your code requires minimal changes.
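Those minimal changes are mostly configuration: you point the S3 client at Tigris's endpoint instead of AWS. Here's a sketch of what that can look like with ExAws (the credential environment variable names and options shown are illustrative; check the ExAws and Tigris docs for your setup):

```elixir
# config/config.exs (sketch; values are illustrative)
import Config

# Point the ExAws S3 client at Tigris's S3-compatible endpoint.
config :ex_aws, :s3,
  scheme: "https://",
  host: "fly.storage.tigris.dev",
  region: "auto"

# Reuse AWS-style credentials, issued by Tigris.
config :ex_aws,
  access_key_id: System.get_env("AWS_ACCESS_KEY_ID"),
  secret_access_key: System.get_env("AWS_SECRET_ACCESS_KEY")
```

With that in place, the `ExAws.S3.put_object/3` call above works unchanged.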

Let's say the object is uploaded to a public bucket. Retrieving it is then just a simple HTTP GET:

object_url = "https://fly.storage.tigris.dev/my_tigris_bucket_name/cat.jpeg"
Req.get!(object_url)

Simple, right? Now for the real game-changer…

Imagine uploading a file from São Paulo (Brazil). The object is stored automatically in the nearest Tigris region, GRU (São Paulo).

When a user in Mumbai (India) requests the object, it's initially served from GRU. However, Tigris automatically caches the object in the region closest to the request's origin, significantly improving subsequent access times — all without extra configuration!

Here are some real numbers for that example:

Request origin   Served from             Response time (ms)
gru (São Paulo)  gru (São Paulo)         3.018
bom (Mumbai)     gru (São Paulo)         1207.364
bom (Mumbai)     sin (Singapore)         141.98
bom (Mumbai)     sin (Singapore)         134.972
bos (Boston)     gru (São Paulo)         1393.971
bos (Boston)     iad (Washington, D.C.)  29.399
bos (Boston)     iad (Washington, D.C.)  21.344

Tigris combines object storage with CDN capabilities. That means great UX for end-users and great DX for developers.

Let's look at a demo notebook that showcases this.

Demo

In our demo, we use Livebook's integration with MapLibre to plot the HTTP request and response locations on a map. We also log the response times in a table.

Additionally, we leveraged the Livebook Apps feature to visualize the notebook as an app.

Here's the result:


Wrap-up

Tigris brings some cool innovations to the object storage space. For example, here's how you could use it as a global key-value store from Elixir.

If you want to try using Tigris with Elixir, they have official Elixir support.

announcements
a year ago

Livebook is sponsoring SpawnFest again

We’re happy to announce that we’re sponsoring SpawnFest for the second time!

SpawnFest is an annual 48-hour online software development contest in which teams worldwide get one weekend to create the best BEAM-based applications they can.

We are collaborating with them again to create a bracket of the event dedicated to Livebook projects. Last year, most of the entries involved Kino and Smart Cells, but this year we are changing things up. Our SpawnFest bracket this year will focus on Livebook Apps.

Livebook Apps is a feature we launched earlier this year that enables you to turn your notebook into an interactive web application.

What you can build with Livebook Apps

Since Livebook is a general-purpose tool, there aren’t many constraints on what you can build with Livebook Apps; it’s up to your imagination. But here are some examples to give you some inspiration.

A multi-user real-time app

Livebook comes with real-time collaboration out of the box. You can use that power in your Livebook apps as well.

Here’s a video showing how to build a chat app.
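As a sketch of the core idea (not the exact app from the video; the message formatting and dependency version are illustrative), a minimal shared chat in a notebook needs little more than a Kino form and a shared frame:

```elixir
# Runs inside a Livebook notebook.
Mix.install([{:kino, "~> 0.10"}])

# A frame is a shared output: appends are pushed to every connected user.
frame = Kino.Frame.new()

# A form with a single text input and a submit button.
form = Kino.Control.form([message: Kino.Input.text("Message")], submit: "Send")

# On every submit, append the message to the shared frame,
# tagged with the submitting client's id.
Kino.listen(form, fn %{data: %{message: message}, origin: origin} ->
  Kino.Frame.append(frame, Kino.Markdown.new("**#{origin}**: #{message}"))
end)

Kino.render(form)
frame
```

Because the frame is shared, every user connected to the app sees new messages in real time.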


A machine learning app

Livebook integrates with Hugging Face to bring you pre-trained models and helps you use them with just a few clicks.

Here’s a video showing how to build an app that uses the Whisper machine learning model to transcribe audio messages in a chat.


A data visualization app

Livebook integrates with VegaLite and MapLibre to make it super easy to build many different kinds of data visualizations.

Here’s an example of an app that plots a chart with the number of stars a GitHub project got over time.

This app is deployed, so you can try it live. And here is the source code.

A workflow automation app

One month ago, we expanded Livebook Apps with a feature called Multi-Session Livebook Apps. We believe that this new feature is excellent for the automation of technical and business workflows.

You can think of it as a way to transform a script into a UI and share it with others just by sharing a URL.

Here’s a video showing how to build an app that gets data from GitHub’s API, generates a report, and sends it to a Slack channel.


How to deploy a Livebook App

Livebook Apps run inside a Livebook instance, so you can run them in any Livebook installation, whether on localhost or in the cloud.

One of the easiest ways to install Livebook in the cloud is by using Docker. Let's look at two examples of deploying a Livebook App this way.

How to deploy a Livebook App to Fly.io

First, make sure you have Fly’s command-line tool, flyctl, installed on your machine.

Now, clone the following template repo:

git clone https://github.com/hugobarauna/livebook-apps-on-fly-template.git my-livebook-apps

Add a file with the source code of your Livebook App to the public-apps/ directory of your repository.

Then, follow Fly’s instructions to Deploy via Dockerfile.

After that, Livebook and your Livebook App will be running inside Fly.
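Under the hood, that template deploys with a Dockerfile. As a rough sketch of the pattern (the template repo's actual Dockerfile may differ), it extends the official Livebook image and copies your apps in:

```dockerfile
# Sketch of a Dockerfile for serving Livebook Apps;
# details may differ from the template repo's actual file.
FROM ghcr.io/livebook-dev/livebook

# Tell Livebook to serve every notebook in this directory as an app.
ENV LIVEBOOK_APPS_PATH=/public-apps

COPY public-apps/ /public-apps
```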

Here’s a video showing how that works.


How to deploy a Livebook App to Hugging Face

To deploy a Livebook App to Hugging Face, we’ll use Hugging Face Docker Spaces.

First, install Livebook on Hugging Face by following these instructions.

Second, add a file with the source code of your Livebook App to the public-apps/ directory of your Space and make a commit.

After that, Hugging Face will rebuild your Space, and your Livebook app will be deployed.

Here’s a video showing how that works.


How to participate in SpawnFest

Participation in SpawnFest is 100% free of charge. To register, go to their website and follow the instructions.

We’re looking forward to seeing what you will build with Livebook Apps!

announcements
2 years ago

Livebook inside Hugging Face Spaces

We are thrilled to introduce Livebook on Hugging Face Spaces! 🎉

Each Hugging Face Space environment offers up to 16GB of RAM and 2 CPU cores for free.

If you’re unfamiliar with Hugging Face (HF), it’s a platform for building, sharing, and collaborating on machine learning applications.

This is our second integration with Hugging Face; the first was through Bumblebee, which brings pre-trained neural network models from Hugging Face to the Elixir community.

We’ve been collaborating with HF to make using Livebook on Spaces a breeze. If you have a Hugging Face account, simply click the one-click deployment button below:

Alternatively, follow the step-by-step tutorial to install Livebook in a Hugging Face Space.

Livebook with GPU acceleration on Hugging Face Spaces

One of the great features of HF Spaces is its ease of upgrading to hardware with GPU accelerators. This is particularly useful for Machine Learning applications, which can be highly parallelized and computationally intensive.

Check out this video to see how easy it is to run Stable Diffusion with Livebook on top of a Hugging Face Space powered by a GPU:

Ready to give it a try for yourself? Start experimenting with Livebook on Hugging Face Spaces now.

announcements
2 years ago

Announcing Bumblebee: GPT2, Stable Diffusion, and more in Elixir

We are glad to announce that a variety of neural network models are now available to the Elixir community via the Bumblebee project.

We have implemented several models, from GPT2 to Stable Diffusion, in pure Elixir, and you can download pre-trained parameters for these models directly from Hugging Face.

To run your first Machine Learning model in Elixir, all you need is three clicks, thanks to our integration between Livebook and Bumblebee. You can also easily embed and run these models within existing Elixir projects.

Watch the video by José Valim covering all of these topics and features:

Running Machine Learning models with Livebook

Here are some examples of what it looks like to use Livebook to run Machine Learning models:

Text to image

Image classification

Text classification

Text generation

Incorporating models into any Elixir project

Thanks to the new Nx.Serving functionality, you can incorporate these models into any existing project with little effort and run them at scale.

You can embed and serve these models as part of your existing Phoenix web applications, integrate them into data processing pipelines with Broadway, and deploy them alongside Nerves embedded systems, all without third-party dependencies.
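As a sketch of the pattern (the module name MyApp.ImageServing and the build_serving/0 helper are placeholders, and the batch size is illustrative), a serving can be started under your application's supervision tree and then called from anywhere in the app:

```elixir
# In your application module (sketch).
# build_serving/0 would construct a Bumblebee serving, e.g. via
# Bumblebee.Vision.image_classification/2.
def start(_type, _args) do
  children = [
    # Runs the serving as a named process, batching concurrent requests.
    {Nx.Serving, serving: build_serving(), name: MyApp.ImageServing, batch_size: 8}
  ]

  Supervisor.start_link(children, strategy: :one_for_one, name: MyApp.Supervisor)
end

# Anywhere in the app, e.g. a Phoenix controller or LiveView:
# Nx.Serving.batched_run(MyApp.ImageServing, image_tensor)
```

Running the serving as a process means concurrent requests from across the application are automatically batched together before hitting the model.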

Here’s an example of using Bumblebee to run an image classification model inside a Phoenix app:

Integration with Hugging Face

Bumblebee allows you to download and use trained models directly from Hugging Face. Let’s see an example of how we can generate text continuation using the popular GPT-2 model:

{:ok, gpt2} = Bumblebee.load_model({:hf, "gpt2"})
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "gpt2"})

serving = Bumblebee.Text.generation(gpt2, tokenizer, max_new_tokens: 10)

text = "Yesterday, I was reading a book and"
Nx.Serving.run(serving, text)

#=> %{
#=>    results: [
#=>        %{
#=>            text: "Yesterday I was reading a book and I was thinking, 'What's going on?'"
#=>        }
#=>    ]
#=> }

As you can see, we just load the model data and use a high-level function designed for the text generation task. That’s it!

Compiling to CPU/GPU

All of our neural networks can be compiled to the CPU or GPU, thanks to projects such as EXLA (based on Google’s XLA) and Torchx (based on LibTorch).

A massive milestone in our Numerical Elixir effort

This release is a huge milestone in our Numerical Elixir effort, which started almost two years ago. These new features, in particular, are possible thanks to the enormous efforts of José Valim, Jonatan Kłosko, Sean Moriarity, and Paulo Valente. We are also thankful to Hugging Face for enabling collaborative Machine Learning across communities and tools, which played an essential role in bringing the Elixir ecosystem up to speed.

Next, we plan to focus on training and transfer learning of Neural Networks in Elixir, allowing developers to augment and specialize pre-trained models according to the needs of their businesses and applications. We also hope to publish more on our progress in developing traditional Machine Learning algorithms.

Your turn

If you want to give Bumblebee a try, you can:

  • Download Livebook v0.8 and automatically generate “Neural Networks tasks” from the “+ Smart” cell menu inside your notebooks.
  • We have also written single-file Phoenix applications as examples of Bumblebee models inside Phoenix (+ LiveView) apps. Those should provide the necessary building blocks to integrate Bumblebee into your production app.
  • For a more hands-on approach, read some of our notebooks.

If you want to help us build the Machine Learning ecosystem for Elixir, check out the projects above, and give them a try. There are many exciting areas, from compiler development to model building. For instance, pull requests that bring more models and architectures to Bumblebee are welcome. The future is concurrent, distributed, and fun!

announcements
2 years ago

Introducing the Livebook Desktop app 🎉

We want Livebook to be accessible to as many people as possible. Before this release, installing Livebook on your machine was reasonably easy, especially if you already had Elixir installed.

But imagine someone who's not an Elixir developer. They had to install either Docker or Elixir before getting started with Livebook. And for someone taking their first steps as a developer, even using a terminal can be demanding.

That's why we built Livebook desktop. It's the simplest way to install Livebook on your machine.

Livebook desktop app

Livebook desktop doesn't require Elixir to be previously installed on the machine. And it works on both Mac and Windows.

We hope that the desktop app enables way more people to try Livebook. 

For example, let's say you want to help a friend start learning Elixir. You can tell them to download Livebook desktop and follow the built-in "Distributed portals with Elixir" notebook, a fast-paced introduction to Elixir.

Or take a data scientist who has heard about Livebook and wants to try it. They can download Livebook desktop and learn how to do data visualization in Livebook with the built-in "Plotting with VegaLite" and "Maps with MapLibre" notebooks.

If you want to try the Livebook desktop app, it's just one click away.

We hope you enjoy it! 😊