
The Livebook Blog


announcements
a year ago

Welcoming Tigris as a new sponsor of Livebook!

We're thrilled to announce Tigris as the latest Livebook sponsor!

Tigris is a globally distributed S3-compatible object storage service that provides low latency anywhere in the world.

We've prepared a short demo showcasing the Tigris API through Livebook (of course 😉) to highlight its capabilities.

Quick intro to Tigris

You'll find Tigris intuitive if you're familiar with other object storage services like AWS S3. Here's an example of saving an object to a Tigris bucket:

ExAws.S3.put_object("my_tigris_bucket_name", "cat.jpeg", image)
|> ExAws.request!()

Notice we're using an AWS S3 client. Thanks to Tigris's S3-API compatibility, transitioning your code requires minimal changes.
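The only change needed is pointing the client at Tigris's endpoint. Here's a minimal sketch of that configuration, assuming you're using ExAws with credentials already set up (the host and `region: "auto"` values follow Tigris's public documentation):

```elixir
# config/runtime.exs (sketch): point ExAws's S3 client at Tigris's
# S3-compatible endpoint instead of AWS.
import Config

config :ex_aws, :s3,
  scheme: "https://",
  host: "fly.storage.tigris.dev",
  region: "auto"
```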

Suppose the object was uploaded to a public bucket; retrieving it is then a simple HTTP GET:

object_url = "https://fly.storage.tigris.dev/my_tigris_bucket_name/cat.jpeg"
Req.get!(object_url)

Simple, right? Now for the real game-changer…

Imagine uploading a file from São Paulo (Brazil). The object is stored automatically in the nearest Tigris region, GRU (São Paulo).

When a user in Mumbai (India) requests the object, it's initially served from GRU. However, Tigris automatically caches the object in the region closest to the request's origin, significantly improving subsequent access times — all without extra configuration!

Here are some real numbers for that example:

Request           Response                  Response time (ms)
gru (São Paulo)   gru (São Paulo)           3.018
bom (Mumbai)      gru (São Paulo)           1207.364
bom (Mumbai)      sin (Singapore)           141.98
bom (Mumbai)      sin (Singapore)           134.972
bos (Boston)      gru (São Paulo)           1393.971
bos (Boston)      iad (Washington, D.C.)    29.399
bos (Boston)      iad (Washington, D.C.)    21.344

Tigris combines object storage with CDN capabilities. That means great UX for end-users and great DX for developers.

Let's look at a demo notebook that showcases this.

Demo

In our demo, we use Livebook's integration with MapLibre to plot the HTTP request and response locations on a map. We also log the response times in a table.

Additionally, we leveraged the Livebook Apps feature to visualize the notebook as an app.

Here's the result:


Wrap-up

Tigris brings some cool innovations to the object storage space. For example, here's how you could use it as a global key-value store from Elixir.

If you want to try using Tigris with Elixir, they have official Elixir support.

releases, launch week
a year ago

Vim and Emacs key bindings - Launch Week 2 - Day 5

Welcome to the 5th day of Livebook Launch Week 2! 🎉

Have you ever found yourself instinctively trying to use Vim or Emacs shortcuts in Livebook, only to realize they weren’t supported, and felt frustrated? We hear you.

We’re excited to announce that now you can use Vim and Emacs key bindings while coding in Livebook.

Let’s see how it works.

Using Vim or Emacs key bindings in Livebook

Go to the settings page in Livebook and choose the “Key bindings” you prefer:

Once that’s done, you can use them in any Livebook cell with a code editor, like a code cell or a markdown cell. Here’s an example using Vim key bindings:

From now on, you can bring your Vim/Emacs muscle memory to Livebook for a more pleasant and productive experience. Thanks to Kenichi Nakamura, who contributed that feature.

What now?

To start using that new feature, install the latest version of Livebook and have fun!

More of Launch Week 2

  • Day 1: Remote execution Smart cell
  • Day 2: Speech-to-text with Whisper: timestamping, streaming, and parallelism, oh-my!
  • Day 3: Introducing File Integration
  • Day 4: Integration with Snowflake and Microsoft SQL Server

And if you want to discover everything that was added/changed in this release, here’s the changelog.

releases, launch week
a year ago

Integration with Snowflake and Microsoft SQL Server - Launch Week 2 - Day 4

Welcome to the 4th day of Livebook Launch Week 2! 🎉

Today’s post is about our two new database integrations: Snowflake and SQL Server.

Together with those new ones, Livebook now comes with built-in integrations for seven databases and data warehouses:

  • PostgreSQL
  • MySQL
  • SQL Server
  • SQLite
  • Google BigQuery
  • Amazon Athena
  • Snowflake

Let’s see how the new integrations work.

Connecting to Snowflake

The Snowflake integration is available through the Database Connection Smart cell. All you need to do is configure the Smart cell with your connection credentials, and you’ll be ready to start executing queries.

Let’s see how that works.

Under the hood, this is different from the other database integrations we already have. Livebook connects to Snowflake through ADBC (Arrow Database Connectivity), using the ADBC hex package, which contains ADBC bindings for Elixir.

The Arrow format is highly efficient, and it’s also integrated with Explorer, so now you can easily query a database and load the result into an Explorer dataframe.
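For instance, here's a sketch of querying a database over ADBC and loading the result into a dataframe (the driver options, URI, and table name are placeholders; the exact connection settings depend on your Snowflake setup):

```elixir
# Start an ADBC database and connection (options are illustrative).
{:ok, db} = Adbc.Database.start_link(driver: :snowflake, uri: "<snowflake-uri>")
{:ok, conn} = Adbc.Connection.start_link(database: db)

# Run a query and get the result back as an Explorer dataframe.
df = Explorer.DataFrame.from_query!(conn, "SELECT * FROM my_table LIMIT 100", [])
```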

Good news for those of us doing data analysis with Elixir and Livebook. 🎉

Connecting to Microsoft SQL Server

Connecting to SQL Server is also super simple. All you need to do is add a new Database Connection Smart cell, select SQL Server, and fill in your database credentials.

Here’s how it works.

This was a community contribution from Simon McConnell. Curious fact: the pull request contains less than 250 LOC! Cool, right?! 😎

What now?

To start playing with those new features, install the latest version of Livebook and have fun!

More of Launch Week 2

  • Day 1: Remote execution Smart cell
  • Day 2: Speech-to-text with Whisper: timestamping, streaming, and parallelism, oh-my!
  • Day 3: Introducing File Integration
  • Day 5: Vim and Emacs key bindings
releases, launch week
a year ago

Introducing File Integration - Launch Week 2 - Day 3

Welcome to the 3rd day of Livebook Launch Week 2! 🎉

Today, we’re excited to announce Livebook v0.11’s most prominent feature, file integration.

Although file integration may not sound exciting at first glance, bear with us; we have some elegant features to show.

Imagine code being generated based on the file type when you drag and drop a file. Or lazily reading a multi-gigabyte Parquet file stored in S3. That’s what we’re talking about! 😎

Watch the video where José Valim shows a demo of what this feature can do.


Why?

Over the last few releases, we have been improving how Livebook helps you work with data.

Data comes from different sources. One common source is a database, which Livebook already has good support for. Another source is a file, either on your machine or on S3, which Livebook hadn’t natively integrated with until now.

Don’t get us wrong, you could already write some Elixir code in your notebook that reads a file from the file system or the web. But the Livebook UI had no understanding of it. And that’s what changed in this release.

What’s the Livebook file integration?

The concept is simple. When you’re editing a notebook, you can add a file to it. All files are listed in the sidebar, so everyone using your notebook can quickly glance at your file dependencies. Once you do that, Livebook exposes an API for your notebook to read that file via the Kino.FS module.
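Reading an added file from your code might look like this (the file name is hypothetical):

```elixir
# Resolve the local path of a file added to the notebook's sidebar,
# regardless of whether it's an attachment or a reference.
path = Kino.FS.file_path("data.csv")

# From here it's a regular file on disk.
File.read!(path)
```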

But that’s not the only thing. Now that Livebook natively understands files, it can offer code suggestions based on the file type.

For example, when you drag and drop a CSV file to Livebook, it can generate code that creates an Explorer Dataframe from that data:

Or when you drag and drop a SQLite database file, Livebook can generate code that describes that database:

Or when you drag and drop an audio file, Livebook can generate code that uses a Machine Learning model to generate a transcription for you:

One thing you’ll notice in common with all those drag-and-drop examples is that Livebook doesn’t automatically execute the code. Instead, it emits the code and allows you to run it yourself.

This may look like a small detail, but it’s part of our vision for Livebook. We don’t want it to be a magic box that executes tasks behind the scenes without you knowing what they are. We want to enable you to introspect the code, learn from it, and customize it to your needs.

We took that approach with Smart cells. And that’s the approach we’re continuing to apply.

File references and file attachments

When you start adding files to your notebook, you’ll notice that there are two types of files: References and Attachments.

References are files that point to existing resources on your disk, in remote storage, or at a URL. It’s like a symbolic link; Livebook doesn’t store the file, it just keeps a pointer to it.

Attachments are different. They are files stored in the files/ directory alongside your notebook source. So, a file attachment will be kept wherever you save your notebook.

So, if you’re storing a file in an S3-compatible cloud service, in a web server, or if you want to save it and version it alongside your notebook source, Livebook has you covered.

Lazy data reading and manipulation

When you have a dataset with multiple gigabytes of data, you don’t want to load all of it at once to the memory. A common approach for that is lazy loading the data. And this new Livebook release supports that workflow as well.

For example, let’s say your dataset is stored as a Parquet file in S3. When you add that file to your notebook, Livebook will give the option to load that data into a dataframe lazily:

Not only that, but you can also lazily manipulate that data:

When needed, Livebook will download all the data, such as when plotting a chart based on it or exporting it after some data wrangling:


All this means that, now more than ever, you can use Livebook to work with multiple gigabytes of data without needing a machine with dozens of gigabytes of memory.
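The lazy workflow described above can be sketched like this (the file and column names are hypothetical; `lazy: true` is Explorer's option for deferred execution):

```elixir
# `require ... as:` both requires the module (its query functions are macros)
# and aliases it.
require Explorer.DataFrame, as: DF

# Lazily scan the Parquet file; no data is loaded into memory yet.
df = DF.from_parquet!("dataset.parquet", lazy: true)

# Lazy operations only build up a query plan.
summary =
  df
  |> DF.filter(amount > 100)
  |> DF.group_by("category")
  |> DF.summarise(total: sum(amount))

# The data is only read and computed when collected.
DF.collect(summary)
```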

What now?

Ready to take the new Livebook for a spin?

Install the latest version, drop one of those files into a notebook, and see what happens!

More of Launch Week 2

  • Day 1: Remote execution Smart cell
  • Day 2: Speech-to-text with Whisper: timestamping, streaming, and parallelism, oh-my!
  • Day 4: Integration with Snowflake and Microsoft SQL Server
  • Day 5: Vim and Emacs key bindings
releases, launch week
a year ago

Speech-to-text with Whisper: timestamping, streaming, and parallelism, oh-my! - Launch Week 2 - Day 2

When we announced Bumblebee, a collection of pre-trained models inspired by Hugging Face Transformers, the Whisper speech-to-text model quickly became one of the favorite and most used models within the Elixir community.

Thanks to advancements in the overall Numerical Elixir ecosystem, Livebook v0.11 includes a highly improved integration with Whisper, which we will detail in this article.

If you want to skip ahead and give it a try, install Livebook and start a new notebook. Then click “+ Smart cell” and choose “Neural Network task.” You will find Whisper as Speech-to-text under Audio.


New features

There are three new features in our Whisper integration:

  1. Timestamping: we now include timestamps on audio segments.
  2. Streaming: our previous version of Whisper was limited to 30 seconds of audio, leaving it up to users to break their audio apart. This new version is capable of streaming both inputs and outputs. You can give arbitrarily long files to the model, which will be streamed as input, and the model will proceed to merge and stream transcriptions as they arrive.
  3. Parallelism: in addition to streaming, files longer than 30 seconds will be split and batched according to the Neural Network batch size. For example, with a batch size of 10, up to 5 minutes of audio can be processed in parallel. Thanks to this, we expect our models to perform inference an order of magnitude faster compared to OpenAI’s implementation when transcribing larger files on the GPU.
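In Bumblebee terms, the serving behind the Smart cell looks roughly like this (the model choice and file name are illustrative; the option names follow Bumblebee's speech-to-text API):

```elixir
repo = {:hf, "openai/whisper-tiny"}

{:ok, model_info} = Bumblebee.load_model(repo)
{:ok, featurizer} = Bumblebee.load_featurizer(repo)
{:ok, tokenizer} = Bumblebee.load_tokenizer(repo)
{:ok, generation_config} = Bumblebee.load_generation_config(repo)

serving =
  Bumblebee.Audio.speech_to_text_whisper(
    model_info,
    featurizer,
    tokenizer,
    generation_config,
    # Split long audio into 30-second chunks that can be batched in parallel.
    chunk_num_seconds: 30,
    # Include timestamps for each transcribed segment.
    timestamps: :segments,
    # Stream transcriptions as they are produced instead of waiting for the end.
    stream: true
  )

Nx.Serving.run(serving, {:file, "episode.mp3"})
```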

Of course, all of those features work together, providing a delightful experience, as you can see below, where we transcribe one of the Thinking Elixir episodes on the fly:

When you combine the features above with Nx’s ability to run neural networks distributed across multiple machines and GPUs, Elixir developers now have a first-class, state-of-the-art, speech-to-text model ready to run, enjoy, and scale.

What now?

Try for yourself!

Transcribe an audio file using our built-in Neural Network Task Smart cell. Maybe start with a small file to quickly see the result. Then you can try a bigger one.

Download the latest Livebook version and have fun!

More of Launch Week 2

  • Day 1: Remote execution Smart cell
  • Day 3: Introducing File Integration
  • Day 4: Integration with Snowflake and Microsoft SQL Server
  • Day 5: Vim and Emacs key bindings
releases, launch week
a year ago

Remote execution Smart cell - Launch Week 2 - Day 1

Welcome to the second Livebook Launch Week! 🎉

If this is your first Launch Week with us, let us explain how this works. Starting today, each day of this week, we’ll announce a new feature of this release, Livebook v0.11.

Today, we’ll discuss our newest built-in Smart cell: Remote execution.

Calling code from other nodes before the new Smart cell

Livebook is a multi-purpose tool. And two of the most common use cases we’ve been seeing are debugging and internal tools.

A common requirement for both use cases is to write code that calls functions from a remote node, probably your main Elixir app in the production environment.

One way to do that is by connecting your notebook to the other node using the “Attached node” runtime:

That said, that approach has a limitation. When you use it, all of the code in your notebook will run in the context of that remote node. So, you can’t add Mix dependencies to your notebook, like VegaLite for charts or Kino for user interactions.

The other way we have to run remote functions is by using the default runtime (Elixir standalone) and using the erpc Erlang module, like this:
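That approach looks something like this (the node name, cookie, and remote function are hypothetical):

```elixir
node = :"my_app@prod-host"

# Authenticate and connect to the remote node.
Node.set_cookie(node, :my_secret_cookie)
true = Node.connect(node)

# Call a function on the remote node via Erlang's :erpc module; the result
# comes back to the notebook's own runtime.
:erpc.call(node, MyApp.Accounts, :count_users, [])
```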

But, we noticed people have been using the “Attached node” by default when calling a function from a remote node, even when they need to add dependencies to their notebook. That’s a problem because although the “Attached node” setting is powerful for debugging, we believe that most of the time you need to interact with a remote node, you should use the erpc approach.

So, we decided to make the erpc approach a first-class citizen in Livebook. It’s now available as a new Smart cell that will streamline the process of calling functions from another node.

Calling code from other nodes using the Remote execution Smart cell

Here’s how the new Smart cell works:

As with any other Smart cell, one of the things it helps with is generating boilerplate code for you. There is no need to write the code that connects your notebook to a remote node over and over again.

But it goes beyond that.

The Smart cell can store the remote node’s cookie as a Livebook secret, so you don’t need to hardcode sensitive values inside your notebook. And with the upcoming Livebook Teams, it will be easy to share this secret with everyone on the team in a secure manner.

Lastly, this new Smart cell gives you autocompletion and docs preview for the modules defined in the remote node! Isn’t that awesome?! 🤯

We hope this feature will simplify the development of internal tools that need to communicate with existing nodes.

As an example, here’s a video showing how to use the new feature to connect a Livebook notebook to a Phoenix app, get some data from it, display a chart with those metrics, and share that as an interactive Livebook app:


What now?

All of the new features of this new release are already available today!

To start playing with them, install the latest version of Livebook and have fun!

More of Launch Week 2

  • Day 2: Speech-to-text with Whisper: timestamping, streaming, and parallelism, oh-my!
  • Day 3: Introducing File Integration
  • Day 4: Integration with Snowflake and Microsoft SQL Server
  • Day 5: Vim and Emacs key bindings
announcements
a year ago

Livebook is sponsoring SpawnFest again

We’re happy to announce that we’re sponsoring Spawnfest for the second time!

SpawnFest is an annual 48-hour online software development contest in which teams worldwide get one weekend to create the best BEAM-based applications they can.

We are collaborating with them again to create a bracket of the event dedicated to Livebook projects. Last year, most of the entries involved Kino and Smart Cells, but this year we are changing things up. Our Spawnfest bracket this year will focus on Livebook Apps.

Livebook Apps is a feature we launched earlier this year that enables you to turn your notebook into an interactive web application.

What you can build with Livebook Apps

Since Livebook is a general-purpose tool, there aren’t many constraints on what you can build with Livebook Apps. So, it’s up to your imagination. But here are some examples to give you some inspiration.

A multi-user real-time app

Livebook comes with real-time collaboration out of the box. You can use that power in your Livebook apps as well.

Here’s a video showing how to build a chat app.


A machine learning app

Livebook integrates with Hugging Face to bring you pre-trained models and helps you to use them with just a few clicks.

Here’s a video showing how to build an app that uses the Whisper machine learning model to transcribe audio messages in a chat.


A data visualization app

Livebook integrates with VegaLite and MapLibre to make it super easy to build lots of different kinds of data visualization.

Here’s an example of an app that plots a chart with the number of stars a GitHub project got over time.

This app is deployed, so you can try it live. And here is the source code.

A workflow automation app

One month ago, we expanded Livebook Apps with a feature called Multi-Session Livebook Apps. We believe that this new feature is excellent for the automation of technical and business workflows.

You can think of it as a way to transform a script into a UI and share that with others by just sharing a URL.

Here’s a video showing how to build an app that gets data from GitHub’s API, generates a report, and sends it to a Slack channel.


How to deploy a Livebook App

Livebook Apps run inside a Livebook instance, so you can run them in any Livebook installation, be it on localhost or in the cloud.

One of the easiest ways to install Livebook in the cloud is using Docker. Let’s see two examples of deploying a Livebook App this way.

How to deploy a Livebook App to Fly.io

First, make sure you have Fly’s command-line tool installed on your machine.

Now, clone the following template repo:

git clone https://github.com/hugobarauna/livebook-apps-on-fly-template.git my-livebook-apps

Add a file with the source code of your Livebook App to the public-apps/ directory of your repository.

Then, follow Fly’s instructions to Deploy via Dockerfile.

After that, Livebook and your Livebook App will be running inside Fly.

Here’s a video showing how that works.


How to deploy a Livebook App to Hugging Face

To deploy a Livebook App to Hugging Face, we’ll use Hugging Face Docker Spaces.

First, install Livebook on Hugging Face by following these instructions.

Second, add a file with the source code of your Livebook App to the public-apps/ directory of your Space and make a commit.

After that, Hugging Face will rebuild your Space, and your Livebook app will be deployed.

Here’s a video showing how that works.


How to participate in Spawnfest

Participation in SpawnFest is 100% free of charge. To register, go to their website and follow the instructions.

We’re looking forward to seeing what you will build with Livebook Apps!

releases
a year ago

What's new in Livebook 0.10 - Introducing Multi-Session Livebook Apps

Today we’re launching Livebook 0.10! 🎉

This major update brings many exciting features, with the spotlight being the introduction of multi-session Livebook apps.

We’ve also added a presentation view, initial Erlang support, Live Doctests, and dataframe file export. Let’s dive in and explore these new features.

Multi-Session Livebook Apps

Livebook 0.9 introduced Livebook apps. This is a way to turn your notebook into an interactive web application. Now, we’re expanding that further.

Initially, Livebook Apps was designed for long-running applications. Behind the scenes, only one instance of a Livebook app could run at any given moment. Since Livebook has built-in support for multiple users, all users accessing an app would be sharing the same instance of the app. We’re now calling that single-session Livebook apps.

This new version introduces multi-session Livebook Apps. What’s different is that when you join a multi-session application, you get a version of that app exclusively for you. Like single-session apps, multi-session apps can run for as long as they want, but most often, they will receive user input, execute several instructions, and then exit.

We believe they are an excellent fit for automating technical and business workflows. You can think of them as something similar to scripts, but instead of running in a terminal, they are interactive web applications accessed through the browser.

For example, instead of repeatedly being asked to run one-off scripts, you can package a script as a Livebook app and make it accessible for other team members to run, at any time, by themselves.

Let’s see how that works.

Presentation View

We noticed many people use Livebook for presentations. However, it can be frustrating to switch between Livebook and your slides. Also, showing the whole notebook can distract your audience. But there’s good news!

Franklin Rakotomalala contributed a Presentation view feature that hides the sidebar and focuses on the part of your notebook you want to present. Here’s how it works.

Shout out to Franklin!

Initial Erlang Support

You can now write Erlang code inside Livebook. Not only that, but you can combine it with Elixir in the same notebook. If you define a function or a variable inside one language, you can easily use it in the other.

Watch the video below for an example.

This was a community contribution by Benedikt Reinartz. Thanks to him, Livebook now supports multiple BEAM languages.

Live Doctests

Starting from version 0.8, doctests are integrated with Livebook. This new version comes with exciting improvements in that area.

When you evaluate a cell with doctests, a traffic light-like status appears on the left of each doctest. This gives you a visual indication if it passed or not. Additionally, when the doctest fails, the failing result is directly inlined in the code editor.

Here’s how it works:

This is a step towards bringing Live Programming concepts into Livebook, the idea that for “programming to be more fluid, editing and debugging should occur concurrently as you write code.”

This one started with the code contribution of Jose Vargas and the research work of Szymon Kaliski.

Dataframe File Export

When working on a data analysis task in Livebook, you might need to access the analysis results from another tool or share them with someone who prefers opening them in a spreadsheet.

Now, you can easily do that by exporting your dataframe to a CSV, NDJSON, or Parquet file. Here’s how it works:
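In code, the export boils down to Explorer's writer functions (the data and file paths here are examples):

```elixir
alias Explorer.DataFrame, as: DF

df = DF.new(category: ["a", "b"], amount: [10, 20])

# Export the dataframe in each supported format.
DF.to_csv(df, "result.csv")
DF.to_ndjson(df, "result.ndjson")
DF.to_parquet(df, "result.parquet")
```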

Try it!

To play with the new features, follow these steps:

  • Install the latest Livebook version
  • Import the demo notebook that showcases the new features by clicking the badge below

Run in Livebook

And if you want to discover everything that changed in 0.10, here’s the changelog.

Happy hacking!

releases, launch week
2 years ago

Data wrangling in Elixir with Explorer, the power of Rust, the elegance of R - Launch Week 1 - Day 5

Welcome to the fifth and last day of the first Livebook Launch Week!

Today we will talk about data wrangling with Livebook and the new data capabilities with this new release.

Watch the video where José Valim shows a demo of those new features.

You can also read an overview of the new features below.

Explorer: series and dataframes for fast data exploration in Elixir

Explorer is a project that brings series (one-dimensional data) and data frames (two-dimensional data) right into Elixir.

It is implemented on top of the Polars project, a highly performant dataframe Rust library. And it’s highly inspired by the dplyr project from R, which is quite expressive. We aim to bring ideas from both communities into Elixir to provide a powerful and elegant tool for data processing.

Let’s play with Explorer a little bit.

The first step of a data exploration project is to import the data. Let’s see how you can import a dataset into an Explorer data frame:
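A minimal sketch of that import (the file name is hypothetical):

```elixir
# Read a CSV file from disk into an Explorer dataframe.
df = Explorer.DataFrame.from_csv!("dataset.csv")
```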

After importing the data, it’s usually a good idea to explore it a little bit. To do that, we’ll use a new feature from Livebook that helps us visualize an Explorer Dataframe as a table.

Visualizing an Explorer dataframe as an interactive table

Kino is the library used by Livebook to render rich and interactive outputs directly from your Elixir code. Livebook has multiple built-in Kinos, but anyone can also build custom Kinos as a way to extend Livebook.

We built a new Kino called Kino Explorer for this release to improve the integration between Explorer and Livebook. Let’s see how we can use it to display an Explorer data frame as an interactive table:
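With Kino Explorer installed, returning a dataframe as the last expression of a cell is enough to render it as an interactive table; you can also wrap it explicitly (the data here is made up):

```elixir
df = Explorer.DataFrame.new(name: ["Ada", "Grace"], year: [1815, 1906])

# Render the dataframe as an interactive, sortable table.
Kino.DataTable.new(df)
```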

Data transformation using the new Data Transform Smart cell

In Livebook, a Smart cell is a UI-based cell that helps you to accomplish a specific task like creating a database connection, sending a message to a Slack channel, or running a Machine Learning model.

Livebook has various built-in Smart cells, and anyone can build a Smart cell to extend Livebook to their needs.

For this release, we built a new built-in Smart cell; the Data Transform one. Let’s see how we can use it for some data transformation.

What now?

Want to play with all of those new features?

First, ensure you’ve installed the latest version of Livebook.

Then, click the button below to run the notebook that José Valim built in the demo video:

Run in Livebook

If you have any comments or want to share what you’ve built using Livebook, you can tweet using the #LivebookLaunchWeek hashtag.

I hope you got as excited as we did with this new Livebook 0.9 release.

Besides that, we’re already working on much more exciting stuff that we’re looking forward to sharing. For example, the upcoming Livebook Teams.

If you use or want to use Livebook at work with your colleagues, you can fill in our form to help inform the Livebook Teams roadmap and get updates about it.

Thank you very much for being with us these last five days. This is all for this first Livebook Launch Week!

Building Livebook and sharing what it’s capable of is a joy for us. We hope you can have fun with it too. 😄

More Launch Week

  • Day 1: Deploy notebooks as apps & quality-of-life upgrades
  • Day 2: Distributed² Machine Learning notebooks with Elixir and Livebook
  • Day 3: Hubs and secret management
  • Day 4: Build and deploy a Whisper chat app to Hugging Face in 15 minutes
releases, launch week
2 years ago

Build and deploy a Whisper chat app to Hugging Face in 15 minutes - Launch Week 1 - Day 4

Welcome to the fourth day of Livebook Launch Week! 🎉

Today we will build a Whisper chat app with Livebook and deploy it to Hugging Face.

In this chat app, users can communicate only by sending audio messages, which are then automatically converted to text by the Whisper Machine Learning model:

You can access the demo over the next few days on José Valim's Livebook instance on Hugging Face.

This demo covers a lot of cool features from Livebook, Nx, and Bumblebee, like:

  • Deploy Livebook notebooks as apps

    On day 1 of our Launch Week, we saw how to deploy your notebook as an interactive web app. This time we’ll go further and deploy a Machine Learning-based app to Hugging Face.

  • Realtime and multiplayer apps

Livebook’s modern approach to code notebooks brings interactivity and collaboration. So, apps built with Livebook are also interactive and multiplayer by default! The Whisper chat app we’ll build is an excellent example of that.

  • Concurrent Machine Learning

    On day 2 of our Launch Week, we talked about the new Machine Learning models you can use with just a few clicks through Livebook’s Neural Network Smart Cell. In today’s video, you’ll see how to customize the code generated by the Smart cell and build an app on top of it.

    We’ll add concurrency and batching to improve the app’s scalability and performance, allowing multiple users to submit audio messages simultaneously and efficiently process them using the Whisper model.

What now?

Want to run the Whisper chat app by yourself?

First, ensure you’ve installed the latest version of Livebook.

Then, click the button below to run the notebook yourself:

Run the Whisper chat app in your Livebook

If you have any comments or want to share what you’ve built using Livebook, you can tweet using the #LivebookLaunchWeek hashtag.

Stay tuned for the final announcement of the Livebook Launch Week!

More Launch Week

  • Day 1: Deploy notebooks as apps & quality-of-life upgrades
  • Day 2: Distributed² Machine Learning notebooks with Elixir and Livebook
  • Day 3: Hubs and secret management
  • Day 5: Data wrangling in Elixir with Explorer, the power of Rust, the elegance of R