
FOSDEM: My first open source event

In an age in which November is a good time to advertise Christmas shopping, March can easily be a good time to talk about FOSDEM. So that’s what I’m going to do. 

My FOSDEM experience

At the beginning of February, I went to FOSDEM, my first open source conference. In fact, it was even my first non-maths conference. In maths, conferences and workshops have to be very topic-specific, since math talks are very concrete, often presenting a proof that can only be followed by people working in the same field (or sometimes not even by them). FOSDEM was the total opposite of topic-specific. There were people, talks and stands from all kinds of open-source-related areas, so the range of topics covered in talks was immense. But not only the range of topics; the focus and the feeling of the talks were very diverse, too: sometimes the focus was on a certain programming language, sometimes on the tools used, sometimes on the goals that were achieved, sometimes on the directions in which certain developments have been going (or seem to be going now) and what that implies, sometimes on the moral questions involved, and so on. It was very interesting to see so many different open-source-related topics approached from so many different angles in only two days.

But the talks were not the only part of the conference that impacted me. The conference was also an opportunity to get to know people from my community: GNOME. It was very nice to meet in person several people I had only known through chats before, and to get to know people I hadn't known at all, both at FOSDEM itself and at a related beer event organized by GNOME. So the social part was very nice, but also productive. For example, at some point I got help from several people troubleshooting a styling problem I had, caused by an Adwaita decision I didn't know of.

To go to FOSDEM, I also got funded by my Outreachy grant. Thanks! 🙂 It was a great experience.


First milestone, GStreamer pipelines and range requests

This is the second blog post about my Outreachy internship at Fractal. The project I’m working on is the integration of a video player in Fractal.

The progress I’ve made

Like any communication app based on the Matrix protocol, Fractal is structured into rooms. When a user enters a room, they can see the messages that have been sent in that room. I’ll refer to those messages as the room history. During the first weeks of my internship, I’ve integrated a simple video player in the room history: when receiving a video file, the user can play, pause and stop the video right there.

The control box you can see in the picture, with the play/pause button, the time slider, the download button, etc., was already implemented for audio playback in Fractal. So basically, my task so far has been to get the video rendered above that box. It might seem simple, but it has been fun. I'll share just a couple of things that I've learned in the process.

GStreamer pipelines

A pipeline in GStreamer seems to be one of those concepts whose basic idea is pretty easy to grasp, but that can get as complicated as you want. As its name suggests, a pipeline is a system of connected pieces that manipulate the media in one way or another. Those connected pieces are called elements. The element the media comes from is called the source element, and the one(s) where it's rendered are called sink elements. Picture, for example, a file source feeding a decoder, which in turn feeds a video sink. Every element itself again has a source and/or one or more sinks (its pads), which connect the elements to each other. This phenomenon of finding the same concept at the level of elements and at the level of the pipeline is not uncommon. I'll give two more examples.
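To make this concrete, a pipeline like that can be spelled out with GStreamer's gst-launch-1.0 debugging tool. This is a generic sketch, not Fractal's actual pipeline, and video.mp4 is a placeholder file name:

```shell
# Read a local file, let decodebin pick suitable decoders, and render
# audio and video with automatically chosen sinks.
# "!" links the source pad of one element to the sink pad of the next.
gst-launch-1.0 filesrc location=video.mp4 ! decodebin name=dec \
  dec. ! queue ! audioconvert ! autoaudiosink \
  dec. ! queue ! videoconvert ! autovideosink
```

Each word between the "!" separators is an element; filesrc is the source element, and the two auto sinks are the sink elements.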

The first example is about buffering. On the one hand, when data is pushed through the pipeline, an element gets access to the media step by step by receiving pointers to small buffers in memory from the preceding element (buffers at the level of elements). Before receiving such a buffer, the element cannot start working on that piece of media. On the other hand, one can add a buffer element to the pipeline. That element is responsible for letting bigger chunks of data be stored (buffers at the level of the pipeline). Before that's done, the pipeline cannot start playback.

The second example concerns external and internal communication. The way a pipeline communicates internally is by sending events from one element to another. There are different kinds of events. Some of them are responsible for informing all pieces of the pipeline about an instruction that might come from outside the pipeline. An example is wanting to jump to a certain point of the video and play it from there; the corresponding event is called a seek event. For that to happen, the application can send a seek event to the pipeline (an event at the level of the pipeline). When that happens, the seek event is put on all sink elements of the pipeline and from there sent upstream, element by element (events at the level of elements), until it reaches the source element, which then pulls the requested data and sends it through the pipeline. But events are just one means of communication. Of course, there are others. To mention some more: messages the pipeline leaves on the pipeline bus for the application to listen to, state changes, and queries on elements or pads.
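For illustration, this is roughly what sending a seek event from the application looks like with the gstreamer Rust bindings. It's a hedged sketch: it assumes a pipeline that is already set up and playing, and the crate's API at the time of writing:

```rust
use gstreamer as gst;
use gst::prelude::*;

// Assumes `pipeline` is an already-playing gst::Pipeline.
fn jump_to_second_30(pipeline: &gst::Pipeline) {
    // FLUSH drops data already queued in the pipeline; KEY_UNIT snaps
    // the position to the nearest keyframe. Internally, the event travels
    // from the sinks upstream to the source element, which then pulls
    // data from the requested position.
    pipeline
        .seek_simple(
            gst::SeekFlags::FLUSH | gst::SeekFlags::KEY_UNIT,
            30 * gst::ClockTime::SECOND,
        )
        .expect("seek failed (e.g. the source cannot seek)");
}
```

seek_simple is the convenience form; a full gst::event::Seek can also be constructed by hand when you need more control, e.g. playback rate changes.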

So I find the concept of pipelines quite interesting. But to practically get media processed the way I want, I'd have to set up a whole pipeline accordingly. Creating an adequate pipeline and communicating with it and/or its elements can get complicated. Luckily for me, the audio player in Fractal is implemented using a concept called GstPlayer, so that's what I've also used for video. It's an abstraction over a pipeline: it sets up a simple pipeline for you when it's created, and it has a simple API to control certain functionalities of that pipeline. And to go beyond those functionalities, you can still extract the underlying pipeline from a GstPlayer and manipulate it manually.
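As a sketch of how little code GstPlayer needs (using the gstreamer-player Rust bindings; the URI is a placeholder, and the exact constructor and setter signatures depend on the crate version):

```rust
use gstreamer as gst;
use gstreamer_player as gst_player;

fn main() {
    gst::init().expect("failed to initialize GStreamer");

    // No custom video renderer or signal dispatcher: the defaults suffice here.
    let player = gst_player::Player::new(None, None);
    player.set_uri(Some("https://example.org/video.mp4")); // placeholder URI
    player.play();

    // The simple API covers common operations, e.g. seeking:
    player.seek(10 * gst::ClockTime::SECOND);

    // And for anything beyond that, the underlying pipeline is still available:
    let _pipeline = player.pipeline();
}
```

The point of the last line is exactly the escape hatch mentioned above: the abstraction doesn't lock you out of the pipeline it manages.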

Range requests

In the last section, I briefly mentioned seek events, i.e. events that request to play the video from a certain point. When a source element receives such an event, it tries to pull the requested piece of media. If it's communicating via HTTP, it tries that by sending a range request: a request with a header field called Range that specifies, in bytes, which part of the media is requested. In order to make sure that range requests are supported, the response headers are checked for an entry called Accept-Ranges. Only if that entry exists and its value is "bytes" (the other option would be "none") is support for range requests guaranteed. Synapse, the reference server implementation of Matrix, does not include the Accept-Ranges entry in its response headers. Therefore, seek requests for media files on such a server fail.
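The check described here is simple enough to sketch in a few lines of plain Rust. This is a simplified, hypothetical helper, not Fractal's or GStreamer's actual code; it assumes the header names have already been lowercased:

```rust
use std::collections::HashMap;

/// Returns true only when the response headers guarantee byte-range support,
/// i.e. an Accept-Ranges entry exists and its value is "bytes".
fn supports_range_requests(headers: &HashMap<String, String>) -> bool {
    matches!(headers.get("accept-ranges").map(|v| v.trim()), Some("bytes"))
}

fn main() {
    let ok = HashMap::from([("accept-ranges".to_string(), "bytes".to_string())]);
    assert!(supports_range_requests(&ok));

    // "none" explicitly declines range requests.
    let declined = HashMap::from([("accept-ranges".to_string(), "none".to_string())]);
    assert!(!supports_range_requests(&declined));

    // A missing entry (the Synapse case) gives no guarantee either.
    let missing: HashMap<String, String> = HashMap::new();
    assert!(!supports_range_requests(&missing));

    println!("all checks passed");
}
```

The "missing entry" branch is the one that bites with Synapse: absence of the header means seeking simply cannot be relied on.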

At some point, I thought I could solve that problem by activating progressive buffering in the pipeline and seeking only within the buffered data. But progressive buffering itself uses seeking, so when progressive buffering is activated, even plain playback fails. There might be other kinds of buffering that would do. But for now, our way around the problem is to download the video files and play them locally.

What this blog is about

In this blog, I'm going to post about the progress I make and the things that strike me during my Outreachy internship. Outreachy interns work on an open source project for three months under the guidance of one or two mentors from the community. In my case, the community is GNOME, my mentors are danigm and alatiera, and the project is Fractal. Fractal is a pretty cool GTK desktop application for real-time communication via the Matrix protocol. It's written in Rust.

The task

The goal of my internship is to implement a video player in Fractal. Right now, receiving a message with a video attachment is handled the same way as receiving a PDF attachment: the functionalities Fractal provides for it are "open" (with another application) and "save".

I’m going to integrate a video player into the Fractal application that allows the user to directly play the video inside the application. I’ll use GStreamer for that.

About the programming language: Rust

In order to apply for an Outreachy grant for a certain open source project, applicants first have to contribute to that project for about a month. When choosing a project, I didn't know any Rust. But the fact that Fractal is written in Rust was an important point in its favor, out of curiosity. I also expected to have a hard time at the beginning. Fortunately, that wasn't really the case. For those who haven't used Rust, let me give two of the reasons why:

If you just start coding, the compiler takes you by the hand, giving you advice like "You have done X. You can't do that because of Y. Did you maybe mean to do Z?". I took those pieces of advice as an opportunity to dig into the rules I had violated. That's definitely a possible way to get a first grip on Rust.

Besides that, there are pretty good resources for learning the basics, for example the Rust Book. Well, to be precise, there's at least one (sorry, I'm a mathematician, I can't help it; I've only started reading that one so far). It's not short, but it's fast to read and easy to understand, the only exception, in my opinion, being the sections on lifetimes. But lifetimes can still be understood by other means.

About the GUI library: GTK-rs

The GUI library Fractal uses is GTK-rs, a Rust wrapper around the C library GTK. One random interesting fact about GTK-rs that caught my attention at some point while reading the Fractal code was the following. Based on GObject, GTK uses inheritance structures. For example, the class Label is a subclass of Widget. In Rust, there are no classes. Label is a type, and so is Widget. So how does Label inherit from Widget in GTK-rs? Well, strictly speaking, it doesn't. But both types implement a trait called Cast (in Haskell jargon: they are of type class Cast). In fact, any type in GTK-rs coming from GObject implements Cast. The Cast trait allows a type to be converted to the types corresponding to its superclasses (or subclasses, when that makes sense) in the GObject tree. That's how you can convert a label to a widget, call a Widget method on it, and, if you want, convert it back.

Converting an instance of a type to a type corresponding to a super- or subclass (in the GObject logic) is called an upcast or a downcast, respectively. But how does GTK-rs capture the subclass/superclass logic if the concept of classes doesn't exist? The answer is: via the trait IsA<T> (here, T is a type parameter). If a type corresponds to a GObject subclass of another type P, then it implements the trait IsA<P> (here, P is a concrete type: the one corresponding to the superclass). For example, Label implements the trait IsA<Widget>. Of course, Widget has far more subclasses than just Label; all of them implement the trait IsA<Widget>.

Now, let me come back to the end of the penultimate paragraph and explain why the Cast trait allows a label to be upcast to a widget. By definition of the trait Cast, saying that Label implements Cast means that it has to have one method upcast<P> for every type P for which it implements IsA<P>. So it has to have a method upcast<Widget>. That method converts a label into a widget.

Downcasting methods are guaranteed very similarly. To start with, whenever a type T implements the trait IsA<P> for some type P (i.e. P corresponds to a GObject superclass of T), the type P implements the trait CanDowncast<T>. Therefore, Widget implements CanDowncast<Label>. Again by definition of the trait Cast, the fact that Widget implements Cast means that it has to have one method downcast<T> for every type T for which it implements CanDowncast<T>. So it has to have a method downcast<Label>. Notice that a widget can only be downcast to a label if it came from a label in the first place. That is captured by the fact that the return type of downcast<Label> on Widget is Result<Label, Widget>.
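To make the mechanics tangible, here is a toy re-creation of the pattern in plain Rust. It's my own simplified sketch, not GTK-rs's actual implementation: the real traits are generic over all GObject types and check runtime type information, not an enum:

```rust
// Toy model of the IsA / CanDowncast pattern; not GTK-rs's real code.

#[derive(Debug)]
enum WidgetKind {
    Label(String),
    Button(String),
}

// The "superclass" type: a widget remembers which concrete kind it wraps.
#[derive(Debug)]
struct Widget {
    kind: WidgetKind,
}

#[derive(Debug)]
struct Label {
    text: String,
}

// "Label corresponds to a GObject subclass of Widget": upcasting always succeeds.
trait IsA<P> {
    fn upcast(self) -> P;
}

impl IsA<Widget> for Label {
    fn upcast(self) -> Widget {
        Widget { kind: WidgetKind::Label(self.text) }
    }
}

// "Widget may be downcast to Label": it succeeds only if the widget really
// came from a label, hence the Result<Label, Widget> return type.
trait CanDowncast<T>: Sized {
    fn downcast(self) -> Result<T, Self>;
}

impl CanDowncast<Label> for Widget {
    fn downcast(self) -> Result<Label, Widget> {
        match self.kind {
            WidgetKind::Label(text) => Ok(Label { text }),
            other => Err(Widget { kind: other }),
        }
    }
}

fn main() {
    let label = Label { text: "hello".to_string() };
    let widget: Widget = label.upcast(); // upcasting always works
    let back: Label = widget.downcast().unwrap(); // it was a label, so this succeeds
    assert_eq!(back.text, "hello");

    let button = Widget { kind: WidgetKind::Button("ok".to_string()) };
    let failed: Result<Label, Widget> = button.downcast(); // not a label
    assert!(failed.is_err());
}
```

The enum plays the role that GObject's runtime type information plays in the real bindings; everything else mirrors the trait relationships described above.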

Of course, if it weren't for wrapping an object- and inheritance-oriented library, one might work directly in a different mindset in Rust. But it's interesting to see the tricks that have been used to realize the GObject mindset in Rust.