DevEx

The built-from-scratch live-streaming platform for all developers. Share your coding sessions live to the world for everyone to enjoy!

Table of contents

  • Getting started
  • Tech stack
  • Features
  • Sketches
  • Data Lifecycles
  • Socket events
  • What I've learned
  • Special thanks

Getting started

This project uses workspaces and prefers Yarn Classic over npm as its package manager. For the best developer experience, use Docker so you don't have to install FFmpeg locally or set up environment variables yourself.

This project also uses Firebase. After you've set up the project, create a Firebase app, add a service account for Firestore, and put the downloaded JSON file inside the api folder as firebase.json. Docker will do the rest. If you've never worked with Firebase before, get started here.
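
For reference, here's a minimal TypeScript sketch of how the api package can pick up that service account with firebase-admin. The file path, the export and the initialisation details are assumptions, not necessarily how this repository does it.

// Minimal sketch (assumption): load the service-account JSON dropped into the api folder.
import admin from 'firebase-admin'
import serviceAccount from './firebase.json'

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount as admin.ServiceAccount),
})

// Firestore handle the CRUD routes can use.
export const db = admin.firestore()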

Install the project

$ git clone https://github.com/theonejonahgold/real-time-web-2021 rtw
$ cd rtw
$ yarn || npm install

Project setup

└─ real-time-web-2021
   ├─ api - Node.js CRUD API for the 'web' and 'stream' packages.
   ├─ docs - Documentation folder.
   ├─ stream - Node-Media-Server instance that ingests RTMP streams and transcodes them to HLS.
   └─ web - The web app you see at https://devex.jonahgold.dev.

Available commands

$ yarn dev:up # Runs the 'stream' and 'api' packages in Docker, and 'web' outside of it due to a bug with SvelteKit.
$ yarn dev:down # Shuts down the Docker containers for 'stream' and 'api'.
$ yarn prod # Runs all packages inside Docker (for production).
$ yarn format # Runs Prettier to format all project files.
$ yarn lint # Runs Prettier to check the formatting of all files.

These commands are available from the root of the project, but every package has its own build, dev and start scripts as well.

Tech stack

API package

Stream package

Web package

Features

Must haves

  • Register.
  • Log in.
  • Live stream video from broadcasting software like OBS.
    • Authenticate your stream with a stream key.
  • View streams.
  • Chat on other people's streams.

Should haves

  • Follow other profiles.
  • Set stream title.
  • Set programming language.
  • Discover live channels by programming language.
  • Have a stream thumbnail.
  • Have a nice onboarding experience.

Could haves

  • Profile search.
  • Show when someone you follow is live or offline.
  • Show something when there is nothing to show on pages.
  • Chat emotes.
  • Follow notifications in chat.

Would like to haves

  • Dynamically cap resolution based on incoming stream.
  • Chat moderators.
  • Persistent chat message storage.
  • Video on-demand.

Sketches

Sketch of the discovery page

The Discovery page lists the channels that are currently live. The always-present sidebar shows the channels you follow, so you can start watching them from anywhere.

Sketch of the languages page

The Languages page lists all programming languages, so you don't have to sift through streams in languages you don't want to watch.

Sketch of the stream page

The Stream page shows a live stream, with the corresponding chat next to it.

Data Lifecycles

For live streaming

Data lifecycle of live streaming flow

For website

Data lifecycle of web data flow

Socket events

Socket events are spread over four namespaces (a client-side connection sketch follows this list):

  1. Following namespace: handles updates for the channels you follow. This makes the "Following" sidebar real-time.
  2. Watch namespace: handles updates for the channel you're watching. Viewer count, live status, stream title and programming language are all updated in real time through this namespace.
  3. Chat namespace: handles the chat data flows, making the chat experience real-time and easy to use.
  4. Me namespace: when you update something about your own profile, this namespace sends the changes back to the web app immediately after they're saved, so you always have the most up-to-date version of your own profile available locally.
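
To make the namespaces concrete, here's a hedged TypeScript sketch of how the web app could connect to them with socket.io-client. The API URL, namespace paths and payload shapes are assumptions based on the event descriptions below, not the project's exact code.

import { io } from 'socket.io-client'

// Hypothetical API address; the real one would come from environment config.
const API_URL = 'http://localhost:8080'

// One socket per namespace.
const following = io(`${API_URL}/following`)
const watch = io(`${API_URL}/watch`)
const chat = io(`${API_URL}/chat`)
const me = io(`${API_URL}/me`)

// Keep the "Following" sidebar up to date.
following.on('update', channel => { /* refresh the sidebar entry */ })

// Tell the API which channel we're watching, then listen for its updates.
watch.emit('join', 'some-channel')
watch.on('update', payload => { /* viewers, live, streamTitle, language */ })

// Join a chat room and receive messages.
chat.emit('join', { room: 'some-channel', username: 'viewer' })
chat.on('message', message => { /* append to the chat */ })

// Receive our own profile updates as soon as they're saved.
me.on('update', user => { /* replace the local profile copy */ })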

Following namespace

API: update is sent when a followed user object is updated.
Web: update is sent when the logged-in user's following list has been updated.

Watch namespace

Web: join is sent when a socket connection has been established. The event carries the channel that the user is watching.
API: update is sent when the user being watched has been updated in the database in some way. The viewer count, live status, streamTitle and programming language are the four keys that are sent, along with the username.
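
As a rough sketch of the API side of this namespace (assuming socket.io on the server; the room-per-channel approach, payload shape and helper name are assumptions):

import { Server } from 'socket.io'

const io = new Server(8080)
const watch = io.of('/watch')

watch.on('connection', socket => {
  // Put each viewer in a room named after the channel they're watching.
  socket.on('join', (channel: string) => socket.join(channel))
})

// Hypothetical helper: called whenever a watched user's database document changes.
export function notifyWatchers(
  username: string,
  update: { viewers: number; live: boolean; streamTitle: string; language: string }
) {
  watch.to(username).emit('update', { username, ...update })
}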

Chat namespace

Web: join is sent when the user joins a chat. The chat room, along with the username, is sent as the payload for the API to handle.
Web: message is sent when the user sends a chat message. A chat message can be sent without being logged in (though not via the website without tinkering), but it will not be broadcast to the other users in the chat, because the sender is authenticated before the message is passed on.
API: message carries either a user or a server type. For user messages the sender is authenticated first, on every single message, to stop spammers. Server messages are for announcements and welcome messages, and are currently only sent when a connection is established.
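
Continuing the server-side sketch above, the chat flow could look roughly like this; the payload fields and the verifyUser helper are hypothetical:

// Hypothetical auth helper living elsewhere in the api package.
declare function verifyUser(username: string, token: string): Promise<boolean>

const chat = io.of('/chat')

chat.on('connection', socket => {
  socket.on('join', ({ room, username }) => {
    socket.join(room)
    // Server-type welcome message, only sent when the connection is established.
    socket.emit('message', { type: 'server', content: `Welcome to ${room}'s chat, ${username}!` })
  })

  socket.on('message', async ({ room, username, token, content }) => {
    // Authenticate on every message, so unauthenticated senders never reach other viewers.
    const allowed = await verifyUser(username, token)
    if (!allowed) return
    chat.to(room).emit('message', { type: 'user', username, content })
  })
})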

Me namespace

API: update is fired when the user object in the database has been updated in some way, sending the new version to the client.

What I've learned

For me, the biggest hurdle with this project was figuring out how to make it possible to use software like OBS to transmit your live stream to DevEx. What I came across was Node-Media-Server, a package tailor-made to ingest, transcode and then publish streams for the world to see. There is almost no documentation on how to use this package, so I had to look online for implementations of it that I could actually use.

Luckily, a GitHub user by the name of johndavedecano had a repository called node-rtmp-hls, which is exactly what I needed. What I learned is that software like OBS transmits an RTMP stream, a relic from the Flash Player days that was used to transmit real-time data between Flash players and a server. This was used in the early days of live streaming, before HTML5 video was popularised, as it was the only way to reliably watch a live stream (I think, this is pure speculation). Now that Flash is gone, Apple's HLS has taken its place as the counterpart to RTMP, and it works over regular HTTP. This means I could take the configuration John made for his repository and use it myself! Although I had no clue what everything did until I did some more learning. And some learning I did.

I learned about RTMP, HLS, how the Node-Media-Server package works and how the configuration John made turns one RTMP stream into multiple HLS streams. The config file has two very important keys, relay and trans, standing for "relay" and "transcode" respectively. The relay key tells the stream app (rtmp://devex.jonahgold.dev/stream) to push all incoming video to the hls_360p, hls_480p, hls_720p and hls_1080p apps. These apps are all transcoders, meaning they mold incoming video into a certain output. This is done with FFmpeg, the most versatile tool for turning basically any form of video into any other form of video or audio you want. FFmpeg takes the RTMP stream coming in from stream and outputs it with the settings provided by each "app". This is in turn written to disk, but that's not where the party ends.
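
To illustrate the idea, here's a trimmed-down TypeScript sketch of that relay/trans setup. This is not the project's actual config: the ports, flags and the per-resolution scaling options (which the real transcoder apps also set) are assumptions.

import NodeMediaServer from 'node-media-server'

const config = {
  rtmp: { port: 1935, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
  http: { port: 8000, mediaroot: './media', allow_origin: '*' },
  relay: {
    ffmpeg: '/usr/bin/ffmpeg',
    tasks: [
      // Push everything published to the 'stream' app on to the transcoder apps.
      { app: 'stream', mode: 'push', edge: 'rtmp://127.0.0.1/hls_360p' },
      { app: 'stream', mode: 'push', edge: 'rtmp://127.0.0.1/hls_480p' },
    ],
  },
  trans: {
    ffmpeg: '/usr/bin/ffmpeg',
    tasks: [
      // Each app becomes an HLS output written to disk under mediaroot.
      { app: 'hls_360p', hls: true, hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]' },
      { app: 'hls_480p', hls: true, hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]' },
    ],
  },
}

new NodeMediaServer(config).run()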

Now we have four separate streams (or two, as I've commented out the 720p and 1080p streams). These streams need to come together in some way, so the video player in your browser knows about all the possible resolutions to choose from. This is done with what's called an "HLS playlist". Every resolution has one of its own, and the code in playlist.ts creates a master playlist covering all of them, making it possible to digest every stream resolution from a single file.
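
Here's a hedged sketch of what that amounts to: a master playlist with one entry per resolution, each pointing at that resolution's own playlist. The bandwidth values and file paths are placeholders, not what playlist.ts actually writes.

import { writeFile } from 'fs/promises'

// Placeholder variant list; the real resolutions and bitrates live in the stream package.
const variants = [
  { app: 'hls_360p', bandwidth: 800_000, resolution: '640x360' },
  { app: 'hls_480p', bandwidth: 1_400_000, resolution: '854x480' },
]

export async function writeMasterPlaylist(username: string) {
  const lines = ['#EXTM3U', '#EXT-X-VERSION:3']
  for (const { app, bandwidth, resolution } of variants) {
    lines.push(`#EXT-X-STREAM-INF:BANDWIDTH=${bandwidth},RESOLUTION=${resolution}`)
    lines.push(`../../${app}/${username}/index.m3u8`) // path layout is an assumption
  }
  await writeFile(`media/live/${username}/index.m3u8`, lines.join('\n') + '\n')
}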

When I found out how this all worked, my mind started racing with possible features. That's why I've made a thumbnail feature, creating thumbnails for all live streams every minute. I didn't create the command myself, as I have no idea which options FFmpeg has, but I can explain what it does after seeing it in action.
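
As an illustration only (this is not the StackOverflow command the project uses, and the paths are assumptions), grabbing one frame per minute from a live HLS playlist with FFmpeg looks roughly like this:

import { execFile } from 'child_process'

function captureThumbnail(username: string) {
  execFile('ffmpeg', [
    '-y',                                           // overwrite the previous thumbnail
    '-i', `media/hls_360p/${username}/index.m3u8`,  // read from the live HLS stream
    '-vframes', '1',                                // grab exactly one frame
    '-q:v', '2',                                    // decent JPEG quality
    `media/thumbnails/${username}.jpg`,
  ], err => { if (err) console.error(err) })
}

// Roughly once a minute for every live stream, as described above.
setInterval(() => captureThumbnail('some-live-channel'), 60_000)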

Small fun fact: apparently only Safari supports HLS natively, but every other browser can support it with what's called Media Source Extensions. The HLS.js library I used for the web app builds on these, adding HLS support to many more browsers (and improving the experience in Safari with quality-level selection and more video events).
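
A minimal sketch of how HLS.js is typically wired up in a player component, with Safari's native HLS as the fallback:

import Hls from 'hls.js'

function attachStream(video: HTMLVideoElement, src: string) {
  if (Hls.isSupported()) {
    // Media Source Extensions are available: let HLS.js handle the playlist.
    const hls = new Hls()
    hls.loadSource(src)
    hls.attachMedia(video)
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari plays HLS natively without MSE.
    video.src = src
  }
}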

In short

So, here's what I learned the most:

  • What the RTMP and HLS protocols are.
  • How the node-media-server package turns 1 RTMP stream into 4 HLS streams in multiple resolutions.
  • How to use this knowledge to come up with features myself.
  • That a lot of the live streaming platforms still rely on old protocols from when they first started.
  • That HLS is only natively supported in Safari, but is very easy to add in other browsers.

Special thanks

  • Justus Sturkerboom and Lukas van Driel from Real-Time Web for allowing me to experiment with this amazing concept.
  • Lukas van Driel especially for being a good coach, providing me with very helpful feedback each week.
  • Student-assistant Robin Frugte
  • Victor, Guus, Roeland, Vincent & Evelyn for providing me with very good usability feedback!
  • Evelyn again, for her patience and mental support.
  • Squid Squad A for being the best squad.
  • John for providing such a helpful template to work off of.
  • Chen for creating Node-Media-Server.
  • Destek from StackOverflow for providing the FFmpeg command to generate thumbnails from an HLS stream.

All other sources I used are documentation, which I've linked in the Tech stack section.
