There are tons of complicated terms in the world of live video. In this live streaming glossary, we define key terms with both a technical definition and a plain-English translation, for live streaming professionals of all experience levels.
Understanding these terms is critical when configuring and maintaining a live stream. Unfortunately, when you Google these terms, you frequently find an overly technical definition that leaves you with more questions than answers.
Don’t worry. We’ve got you covered.
By definition, streaming is a technology used to deliver content to computers and mobile devices over the Internet. Streaming transmits data – usually audio and video, but increasingly other kinds as well – as a continuous flow, which allows the recipients to begin to watch or listen almost immediately.
In layman’s terms, streaming is what you’re doing when you broadcast your live video to the Internet. It’s also what is happening when someone watches your video.
Transcoding is the direct digital-to-digital conversion of one encoding to another.
In the live streaming world, you may need a transcoding service to deliver your content to particular platforms. Some platforms, like Periscope, require a lower-quality video stream than others, like YouTube. Transcoding takes your video stream and essentially creates another version of it to match the requirements of different platforms and devices.
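The idea of matching a stream to each platform's limits can be sketched in a few lines. The platform names and limits below are made-up placeholders, not the real requirements of any service — always check each platform's own documentation:

```python
# Illustrative sketch: clamp a source stream to a platform's limits.
# The platforms and numbers here are hypothetical, not real specs.
PLATFORM_TARGETS = {
    "platform_a": {"max_height": 720, "max_bitrate_kbps": 2500},
    "platform_b": {"max_height": 1080, "max_bitrate_kbps": 6000},
}

def transcode_settings(source_height, source_bitrate_kbps, platform):
    """Pick output settings no larger than the platform allows."""
    target = PLATFORM_TARGETS[platform]
    return {
        "height": min(source_height, target["max_height"]),
        "bitrate_kbps": min(source_bitrate_kbps, target["max_bitrate_kbps"]),
    }
```

A real transcoder (ffmpeg, a cloud service) does far more than this, of course — it decodes and re-encodes the actual video — but the decision of *what* to produce per platform looks roughly like this lookup.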
Transrating is the process of changing a video file from one bitrate to another.
Typically part of any transcoding system or service, transrating specifically provides a version of your live stream at various bitrates or speeds to meet the requirements or limitations of a variety of endpoints.
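A transrating system typically produces a "ladder" of renditions at decreasing bitrates. The fractions below are illustrative, not a recommended ladder — real ladders are tuned per codec and resolution:

```python
def bitrate_ladder(source_kbps, steps=(1.0, 0.5, 0.25)):
    """Derive lower-bitrate renditions from a source stream.

    Each step is a fraction of the source bitrate; the default
    fractions here are illustrative placeholders only.
    """
    return [round(source_kbps * s) for s in steps]
```

A 4,000 kbps source would yield renditions of 4,000, 2,000, and 1,000 kbps, so viewers on slow connections can still get a watchable stream.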
Video encoding is the process of converting digital video files from one format to another.
Encoding is the biggest part of live streaming! It’s what takes your feed and turns it into a file that can be streamed all across the Internet. Encoding typically happens via streaming software such as OBS, or with a hardware encoder that you physically attach to your video camera.
Video ingestion is the process of capturing, transferring, and storing video files in an organized manner for simple identification and location in the future.
Typically you’ll hear this term in reference to “ingest servers”. These are the servers that take in the feed you’re streaming and either store it or deliver it to your audience live.
In telecommunications and computing, bitrate is the number of bits that are conveyed or processed per unit of time.
In plain English, it’s the speed at which your data, or video, flies across the Internet. If your bitrate is too low, it may result in a poor-quality video that lags or looks distorted.
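Since bitrate is just bits per unit of time, you can compute the average bitrate of a recording from its file size and duration:

```python
def average_bitrate_kbps(size_bytes, duration_seconds):
    """Average bitrate = total bits / time, in kilobits per second."""
    bits = size_bytes * 8          # 8 bits per byte
    return bits / duration_seconds / 1000
```

For example, a 1 MB (1,000,000-byte) clip that plays for 8 seconds averages 1,000 kbps.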
A content delivery network, or content distribution network (CDN), is a geographically distributed network of proxy servers and their data centers. The goal is to distribute service spatially relative to end-users to provide high availability and high performance.
Basically, a CDN makes copies of your data and spreads them to data centers around the globe. When someone accesses it, or in this case watches a video, the CDN directs that person to the copy physically closest to them, providing the quickest, most reliable connection.
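The "route to the closest copy" idea can be sketched as a toy nearest-edge lookup. The edge locations and coordinates are illustrative; real CDNs route via DNS and anycast rather than an explicit distance calculation like this:

```python
import math

# Hypothetical edge locations as (latitude, longitude) pairs.
EDGES = {
    "new_york": (40.7, -74.0),
    "london": (51.5, -0.1),
    "tokyo": (35.7, 139.7),
}

def nearest_edge(viewer_lat, viewer_lon):
    """Return the edge whose coordinates are closest to the viewer."""
    def dist(edge):
        lat, lon = EDGES[edge]
        return math.hypot(lat - viewer_lat, lon - viewer_lon)
    return min(EDGES, key=dist)
```

A viewer in Paris (roughly 48.8, 2.3) would be routed to the London edge rather than New York or Tokyo.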
Latency is the time that elapses from when the source sends a packet until the destination receives it.
This is like your “time” when running track in high school. Latency describes how quickly your data got to where it was going.
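Measuring it is simple arithmetic, assuming you have a send timestamp and a receive timestamp on the same clock:

```python
def latency_ms(sent_at, received_at):
    """One-way latency: time from when the source sends a packet
    until the destination receives it (timestamps in seconds)."""
    return (received_at - sent_at) * 1000
```

A packet sent at t = 10.000 s and received at t = 10.045 s experienced 45 ms of latency.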
Packet loss occurs when one or more packets of data are traveling across a computer network and fail to reach their destination.
In other words, packet loss is something you will find in your encoder’s streaming statistics, and sometimes on the ingest server or platform you are streaming to. It basically tells you how much of your video failed to make it to the ingest server, which is an indication of network problems. Ideally, a high-quality video stream should have little to no packet loss.
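The statistic itself is usually reported as a percentage of packets sent that never arrived:

```python
def packet_loss_percent(packets_sent, packets_received):
    """Share of packets that never reached the destination."""
    lost = packets_sent - packets_received
    return 100.0 * lost / packets_sent
```

So if your encoder sent 10,000 packets and the ingest server received 9,950, you saw 0.5% packet loss — small, but worth watching if it climbs.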
Screen resolution is the number of pixels in x and y dimensions for monitors, television screens, mobile devices or any other type of display device. The screen resolution is generally measured as width x height in pixels. For example, resolution 1920 x 1080 means the screen is 1,920 pixels in width and 1,080 pixels in height.
Essentially, resolution is the size of your screen, and when you live stream you configure a particular resolution to broadcast.
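From a width × height pair you can derive the two numbers people usually care about, the total pixel count and the aspect ratio:

```python
from math import gcd

def resolution_info(width, height):
    """Total pixels and reduced aspect ratio for a resolution."""
    g = gcd(width, height)
    return {
        "pixels": width * height,
        "aspect": f"{width // g}:{height // g}",
    }
```

For 1920 × 1080, that works out to about 2.07 million pixels and the familiar 16:9 aspect ratio.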
Real-Time Messaging Protocol (RTMP) was initially a proprietary protocol developed by Macromedia for streaming audio, video, and data over the Internet, between a Flash player and a server. Macromedia is now owned by Adobe, which has released an incomplete version of the specification of the protocol for public use.
RTMP is one of the most common streaming protocols out there. It’s what you use to stream your video to Facebook, YouTube, Twitch or any of the popular services. It has two parts that you get from the service and enter into your encoder, an RTMP address, and a stream key.
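Under the hood, most encoders simply join those two parts into one publish URL. The address and key below are placeholders, not real credentials:

```python
def rtmp_publish_url(rtmp_address, stream_key):
    """Join an RTMP server address and stream key the way
    most encoders do: address, slash, key."""
    return rtmp_address.rstrip("/") + "/" + stream_key

# e.g. rtmp_publish_url("rtmp://live.example.com/app", "abcd-1234")
#      -> "rtmp://live.example.com/app/abcd-1234"
```

This is why some tools ask for one combined URL while others ask for the address and key separately — they end up with the same string either way.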
A stream key is a code that identifies the source of an audio or video stream, so that a website or application can display the correct stream.
Your stream key is a password you need in order to stream your video to a particular RTMP ingest server.
A media player is software that “plays” audio, video, or animation files on a computer.
An endpoint is a remote computing device that communicates back and forth with a network to which it is connected.
Examples of endpoints include desktops, laptops, smartphones, tablets, servers, etc.
OTT stands for “over-the-top,” a term used for the delivery of film and TV content via the Internet, without requiring users to subscribe to a traditional cable or satellite pay-TV service.
Video on demand (VOD) is a programming system which allows users to select and watch or listen to such video or audio content as movies and TV shows whenever they choose, rather than at a scheduled broadcast time.
That pretty much says it all.
A keyframe, or intra-frame, is a complete frame of the image in a video. Subsequent frames, called delta frames, contain only the information that has changed. Keyframes appear multiple times within a stream, depending on how it was created or how it’s being streamed.
When configuring your live streaming encoder, oftentimes, you’ll be asked for a keyframe interval. Various services require different keyframe settings. Be sure to check your streaming platform to find the right setting and ensure the best quality stream.
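One common point of confusion is that some encoders express the keyframe interval in seconds while others use a frame count. Converting between the two is just the interval multiplied by the frame rate:

```python
def keyframe_interval_frames(interval_seconds, fps):
    """Convert a keyframe interval in seconds to a frame count.

    Example: a 2-second interval at 30 fps means a keyframe
    every 60 frames.
    """
    return round(interval_seconds * fps)
```

So if a platform asks for a 2-second keyframe interval and your encoder wants frames, enter 60 at 30 fps (or 120 at 60 fps).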