Streaming media

A foundational understanding of streaming media is essential for working effectively with Arc XP’s Video Center. This guide provides a high-level overview of the key concepts, terms, and technologies that power modern video delivery on the web. Whether you’re just getting started or looking to deepen your knowledge, this primer helps you speak the language of video with confidence.

Understanding the challenges of videos on the web

Video is one of the most complex forms of digital content to deliver. Raw, high-resolution video footage can reach bitrates of 1 Gbps or more. But most viewers aren’t accessing content in studio conditions; they’re watching on phones, laptops, or TVs with varying screen sizes and internet speeds.
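
To put that figure in perspective, here is a rough back-of-the-envelope calculation; the frame size, bit depth, and frame rate below are illustrative assumptions, not Video Center settings:

    // Approximate bitrate of uncompressed 1080p video at 30 frames per second,
    // assuming 24 bits per pixel. All numbers here are illustrative.
    const width = 1920;
    const height = 1080;
    const bitsPerPixel = 24;
    const framesPerSecond = 30;

    const bitsPerSecond = width * height * bitsPerPixel * framesPerSecond;
    console.log(`${(bitsPerSecond / 1e9).toFixed(2)} Gbps`); // ≈ 1.49 Gbps before compression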

To support all these devices and connections, we must create multiple versions (or renditions) of each video. Each rendition is optimized for a different set of viewing conditions, balancing quality with performance. Technologies like codecs, container formats, and adaptive streaming make this possible.

Codecs: Compressing and decompressing video

A codec (short for compressor/decompressor) is a software algorithm that reduces the size of video files for easier delivery and playback. Arc XP’s Video Center uses the AVC (H.264) codec, the most broadly supported option across modern devices.

Other codecs like HEVC, VP9, and AV1 offer advantages in specific contexts, but AVC remains our default due to its excellent compatibility and manageable licensing costs.
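
If you are curious whether a given browser can decode AVC in an MP4 container, a quick in-browser check looks something like the sketch below; the codec string is one common Baseline-profile example, not a Video Center requirement:

    // Minimal browser-side check for AVC (H.264) video plus AAC audio in MP4.
    // "avc1.42E01E" is a common Baseline-profile codec string; real renditions
    // may declare different profiles and levels.
    const avcSupported =
      typeof MediaSource !== "undefined" &&
      MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');

    console.log(avcSupported ? "AVC/AAC in MP4: supported" : "AVC/AAC in MP4: not supported");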

Containers: Packaging video for delivery

While codecs compress the video data, container formats package that data with the necessary information for playback. The most common format used in Video Center is MP4. You may also encounter formats like .aac or .mp3 for audio-only content.

When we change container formats without re-encoding the video itself, we’re performing a process known as transmuxing (or just muxing).
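
As a rough illustration, transmuxing can be done with a tool like ffmpeg by copying the existing streams into a new container. The sketch below runs it from Node.js; it assumes ffmpeg is installed and on the PATH, and the file names are placeholders:

    // Transmux an MPEG-2 Transport Stream into an MP4 container without re-encoding.
    // "-c copy" copies the audio and video streams as-is; depending on the audio,
    // ffmpeg may also need a bitstream filter such as aac_adtstoasc.
    import { execFileSync } from "node:child_process";

    execFileSync("ffmpeg", [
      "-i", "input.ts", // source container (placeholder file name)
      "-c", "copy",     // no re-encoding: streams are repackaged, not recompressed
      "output.mp4",     // destination container (placeholder file name)
    ]);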

Adaptive Bitrate (ABR) streaming

Adaptive Bitrate (ABR) streaming is what makes platforms like YouTube, Netflix, and Arc XP Video Center so flexible. Instead of delivering a single version of a video, we create a ladder of renditions at various resolutions and bitrates.
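
To make the idea of a ladder concrete, here is what a simplified HLS master playlist with three renditions might look like; the bitrates, resolutions, and paths are illustrative, not Video Center’s actual ladder:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="avc1.4d401e,mp4a.40.2"
    360p/playlist.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720,CODECS="avc1.4d401f,mp4a.40.2"
    720p/playlist.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
    1080p/playlist.m3u8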

As a user begins watching, the player assesses their device capabilities and network conditions, then selects the most appropriate stream. If the connection changes mid-playback, the player can seamlessly switch to a better-matched rendition, helping minimize buffering and interruptions.

Video Center uses HLS (HTTP Live Streaming) for ABR delivery, a widely supported standard. Other options, like CMAF and DASH, are also gaining popularity.
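
For reference, a minimal HLS playback sketch in a browser might use the open-source hls.js library; this is not Arc XP’s PoWa player, and the manifest URL is a placeholder:

    // Attach an HLS stream to a <video> element and log ABR rendition switches.
    import Hls from "hls.js";

    const video = document.querySelector<HTMLVideoElement>("video")!;
    const manifestUrl = "https://example.com/video/master.m3u8"; // placeholder

    if (Hls.isSupported()) {
      const hls = new Hls();
      hls.loadSource(manifestUrl);
      hls.attachMedia(video);
      // Fires each time the player moves up or down the rendition ladder.
      hls.on(Hls.Events.LEVEL_SWITCHED, (_event, data) => {
        console.log("Switched to rendition (level)", data.level);
      });
    } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
      // Safari and iOS can play HLS natively.
      video.src = manifestUrl;
    }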

MPEG-2 TS versus fMP4

To enable ABR, videos are delivered in segments, usually just a few seconds long. This allows the player to quickly switch between renditions as needed.

Currently, Video Center uses MPEG-2 Transport Stream (.ts) segments. A newer format, Fragmented MP4 (fMP4), reduces segment overhead and lowers bandwidth usage by 5-15%. While fMP4 offers clear advantages, transitioning to it requires re-encoding petabytes of content, which is a costly endeavor.
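
For illustration, a media playlist for one rendition lists those short segments explicitly; an fMP4 variant would instead reference an initialization segment via an #EXT-X-MAP tag and fragmented .m4s (or .mp4) segment files. The durations and file names below are illustrative:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:6
    #EXTINF:6.000,
    segment000.ts
    #EXTINF:6.000,
    segment001.ts
    #EXTINF:6.000,
    segment002.ts
    #EXT-X-ENDLIST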

Progressive download

Before ABR became standard, many platforms used progressive download. This method downloads the full video file, allowing playback to begin while the download is in progress.

While simpler, progressive delivery doesn’t adapt to changing network conditions. As a result, it’s largely being phased out of Video Center in favor of ABR.
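
In its simplest form, progressive delivery just points a player at a single file; a sketch (with a placeholder URL) looks like this:

    // Progressive download: one MP4, no rendition ladder, no mid-playback switching.
    const video = document.querySelector<HTMLVideoElement>("video")!;
    video.src = "https://example.com/video/full-quality.mp4"; // placeholder URL
    video.controls = true;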

Note

Don’t confuse Progressive MP4 (a delivery method) with Progressive Scan (a video rendering method). The “p” in 720p refers to Progressive Scan, a distinction that was more relevant in the era of interlaced video.

Buffering and playback performance

If a high-resolution video is delivered to a device on a slow connection, playback pauses as the player waits for more data. This is buffering.

To avoid this, ABR dynamically adjusts the video quality in real-time. While this might mean sacrificing some visual clarity, most users prefer continuous playback over occasional buffering interruptions.
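
One way to observe this from the player side is to watch how much media is buffered ahead of the playhead; the sketch below uses standard HTML5 video APIs, and the logging is purely illustrative:

    // Report how many seconds of media are buffered ahead of the current position.
    const video = document.querySelector<HTMLVideoElement>("video")!;

    function bufferedAheadSeconds(): number {
      const { buffered, currentTime } = video;
      for (let i = 0; i < buffered.length; i++) {
        if (currentTime >= buffered.start(i) && currentTime <= buffered.end(i)) {
          return buffered.end(i) - currentTime;
        }
      }
      return 0; // playhead is outside every buffered range, i.e. the player is stalled
    }

    // "waiting" fires when playback pauses because the player ran out of data.
    video.addEventListener("waiting", () => {
      console.log(`Buffering: ${bufferedAheadSeconds().toFixed(1)}s of media ahead`);
    });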

The video ecosystem: Core components

A successful streaming experience depends on several integrated technologies:

  • CDN (Content Delivery Network) - delivers video content from servers close to the viewer for faster loading. Arc XP primarily uses Akamai, with some legacy use of AWS CloudFront.

  • Player - the user-facing component that decodes and displays video. Arc XP’s player, PoWa, offers a high Quality of Experience (QoE) score, comparable to platforms like Peacock and Paramount+.

  • Advertising platform - handles ad delivery and user targeting. Video Center integrates with Google Ad Manager to support pre-roll ad insertion.

  • Video platform - the backbone that supports video upload, encoding, permissions, metadata, syndication, and more. Video Center is Arc XP’s comprehensive video platform.

Licensing and patents

Even when technologies are based on open standards, they may still be subject to licensing obligations. Codec usage, in particular, can trigger royalty obligations from patent pools like HEVC Advance, MPEG LA, and Velos Media. We recommend that you consult with a lawyer on your licensing and other intellectual property obligations.