{"id":955,"date":"2026-04-21T10:27:46","date_gmt":"2026-04-21T03:27:46","guid":{"rendered":"https:\/\/liveapi.com\/blog\/what-is-cmaf\/"},"modified":"2026-04-22T08:39:09","modified_gmt":"2026-04-22T01:39:09","slug":"what-is-cmaf","status":"publish","type":"post","link":"https:\/\/liveapi.com\/blog\/what-is-cmaf\/","title":{"rendered":"What Is CMAF? Common Media Application Format Explained"},"content":{"rendered":"<span class=\"rt-reading-time\" style=\"display: block;\"><span class=\"rt-label rt-prefix\">Reading Time: <\/span> <span class=\"rt-time\">11<\/span> <span class=\"rt-label rt-postfix\">minutes<\/span><\/span><p>Before CMAF, streaming a single video to every device meant maintaining two separate sets of encoded files: one for Apple devices using <a href=\"https:\/\/liveapi.com\/blog\/what-is-hls-streaming\/\" target=\"_blank\" rel=\"noopener\">HLS streaming<\/a> with MPEG-TS containers, and another for Android, smart TVs, and browsers using MPEG-DASH with fragmented MP4. That duplication doubled your encoding bill, doubled your storage costs, and doubled the work for your CDN.<\/p>\n<p>CMAF eliminates that duplication. By standardizing on fragmented MP4 as a single container format that both HLS and MPEG-DASH can read, CMAF lets you encode your video once and deliver it everywhere. You also get a clear path to sub-3-second live latency through chunked transfer encoding \u2014 without switching to WebRTC.<\/p>\n<p>This guide covers how CMAF works technically, what it means for latency, DRM, and storage costs, and how to decide whether it belongs in your streaming stack.<\/p>\n<h2>What Is CMAF?<\/h2>\n<p>CMAF (Common Media Application Format) is an international standard for packaging, storing, and delivering segmented media over HTTP. 
Published by ISO\/IEC in <a href=\"https:\/\/www.iso.org\/standard\/71975.html\" target=\"_blank\" rel=\"nofollow noopener\">January 2018<\/a> as ISO\/IEC 23000-19, it defines a single media container format \u2014 fragmented MP4 (fMP4) \u2014 that works with both HLS and MPEG-DASH.<\/p>\n<p>Instead of maintaining separate <code>.ts<\/code> files for HLS and <code>.mp4<\/code> files for DASH, you package your content once into CMAF and serve it through either protocol. Both manifests point to the same media files.<\/p>\n<p>CMAF is <strong>not a streaming protocol<\/strong>. It doesn&#8217;t define how media is requested or delivered across the network. It defines how media is structured and packaged. Delivery still happens through HLS or MPEG-DASH manifests \u2014 CMAF just standardizes what those manifests point to.<\/p>\n<p>The standard was proposed in February 2016 by Apple and Microsoft, who brought the idea to the MPEG working group. Their goal was to cut the industry-wide cost and complexity of delivering video across a fragmented device landscape.<\/p>\n<h2>The Problem CMAF Solves<\/h2>\n<p>To understand why CMAF matters, you need to understand what streaming looked like before it.<\/p>\n<h3>The Two-Format Problem<\/h3>\n<p>HLS \u2014 Apple&#8217;s <a href=\"https:\/\/liveapi.com\/blog\/what-is-http-live-streaming\/\" target=\"_blank\" rel=\"noopener\">HTTP Live Streaming<\/a> protocol \u2014 originally required MPEG-TS (<code>.ts<\/code>) container files for video segments. 
The <a href=\"https:\/\/liveapi.com\/blog\/mpegts-vs-hls\/\" target=\"_blank\" rel=\"noopener\">MPEG-TS vs HLS comparison<\/a> shows the key technical differences, but the core issue is this: MPEG-DASH, the international standard backed by most non-Apple devices, uses fragmented MP4 containers.<\/p>\n<p>If you wanted to reach both Apple and non-Apple devices with the same content, you had to:<\/p>\n<ol>\n<li>Encode your video once<\/li>\n<li>Package it into MPEG-TS segments for HLS<\/li>\n<li>Package the same video into fMP4 segments for DASH<\/li>\n<li>Store both sets of files on your CDN<\/li>\n<li>Maintain two separate manifest files<\/li>\n<\/ol>\n<p>That doubles your storage footprint and CDN bandwidth for no technical reason \u2014 the video data is identical, just wrapped in different containers. According to Bunny.net, CMAF can reduce encoding, packaging, and storage demand by over 70% compared to maintaining both formats.<\/p>\n<h3>The Latency Ceiling<\/h3>\n<p>Traditional HLS was also limited in latency. The protocol buffered complete video segments \u2014 typically 6\u201310 seconds each \u2014 before delivering them to the player. That meant end-to-end latency of 15\u201330 seconds for standard HLS streams.<\/p>\n<p>CMAF introduced chunked encoding, which splits each segment into smaller chunks that the CDN and player can start processing, via HTTP chunked transfer encoding, before the full segment is complete. This brings latency down to 2\u20135 seconds with CMAF HLS, and under 2 seconds with Low-Latency HLS (LL-HLS).<\/p>\n<h2>How CMAF Works<\/h2>\n<p>CMAF structures media in a layered hierarchy:<\/p>\n<ul>\n<li><strong>CMAF Track:<\/strong> A single media stream (audio, video, or subtitles) encoded in fMP4 format. Each track contains a CMAF Header (decoder configuration data) and one or more CMAF Fragments.<\/li>\n<li><strong>CMAF Fragment:<\/strong> An independently decodable unit within a track. 
Typically 1\u20136 seconds of media, aligned across audio and video tracks.<\/li>\n<li><strong>CMAF Segment:<\/strong> A grouping of one or more fragments, referenced by the manifest.<\/li>\n<li><strong>CMAF Switching Set:<\/strong> A group of tracks with the same content encoded at different bitrates \u2014 the foundation for adaptive bitrate streaming.<\/li>\n<li><strong>CMAF Selection Set:<\/strong> A group of switching sets representing different content choices (audio language, subtitle track, video angle, etc.).<\/li>\n<\/ul>\n<p>The manifest file \u2014 either an HLS <a href=\"https:\/\/liveapi.com\/blog\/what-is-m3u8\/\" target=\"_blank\" rel=\"noopener\"><code>.m3u8<\/code> playlist<\/a> or a DASH <code>.mpd<\/code> file \u2014 references the CMAF segments and tells the player which URLs to request and when.<\/p>\n<h3>Supported Codecs<\/h3>\n<p>CMAF specifies which audio and video codecs are valid inside the fMP4 container:<\/p>\n<ul>\n<li><strong>Video:<\/strong> H.264 (AVC), H.265 (HEVC)<\/li>\n<li><strong>Audio:<\/strong> AAC-LC, HE-AAC, AC-3, EC-3, Opus<\/li>\n<li><strong>Subtitles:<\/strong> WebVTT, TTML\/IMSC1<\/li>\n<\/ul>\n<p>This covers the major codecs used in production streaming today. H.264 is the safe default for broad device compatibility; H.265\/HEVC offers better compression at the same quality but requires more recent devices.<\/p>\n<h3>How Chunked Transfer Enables Low Latency<\/h3>\n<p>Standard CMAF delivery still waits for a complete segment before publishing it to the CDN. 
Chunked transfer encoding changes that.<\/p>\n<p>With chunked CMAF:<\/p>\n<ol>\n<li>The encoder outputs video in small chunks (100\u2013500ms each) immediately after encoding each chunk<\/li>\n<li>The origin server starts transferring chunks to the CDN edge before the full segment is complete<\/li>\n<li>The CDN uses HTTP chunked transfer encoding to stream chunks to the player as they arrive<\/li>\n<li>The player buffers and starts playing chunks as they come in, without waiting for the full segment<\/li>\n<\/ol>\n<p>This pipeline cuts end-to-end latency from 15\u201330 seconds (standard HLS) to 2\u20135 seconds (CMAF HLS), and as low as 1\u20132 seconds with LL-HLS partial segments.<\/p>\n<h2>CMAF vs. HLS vs. MPEG-DASH<\/h2>\n<table>\n<tbody>\n<tr>\n<th>Feature<\/th>\n<th>Traditional HLS<\/th>\n<th>CMAF with HLS<\/th>\n<th>MPEG-DASH<\/th>\n<th>Low-Latency HLS<\/th>\n<\/tr>\n<tr>\n<td>Container<\/td>\n<td>MPEG-TS (.ts)<\/td>\n<td>fMP4 (.m4s)<\/td>\n<td>fMP4 (.m4s)<\/td>\n<td>fMP4 (.m4s)<\/td>\n<\/tr>\n<tr>\n<td>Protocol<\/td>\n<td>HLS<\/td>\n<td>HLS<\/td>\n<td>DASH<\/td>\n<td>HLS<\/td>\n<\/tr>\n<tr>\n<td>Standard latency<\/td>\n<td>15\u201330s<\/td>\n<td>5\u201310s<\/td>\n<td>3\u20138s<\/td>\n<td>2\u20135s<\/td>\n<\/tr>\n<tr>\n<td>Low-latency mode<\/td>\n<td>No<\/td>\n<td>Yes (chunked)<\/td>\n<td>Yes (chunked)<\/td>\n<td>Yes (partial segments)<\/td>\n<\/tr>\n<tr>\n<td>Apple device support<\/td>\n<td>Yes (all)<\/td>\n<td>iOS 12+<\/td>\n<td>Limited<\/td>\n<td>iOS 15+<\/td>\n<\/tr>\n<tr>\n<td>Android\/browser support<\/td>\n<td>Limited<\/td>\n<td>Yes<\/td>\n<td>Yes<\/td>\n<td>Yes<\/td>\n<\/tr>\n<tr>\n<td>DRM support<\/td>\n<td>FairPlay (AES-128)<\/td>\n<td>CENC \/ CBCS<\/td>\n<td>CENC \/ CBCS<\/td>\n<td>CENC \/ CBCS<\/td>\n<\/tr>\n<tr>\n<td>Single encode for all<\/td>\n<td>No<\/td>\n<td>Yes<\/td>\n<td>Yes<\/td>\n<td>Yes<\/td>\n<\/tr>\n<tr>\n<td>Industry standard<\/td>\n<td>De facto<\/td>\n<td>ISO\/IEC 23000-19<\/td>\n<td>ISO\/IEC 23009-1<\/td>\n<td>Apple LL-HLS 
spec<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>For a detailed breakdown, see <a href=\"https:\/\/liveapi.com\/blog\/hls-vs-dash\/\" target=\"_blank\" rel=\"noopener\">HLS vs DASH<\/a> \u2014 CMAF with HLS gives you the widest device coverage from a single encode, while MPEG-DASH remains common in markets where Apple devices are less dominant.<\/p>\n<h2>CMAF and Low-Latency Streaming<\/h2>\n<p>CMAF&#8217;s chunked transfer approach is the leading solution for broadcast-quality latency without the infrastructure complexity of WebRTC. Here&#8217;s where CMAF fits in the latency spectrum:<\/p>\n<table>\n<tbody>\n<tr>\n<th>Method<\/th>\n<th>Typical Latency<\/th>\n<th>Scale<\/th>\n<th>Best For<\/th>\n<\/tr>\n<tr>\n<td>WebRTC<\/td>\n<td>Under 500ms<\/td>\n<td>Hundreds\u2013thousands of viewers<\/td>\n<td>Real-time interaction, video calls<\/td>\n<\/tr>\n<tr>\n<td>CMAF with LL-HLS<\/td>\n<td>1\u20133 seconds<\/td>\n<td>Millions of viewers<\/td>\n<td>Live sports, news, live events<\/td>\n<\/tr>\n<tr>\n<td>CMAF with HLS<\/td>\n<td>2\u20135 seconds<\/td>\n<td>Millions of viewers<\/td>\n<td>General live streaming<\/td>\n<\/tr>\n<tr>\n<td>Standard HLS (MPEG-TS)<\/td>\n<td>15\u201330 seconds<\/td>\n<td>Millions of viewers<\/td>\n<td>VOD, delay-tolerant live<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>For <a href=\"https:\/\/liveapi.com\/blog\/ultra-low-latency-video-streaming\/\" target=\"_blank\" rel=\"noopener\">ultra-low-latency streaming<\/a> at CDN scale, CMAF is the right packaging choice when you need to serve more than a few thousand concurrent viewers. WebRTC&#8217;s peer-to-peer or SFU architecture doesn&#8217;t scale to CDN-level audiences.<\/p>\n<h2>CMAF and DRM<\/h2>\n<p>One of CMAF&#8217;s most practical benefits is its approach to content protection.<\/p>\n<p>Before CMAF, protecting content with multiple DRM systems required encoding and encrypting separate files \u2014 one for FairPlay (Apple), one for PlayReady (Microsoft), and one for Widevine (Google). 
That meant storing three encrypted versions of every piece of content.<\/p>\n<p>CMAF standardizes on <strong>CENC (Common Encryption)<\/strong>, specifically the <strong>CBCS encryption mode<\/strong> (AES-128 Cipher Block Chaining with pattern encryption). CBCS is supported by:<\/p>\n<ul>\n<li><strong>FairPlay<\/strong> \u2014 Apple&#8217;s DRM for iOS, macOS, tvOS<\/li>\n<li><strong>PlayReady<\/strong> \u2014 Microsoft&#8217;s DRM for Windows, Xbox, some smart TVs<\/li>\n<li><strong>Widevine<\/strong> \u2014 Google&#8217;s DRM for Android, Chrome, Chromecast<\/li>\n<\/ul>\n<p>A single CMAF-encrypted file carries initialization data for all three DRM systems. Your <a href=\"https:\/\/liveapi.com\/blog\/drm-for-video\/\" target=\"_blank\" rel=\"noopener\">DRM-protected video<\/a> is stored once, encrypted once, and served to any device \u2014 cutting DRM-related storage overhead by roughly 66% compared to system-specific encryption.<\/p>\n<p>One caveat: FairPlay originally used a different encryption scheme (SAMPLE-AES with AES-128) for older HLS streams. Make sure your packaging tools specifically support CBCS for CMAF, not just the legacy FairPlay format.<\/p>\n<h2>Advantages of CMAF<\/h2>\n<h3>Single Encode, Universal Delivery<\/h3>\n<p>Package your video once as CMAF and serve it to Apple devices via HLS manifests and to Android, smart TVs, and browsers via DASH manifests. No duplicate encoding runs, no duplicate storage. The storage savings exceed 70% compared to maintaining separate MPEG-TS and fMP4 segment libraries at scale.<\/p>\n<h3>Lower Latency Than Standard HLS<\/h3>\n<p>CMAF&#8217;s chunked transfer encoding brings live streaming latency below 5 seconds \u2014 well below the 15\u201330 seconds of traditional HLS. 
With LL-HLS, you can reach 1\u20133 seconds on CDN-based delivery.<\/p>\n<h3>Adaptive Bitrate Streaming<\/h3>\n<p>CMAF supports <a href=\"https:\/\/liveapi.com\/blog\/adaptive-bitrate-streaming\/\" target=\"_blank\" rel=\"noopener\">adaptive bitrate streaming<\/a> through its switching set structure. The player automatically switches between quality levels based on available bandwidth, keeping playback smooth even when network conditions fluctuate.<\/p>\n<h3>CDN-Compatible Scale<\/h3>\n<p>Because CMAF uses standard HTTP delivery, any CDN that supports chunked transfer encoding can distribute CMAF streams. You get the full scale of global CDN infrastructure rather than WebRTC&#8217;s point-to-point architecture. <a href=\"https:\/\/liveapi.com\/blog\/cdn-for-video-streaming\/\" target=\"_blank\" rel=\"noopener\">CDN delivery for video<\/a> applies directly to CMAF.<\/p>\n<h3>Standardized DRM<\/h3>\n<p>As covered above, CBCS encryption lets you protect content once and serve it with FairPlay, PlayReady, and Widevine from the same encrypted file. This cuts storage costs and removes per-DRM packaging complexity.<\/p>\n<h3>Simplified Encoding Pipeline<\/h3>\n<p>With one container, one encryption pass, and two lightweight manifests, your <a href=\"https:\/\/liveapi.com\/blog\/what-is-video-encoding\/\" target=\"_blank\" rel=\"noopener\">video encoding<\/a> and packaging pipeline has fewer moving parts. Fewer format conversions mean less processing overhead and fewer failure points during live streams.<\/p>\n<h3>Better CDN Cache Efficiency<\/h3>\n<p>Because CMAF segments are identical whether referenced by HLS or DASH manifests, CDN caches don&#8217;t need to store duplicate copies. One cached file serves requests from both HLS and DASH players, improving cache hit rates and reducing origin load.<\/p>\n<h2>Disadvantages of CMAF<\/h2>\n<h3>Limited Support on Older Devices<\/h3>\n<p>CMAF&#8217;s fMP4 container in HLS requires iOS 12 or later and tvOS 12 or later. 
Devices running older Apple OS versions default to MPEG-TS-based HLS. If your user base has significant numbers of older Apple devices, you may need to maintain a fallback MPEG-TS stream alongside CMAF.<\/p>\n<h3>Not All CDNs Support Chunked Transfer<\/h3>\n<p>CMAF&#8217;s low-latency mode depends on end-to-end support for HTTP chunked transfer encoding \u2014 from origin server to CDN edge to player. Not all CDN configurations support this by default. Check your CDN&#8217;s documentation before assuming LL-HLS or chunked CMAF delivery is available.<\/p>\n<h3>DRM Encryption Complexity<\/h3>\n<p>While CBCS is the converging standard, some legacy DRM implementations still use CTR mode. If you&#8217;re supporting older PlayReady or Widevine clients, test for compatibility. CMAF technically supports both CBCS and CTR modes, but CBCS is the practical standard for multi-DRM CMAF deployments.<\/p>\n<h3>Tooling Still Maturing<\/h3>\n<p>CMAF is newer than HLS, and not all encoding, packaging, and player tools support it with equal quality. FFmpeg, Bento4, and Shaka Packager handle CMAF packaging well, but test your full pipeline \u2014 encoder to CDN to player \u2014 before deploying to production.<\/p>\n<h3>Can&#8217;t Match WebRTC Latency<\/h3>\n<p>Even with chunked transfer, CMAF cannot reach the sub-500ms latency of WebRTC. If your application requires real-time interaction \u2014 video calls, live auction bidding, interactive gaming \u2014 CMAF is not the right choice. <a href=\"https:\/\/liveapi.com\/blog\/webrtc-live-streaming\/\" target=\"_blank\" rel=\"noopener\">WebRTC live streaming<\/a> handles those use cases.<\/p>\n<hr \/>\n<p>Choosing a packaging format is only one part of building a streaming application. You still need encoding infrastructure, CDN delivery, and a player that handles fMP4 segments. 
Whether you build that yourself or use a streaming API, the CMAF format sits in the middle of your pipeline.<\/p>\n<hr \/>\n<h2>How to Implement CMAF in Your Streaming Stack<\/h2>\n<p>Here&#8217;s a practical path to CMAF delivery.<\/p>\n<h3>Step 1: Choose an Encoder That Outputs fMP4<\/h3>\n<p>Your encoder or <a href=\"https:\/\/liveapi.com\/blog\/what-is-video-transcoding\/\" target=\"_blank\" rel=\"noopener\">video transcoder<\/a> needs to output fragmented MP4 segments, not MPEG-TS. Tools that support CMAF packaging:<\/p>\n<ul>\n<li><strong>FFmpeg<\/strong> \u2014 open-source, supports CMAF-compatible HLS output via <code>-f hls -hls_segment_type fmp4<\/code><\/li>\n<li><strong>Shaka Packager<\/strong> \u2014 Google&#8217;s open-source packager, supports CMAF segments for HLS and DASH<\/li>\n<li><strong>AWS Elemental MediaConvert \/ MediaPackage<\/strong> \u2014 managed service with CMAF packaging for both HLS and DASH output<\/li>\n<li><strong>Bitmovin Encoder<\/strong> \u2014 commercial encoder with CMAF and LL-HLS support<\/li>\n<\/ul>\n<p>For live streams, your encoder should support chunked output at the ingest level so you&#8217;re not adding latency in the packaging step.<\/p>\n<h3>Step 2: Configure Dual-Manifest CMAF Packaging<\/h3>\n<p>Your packager creates two manifests pointing to the same CMAF media files:<\/p>\n<ul>\n<li>An HLS <code>.m3u8<\/code> master playlist referencing <code>.m4s<\/code> segment files<\/li>\n<li>A DASH <code>.mpd<\/code> manifest referencing the same <code>.m4s<\/code> files<\/li>\n<\/ul>\n<p>Both manifests point to the same fMP4 segments \u2014 no duplicate files needed. For low-latency CMAF, configure 2\u20136 second segments with chunk sizes of 100\u2013500ms. Smaller chunks reduce latency but increase HTTP request overhead.<\/p>\n<h3>Step 3: Configure Your CDN for Chunked Transfer<\/h3>\n<p>For standard CMAF (5\u201310 second latency), most CDNs work without special configuration. 
For low-latency CMAF with chunked transfer, you need a CDN that supports:<\/p>\n<ul>\n<li><strong>HTTP chunked transfer encoding<\/strong> at the edge<\/li>\n<li><strong>Partial object caching<\/strong> \u2014 so the CDN can forward incomplete segments to clients<\/li>\n<\/ul>\n<p>CDN providers with documented LL-HLS and CMAF support include Akamai, Fastly, and Cloudflare.<\/p>\n<h3>Step 4: Update Your Player<\/h3>\n<p>Most modern players handle CMAF fMP4 natively:<\/p>\n<ul>\n<li><strong>hls.js<\/strong> \u2014 supports fMP4 segments in HLS (required for CMAF playback in browsers)<\/li>\n<li><strong>Shaka Player<\/strong> \u2014 full CMAF support for HLS and DASH in browsers<\/li>\n<li><strong>ExoPlayer<\/strong> (Android) \u2014 native CMAF \/ DASH support<\/li>\n<li><strong>AVPlayer<\/strong> (iOS\/tvOS) \u2014 native CMAF \/ HLS support from iOS 12+<\/li>\n<\/ul>\n<p>For LL-HLS playback, use hls.js 1.0+ or Shaka Player 3.0+. Older player versions fall back to standard HLS segment behavior.<\/p>\n<h3>Step 5: Add DRM If Required<\/h3>\n<p>If your content needs protection, configure CBCS encryption in your packager. Generate a single encrypted CMAF stream with DRM initialization data for FairPlay, PlayReady, and Widevine embedded in the manifest. Your DRM license server handles per-system key delivery at playback time.<\/p>\n<h3>Using a Streaming API<\/h3>\n<p>Building and maintaining this pipeline \u2014 encoder, packager, CDN, player \u2014 takes significant engineering time. 
A <a href=\"https:\/\/liveapi.com\/live-streaming-api\/\" target=\"_blank\" rel=\"noopener\">live streaming API<\/a> like LiveAPI handles the full pipeline: ingest via <a href=\"https:\/\/liveapi.com\/blog\/what-is-rtmp\/\" target=\"_blank\" rel=\"noopener\">RTMP<\/a> or <a href=\"https:\/\/liveapi.com\/blog\/srt-protocol\/\" target=\"_blank\" rel=\"noopener\">SRT<\/a>, automatic adaptive bitrate encoding, HLS output via Akamai, Cloudflare, and Fastly CDNs, and an embeddable player \u2014 so you&#8217;re shipping features instead of managing packaging infrastructure.<\/p>\n<h2>Is CMAF Right for Your Project?<\/h2>\n<p><strong>Use CMAF if:<\/strong><\/p>\n<ul>\n<li>You serve both Apple and non-Apple devices and want a single encode<\/li>\n<li>You need sub-5-second live latency without WebRTC infrastructure<\/li>\n<li>You&#8217;re implementing DRM and want one encrypted asset for FairPlay, PlayReady, and Widevine<\/li>\n<li>You&#8217;re optimizing storage and CDN costs at scale (the 70%+ savings add up fast above a few terabytes)<\/li>\n<li>You&#8217;re building an OTT platform or VOD service where wide device compatibility matters<\/li>\n<\/ul>\n<p><strong>Skip CMAF if:<\/strong><\/p>\n<ul>\n<li>Your audience is exclusively on Apple devices and you already use MPEG-TS HLS \u2014 the migration may not be worth it<\/li>\n<li>You need sub-500ms latency \u2014 WebRTC is the right choice for that<\/li>\n<li>Your audience has significant numbers of pre-iOS 12 devices and you can&#8217;t run a fallback stream<\/li>\n<li>Your CDN doesn&#8217;t support chunked transfer and low latency is a hard requirement<\/li>\n<\/ul>\n<h2>CMAF FAQ<\/h2>\n<h3>What does CMAF stand for?<\/h3>\n<p>CMAF stands for Common Media Application Format. It&#8217;s an ISO\/IEC standard (23000-19) that defines how to package segmented media in fragmented MP4 containers for HTTP-based streaming across HLS and MPEG-DASH delivery.<\/p>\n<h3>Is CMAF a protocol?<\/h3>\n<p>No. 
CMAF is a container and packaging format, not a streaming protocol. It works with existing delivery protocols \u2014 HLS and MPEG-DASH \u2014 by standardizing the structure of the media files those protocols reference. The protocol layer (how requests are made, how playlists are served) stays the same.<\/p>\n<h3>What&#8217;s the difference between CMAF and HLS?<\/h3>\n<p>The two operate at different layers. HLS is a delivery protocol that defines how clients request and play back media. <a href=\"https:\/\/liveapi.com\/blog\/cmaf-vs-hls\/\" target=\"_blank\" rel=\"noopener\">CMAF vs HLS<\/a> explains the full picture: CMAF defines the container format of the media segments that HLS serves. Traditional HLS uses MPEG-TS containers; CMAF HLS uses fMP4 containers. CMAF doesn&#8217;t replace HLS \u2014 it replaces the MPEG-TS container inside HLS.<\/p>\n<h3>What latency can CMAF achieve?<\/h3>\n<p>CMAF with HLS and chunked transfer encoding achieves 2\u20135 seconds of end-to-end latency; without chunking, expect 5\u201310 seconds. With Low-Latency HLS (LL-HLS), you can reach 1\u20132 seconds. For sub-second latency, WebRTC is the only scalable option \u2014 CMAF can&#8217;t match that ceiling regardless of chunk size.<\/p>\n<h3>Does CMAF support DRM?<\/h3>\n<p>Yes. CMAF uses CENC (Common Encryption) with CBCS mode, which is supported by FairPlay (Apple), PlayReady (Microsoft), and Widevine (Google). A single CMAF-encrypted file can serve all three DRM systems from the same bytes, reducing storage requirements significantly compared to separate per-DRM encryption.<\/p>\n<h3>What container format does CMAF use?<\/h3>\n<p>CMAF uses fragmented MP4 (fMP4), built on ISOBMFF (ISO Base Media File Format). This is the same container MPEG-DASH has always used, which is why CMAF content can serve both HLS and DASH from a single file without re-encoding.<\/p>\n<h3>What devices support CMAF?<\/h3>\n<p>CMAF with HLS runs on iOS 12+, tvOS 12+, macOS (Safari), and all major browsers when using hls.js or Shaka Player. 
CMAF with DASH runs on Android, Chromecast, most smart TVs, and modern browsers. Legacy Apple devices running pre-iOS 12 require a separate MPEG-TS fallback stream.<\/p>\n<h3>How is CMAF different from MPEG-DASH?<\/h3>\n<p>CMAF is a packaging format; MPEG-DASH is a streaming protocol. MPEG-DASH has always used fMP4 containers, making CMAF content structurally compatible with DASH. What CMAF adds is the standardized track structure, switching sets, selection sets, and common encryption scheme \u2014 building a consistent layer on top of raw fMP4 that both HLS and DASH manifests can reference.<\/p>\n<h3>Is CMAF used by major streaming services?<\/h3>\n<p>Yes. Netflix, Disney+, and most major CDN providers have adopted CMAF. Apple added fMP4 support to HLS in 2016, and ISO\/IEC formally published the CMAF standard in January 2018. Akamai and other CDN providers have documented best-practice guides for ultra-low-latency CMAF delivery.<\/p>\n<h3>What tools support CMAF packaging?<\/h3>\n<p>FFmpeg, Shaka Packager, Bento4, AWS Elemental MediaConvert, Bitmovin, and most commercial encoding platforms support CMAF packaging. Confirm your tool specifically supports fMP4 segments for HLS output (<code>hls_segment_type fmp4<\/code> in FFmpeg), not just DASH output.<\/p>\n<h2>Closing<\/h2>\n<p>CMAF solves a real infrastructure problem: the cost and complexity of maintaining separate media files for HLS and DASH delivery. 
By standardizing on fragmented MP4 as the container for both protocols, it cuts storage costs by over 70%, reduces packaging complexity, and opens the door to sub-5-second live streaming latency through chunked transfer encoding.<\/p>\n<p>If you&#8217;re building a streaming application that needs to reach a broad device base \u2014 iOS, Android, browsers, smart TVs \u2014 with a single encode, CMAF is the right packaging format for your stack.<\/p>\n<p>To build on top of CMAF-compatible delivery infrastructure without managing encoders, packagers, and CDN configurations yourself, <a href=\"https:\/\/liveapi.com\/\" target=\"_blank\" rel=\"noopener\">get started with LiveAPI<\/a>. LiveAPI handles RTMP and SRT ingest, HLS output across Akamai, Cloudflare, and Fastly CDNs, adaptive bitrate encoding, and an embeddable player \u2014 so you can focus on your product.<\/p>\n","protected":false},"excerpt":{"rendered":"<p><span class=\"rt-reading-time\" style=\"display: block;\"><span class=\"rt-label rt-prefix\">Reading Time: <\/span> <span class=\"rt-time\">11<\/span> <span class=\"rt-label rt-postfix\">minutes<\/span><\/span> Before CMAF, streaming a single video to every device meant maintaining two separate sets of encoded files: one for Apple devices using HLS streaming with MPEG-TS containers, and another for Android, smart TVs, and browsers using MPEG-DASH with fragmented MP4. That duplication doubled your encoding bill, doubled your storage costs, and doubled the work for [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":958,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_yoast_wpseo_title":"What Is CMAF? 
Common Media Application Format Explained %%sep%% %%sitename%%","_yoast_wpseo_metadesc":"Learn what CMAF is, how Common Media Application Format works, how it reduces latency and storage costs, and when to use it in your streaming stack.","inline_featured_image":false,"footnotes":""},"categories":[16],"tags":[],"class_list":["post-955","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-video-format"],"jetpack_featured_media_url":"https:\/\/liveapi.com\/blog\/wp-content\/uploads\/2026\/04\/What-Is-CMAF.jpg","yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v15.6.2 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<meta name=\"description\" content=\"Learn what CMAF is, how Common Media Application Format works, how it reduces latency and storage costs, and when to use it in your streaming stack.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/liveapi.com\/blog\/what-is-cmaf\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"What Is CMAF? 
Common Media Application Format Explained - LiveAPI Blog\" \/>\n<meta property=\"og:description\" content=\"Learn what CMAF is, how Common Media Application Format works, how it reduces latency and storage costs, and when to use it in your streaming stack.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/liveapi.com\/blog\/what-is-cmaf\/\" \/>\n<meta property=\"og:site_name\" content=\"LiveAPI Blog\" \/>\n<meta property=\"article:published_time\" content=\"2026-04-21T03:27:46+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-04-22T01:39:09+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/liveapi.com\/blog\/wp-content\/uploads\/2026\/04\/What-Is-CMAF.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1912\" \/>\n\t<meta property=\"og:image:height\" content=\"1076\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\">\n\t<meta name=\"twitter:data1\" content=\"15 minutes\">\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebSite\",\"@id\":\"https:\/\/liveapi.com\/blog\/#website\",\"url\":\"https:\/\/liveapi.com\/blog\/\",\"name\":\"LiveAPI Blog\",\"description\":\"Live Video Streaming API Blog\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":\"https:\/\/liveapi.com\/blog\/?s={search_term_string}\",\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/liveapi.com\/blog\/what-is-cmaf\/#primaryimage\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/liveapi.com\/blog\/wp-content\/uploads\/2026\/04\/What-Is-CMAF.jpg\",\"width\":1912,\"height\":1076,\"caption\":\"What Is CMAF\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/liveapi.com\/blog\/what-is-cmaf\/#webpage\",\"url\":\"https:\/\/liveapi.com\/blog\/what-is-cmaf\/\",\"name\":\"What Is CMAF? 
Common Media Application Format Explained - LiveAPI Blog\",\"isPartOf\":{\"@id\":\"https:\/\/liveapi.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/liveapi.com\/blog\/what-is-cmaf\/#primaryimage\"},\"datePublished\":\"2026-04-21T03:27:46+00:00\",\"dateModified\":\"2026-04-22T01:39:09+00:00\",\"author\":{\"@id\":\"https:\/\/liveapi.com\/blog\/#\/schema\/person\/98f2ee8b3a0bd93351c0d9e8ce490e4a\"},\"description\":\"Learn what CMAF is, how Common Media Application Format works, how it reduces latency and storage costs, and when to use it in your streaming stack.\",\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/liveapi.com\/blog\/what-is-cmaf\/\"]}]},{\"@type\":\"Person\",\"@id\":\"https:\/\/liveapi.com\/blog\/#\/schema\/person\/98f2ee8b3a0bd93351c0d9e8ce490e4a\",\"name\":\"govz\",\"image\":{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/liveapi.com\/blog\/#personlogo\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/ab5cbe0543c0a44dc944c720159323bd001fc39a8ba5b1f137cd22e7578e84c9?s=96&d=mm&r=g\",\"caption\":\"govz\"},\"sameAs\":[\"https:\/\/liveapi.com\/blog\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","_links":{"self":[{"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/posts\/955","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/comments?post=955"}],"version-history":[{"count":2,"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/posts\/955\/revisions"}],"predecessor-version":[{"id":959,"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/posts\/955\/revisions\/959"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/media\/958"}],"wp:attachment":[{"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/media?parent=955"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/categories?post=955"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/liveapi.com\/blog\/wp-json\/wp\/v2\/tags?post=955"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}