This Week in HTML 5 – Episode 10

by Mark Pilgrim, Google in Weekly Review

Welcome back to "This Week in HTML 5," where I'll try to summarize the major activity in the ongoing standards process in the WHATWG and W3C HTML Working Group.

The big news this week is offline caching. This has been in HTML 5 for a while, but this week Ian Hickson caught up with his email and integrated all outstanding feedback. He summarizes the changes (a short manifest sketch illustrating a few of them follows the list):

  • Made the online whitelist be prefix-based instead of exact match. [r2337]
  • Removed opportunistic caching, leaving only the fallback behavior part. [r2338]
  • Made fallback URLs be prefix-based instead of only path-prefix based (we no longer ignore the query component). [r2343]
  • Made application caches scoped to their browsing context, and allowed iframes to start new scopes. By default the contents of an iframe are part of the appcache of the parent, but if you declare a manifest, you get your own cache. [r2344]
  • Made fallback pages have to be same-origin (security fix). [r2342]
  • Made the whole model treat redirects as errors to be more resilient in the face of captive portals when offline (it's unclear what else would actually be useful and safe behavior anyway). [r2339]
  • Fixed a bunch of race conditions by redefining how application caches are created in the first place. [r2346]
  • Made 404 and 410 responses for application caches blow away the application cache. [r2348]
  • Made checking and downloading events fire on ApplicationCache objects that join an update process midway. [r2353]
  • Made the update algorithm check the manifest at the start and at the end and fail if the manifest changed in any way. [r2350]
  • Made errors on master and dynamic entries in the cache get handled in a non-fatal manner (and made 404 and 410 remove the entry). [r2348]
  • Changed the API from .length and .item() to .items and .hasItem(). [r2352]
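
To make the prefix-matching changes concrete, here is a minimal sketch of a cache manifest that uses an online whitelist and a fallback namespace. The file names are invented for illustration; the section headers are the ones the spec defines.

    CACHE MANIFEST
    # Explicitly cached entries
    /index.html
    /style.css

    FALLBACK:
    # Prefix match: any URL starting with /articles/ is served by
    # /offline-article.html when the user is offline
    /articles/ /offline-article.html

    NETWORK:
    # Online whitelist, now also prefix-based (r2337): anything under
    # /api/ always bypasses the application cache
    /api/

A page opts into this cache by declaring manifest="..." on its root html element; per r2344, an iframe that declares its own manifest gets its own application cache instead of sharing its parent's.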

And now, a short digression into video formats...

You may think of video files as "AVI files" or "MP4 files". In reality, "AVI" and "MP4" are just container formats. Just like a ZIP file can contain any sort of file within it, video container formats only define how to store things within them, not what kinds of data are stored. (It's a little more complicated than that, because container formats do limit what codecs you can store in them, but never mind.) A video file usually contains multiple tracks -- a video track (without audio), one or more audio tracks (without video), one or more subtitle/caption tracks, and so forth. Tracks are usually inter-related; an audio track contains markers within it to help synchronize the audio with the video, and a subtitle track contains time codes marking when each phrase should be displayed. Individual tracks can have metadata, such as the aspect ratio of a video track, or the language of an audio or subtitle track. Containers can also have metadata, such as the title of the video itself, cover art for the video, episode numbers (for television shows), and so on.
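
If it helps to picture the relationships, here is a rough sketch, in JavaScript object-literal form, of what a single video file actually holds. This is not an API; the property names are invented purely to illustrate the container/track/codec distinction.

    // Illustrative only: a mental model of a container, not a real API
    var videoFile = {
      container: "MP4",
      metadata: { title: "Episode 10", coverArt: "cover.jpg", episodeNumber: 10 },
      tracks: [
        { kind: "video",    codec: "H.264",      aspectRatio: "16:9" },
        { kind: "audio",    codec: "AAC",        language: "en" },
        { kind: "subtitle", codec: "timed text", language: "fr" }
      ]
    };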

Individual video tracks are encoded with a certain video codec, which is the algorithm by which the video was authored and compressed. Modern video codecs include H.264, DivX, and VC-1, but there are many, many others. Audio tracks are also encoded with a specific codec, such as MP3, AAC, or Ogg Vorbis. Common video containers are ASF, MP4, and AVI. Thus, saying that you have sent someone an "MP4 file" is not specific enough for the recipient to determine whether they can play it. The recipient needs to know not only the container format (such as MP4 or AVI), but also the video codec (such as H.264 or Ogg Theora) and the audio codec (such as MP3 or Ogg Vorbis). Furthermore, video codecs (and some audio codecs) are broad standards with multiple profiles, so saying that you have sent someone an "MP4 file with H.264 video and AAC audio" is still not specific enough. An iPhone can play MP4 files with "baseline profile" H.264 video and "low complexity" AAC audio. (These are well-defined technical terms, not layman's terms.) Desktop Macs can play MP4 files with "main profile" H.264 video and "main profile" AAC audio. Adobe Flash can play MP4 files with "high profile" H.264 video and "HE" AAC audio. Of course, it's a little more complicated than that.
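
For the curious, RFC 4281 defines a codecs parameter that can be attached to a container MIME type to pin down exactly this kind of information. The strings below are illustrative examples of that syntax, not an authoritative mapping to the devices mentioned above:

    video/mp4; codecs="avc1.42E01E, mp4a.40.2"   (H.264 Baseline profile + low-complexity AAC)
    video/mp4; codecs="avc1.4D401E, mp4a.40.2"   (H.264 Main profile + low-complexity AAC)
    video/mp4; codecs="avc1.64001E, mp4a.40.5"   (H.264 High profile + HE-AAC)
    video/ogg; codecs="theora, vorbis"           (Ogg Theora video + Ogg Vorbis audio)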

Thus...

r2332 adds a navigator.canPlayType() method, intended to let scripts query whether the client can play a certain type of video. There are two major problems with this. First, MIME types are not specific enough, since they only describe the video container: learning that the client "can play" MP4 files is useless without knowing which video codecs it supports inside that container, not to mention which profiles of those codecs. Second, unless the browser itself ships with support for specific video and audio codecs (as Firefox 3.1 will do with Ogg Theora and Ogg Vorbis), it will need to rely on whatever multimedia library the underlying operating system provides. Windows has DirectShow and Mac OS X has QuickTime, but neither of these libraries can actually tell you whether a given codec is supported. The best you can do is try to play the video and notice whether it fails. [WHATWG thread]
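
To make the shape of the problem concrete, here is a sketch of such a capability check in script. It assumes the query accepts a MIME type carrying an RFC 4281 codecs parameter and lives on the media element (which is where later drafts and shipping browsers ended up putting it); the error listener reflects the "try to play it and notice if it fails" fallback described above.

    // Sketch only: assumes canPlayType() lives on the media element and
    // accepts an RFC 4281 codecs parameter. It answers "probably",
    // "maybe", or an empty string -- never a firm yes.
    var video = document.createElement("video");
    var answer = video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');

    if (answer === "probably" || answer === "maybe") {
      video.src = "episode10.mp4";   // illustrative file name
      // Even then, the only reliable test is to try playing the file
      // and watch for failure.
      video.addEventListener("error", function () {
        // Fall back to another format, a plugin, or a plain download link.
      }, false);
      document.body.appendChild(video);
      video.play();
    }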

Other interesting changes and discussions this week:

Around the web:

Tune in next week for another exciting episode of "This Week in HTML 5."

5 Responses to “This Week in HTML 5 – Episode 10”

  1. Hi, can you outline how programmatic control of audio and subtitle streams should happen? I realize that it’s the underlying player’s responsibility to handle it, but I have not seen anything yet (and I apologize if I’m overlooking something obvious) that gives information on alternative streams and the possibility of making them active.

  2. Acmeguy, the current DOM APIs for video and audio do not provide any programmatic control over subtitles. The APIs do, however, provide the HTMLMediaElement.muted and HTMLMediaElement.volume properties for adjusting the volume.
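
     A minimal sketch of those two properties in use (the element lookup here is purely illustrative):

         // Illustrative: adjust the audio of an existing video element
         var media = document.getElementsByTagName("video")[0];
         media.volume = 0.5;   // ranges from 0.0 (silent) to 1.0 (loudest)
         media.muted = true;   // mutes playback without discarding the volume setting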