Wednesday, March 14, 2012

Web Video Standard to Revolutionize e-Learning

The first Web Standards Group Canberra meeting for 2012 was held in the DEEWR conference centre. This has a very useful 300-seat theatre, an area outside for breaks, and some smaller rooms. The centre has its own entrance, separate from the DEEWR office, making access easier.

VIDEO SYNCHRONIZATION

The first speaker was Christopher Giffard, on the problems with "video" in terms of interaction and accessibility. Assistive aids for people with a disability, such as closed captions, are seen as excessively time consuming to produce. He pointed out that HTML5 can be used well, or poorly, to support accessibility.

Christopher advocated the use of "timed data". This is interesting as I have been considering the issue of synchronous and asynchronous online learning. Many of the issues are the same: how do you allow for events which happen at a particular point in time, and others which do not?

There is an emerging standard for the use of <track>, which is similar to <source>. However, there have been few implementations so far. Track can be used in <video> or <audio> to provide subtitles and captions. The captions are stored externally, and the standard specifies an algorithm for synchronising them with the video or audio. Track can be manipulated with JavaScript. Track works by specifying a source file for the caption content, a language, a data type (such as text/vtt) and a kind (such as "captions").
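
As a minimal sketch of what that mark-up might look like (the file names here are invented for illustration), a caption track can be attached to a video along these lines:

  <video controls>
    <source src="lecture.webm" type="video/webm">
    <source src="lecture.mp4" type="video/mp4">
    <!-- External WebVTT captions, synchronised with the video by the browser -->
    <track src="lecture-captions.vtt" kind="captions" srclang="en"
           label="English" default>
  </video>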

Christopher built a tool, Captionator.js, to support track, using the Media Text Tracks JavaScript API.
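
As a rough sketch of the sort of thing that API allows (assuming a browser, or a polyfill such as Captionator.js, that exposes the standard text track objects, and a page with a <video id="lecture"> element as invented above), captions can be switched on and monitored from script:

  var video = document.getElementById("lecture");
  var track = video.textTracks[0];   // the first text track, e.g. the captions

  track.mode = "showing";            // make the captions visible

  // React whenever the set of currently active cues changes
  track.addEventListener("cuechange", function () {
    var cue = track.activeCues[0];
    if (cue) {
      console.log(cue.startTime + "s: " + cue.text);
    }
  });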

Christopher recommended WebVTT (Web Video Text Tracks), a new caption specification language. This is a flat text file format based on SRT. It has cues starting with start and stop time codes, followed by the caption text. There can be limited HTML-type mark-up in the text, for bold, italics and the like. There are also "Chapters" to tag segments of video (with sub-chapters).
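
For example, a small WebVTT file consists of a header line followed by cues, each with start and stop time codes and the caption text (the times and wording below are made up):

  WEBVTT

  00:00:01.000 --> 00:00:04.000
  Welcome to the <b>first</b> lecture.

  00:00:04.500 --> 00:00:09.000
  Today we look at <i>timed data</i> on the web.

A separate file of the same form, referenced from a track with kind="chapters", can carry the chapter titles.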

These features could be very useful for educational videos, for all students, not just those with accessibility requirements. Videos can be given a table of contents like a book.

In 200 I experimented with using synchronized audio with slides for education. See "Tips and Traps With Electronic Presentation Tools":
  1. Text,
  2. Slides (in OpenOffice format),
  3. Audio Slideshow (in Real Media Slideshow).

However, this material took considerable effort to prepare and depended on proprietary formats for playback.

It would be interesting to see if this could now be automated for recordings of presentations where slides are used. In multimedia-equipped classrooms, such as at ANU, it should be possible to automatically create chapters in the video, labeled with the title of each slide. If the slides in turn match a set of course notes, it should be possible to link the video to the sections in the notes. This could then be used to automatically create an "enhanced" eBook, with text, slides and video all cross-referenced.
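
As a sketch of how the first step might be automated (the slide times and titles below are invented, standing in for whatever the lecture recording system logs), a chapters file in WebVTT form could be generated from the slide-change times with a few lines of JavaScript:

  // Hypothetical slide-change log: start and end times in seconds, and titles
  var slides = [
    { start: 0,   end: 95,  title: "Introduction" },
    { start: 95,  end: 480, title: "Timed data and the track element" },
    { start: 480, end: 900, title: "WebVTT captions and chapters" }
  ];

  // Convert a whole number of seconds to a WebVTT time code
  function toTimestamp(seconds) {
    var h = Math.floor(seconds / 3600);
    var m = Math.floor((seconds % 3600) / 60);
    var s = seconds % 60;
    function pad(n) { return (n < 10 ? "0" : "") + n; }
    return pad(h) + ":" + pad(m) + ":" + pad(s) + ".000";
  }

  var vtt = "WEBVTT\n\n" + slides.map(function (slide) {
    return toTimestamp(slide.start) + " --> " + toTimestamp(slide.end) +
           "\n" + slide.title + "\n";
  }).join("\n");

  // The result can be saved and referenced with <track kind="chapters" ...>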

It should be noted that the technology is not limited to linear playback of video. Christopher pointed out that closed captions can not only be displayed, but can also automatically pause the video to allow time for them to be read.
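
A rough sketch of that behaviour, building on the earlier script (the three second pause is an arbitrary choice):

  // Pause whenever a new caption appears, then resume after a reading pause
  track.addEventListener("cuechange", function () {
    if (track.activeCues.length > 0) {
      video.pause();
      setTimeout(function () { video.play(); }, 3000);  // three seconds to read
    }
  });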

FORMAL LANGUAGE TO SUPPORT LEARNING SYNCHRONIZATION?

The use of a formal language to synchronize text and video suggests that perhaps something similar could be used for learning. At present an educational designer will describe what the student needs to know and have done at the beginning and end of a course, but not in detail in between. In effect there is only synchronisation at two points: the start and end of the course. In contrast, a computer programmer will specify precisely what a process has to have done by the time it reaches a point where it exchanges data with another. Specifying similar intermediate points in a course would get around the problem of students not knowing what they need to do or when they need to do it.

NEW WORLD OF VIDEO A FEW MONTHS AWAY

Christopher pointed out that support for video synchronization will be available in web browsers within a few months. Mozilla are "making progress" with Firefox. IE10 will support WebVTT, TTML and the JavaScript API.

Christopher demonstrated using a video of "Minister Garrett Introduces the School Funding Review". With this, the transcript introduces each speaker and allows the viewer to click on a line of the text to play the relevant clip. He also showed a media management system for keeping track of how accessible videos are.
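
A sketch of how such a clickable transcript might be wired up (the class name and data attribute here are invented, and "video" is the element from the earlier scripts): each line of the transcript carries the start time of its cue, and clicking the line seeks the video to that point.

  // Assumes transcript lines marked up as, for example:
  //   <p class="transcript-line" data-start="95.0">MINISTER GARRETT: ...</p>
  var lines = document.querySelectorAll(".transcript-line");
  for (var i = 0; i < lines.length; i++) {
    lines[i].addEventListener("click", function () {
      video.currentTime = parseFloat(this.getAttribute("data-start"));
      video.play();
    });
  }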
