OII Standards and Specifications List

OII Guide to Interactive Media

This OII Guide explains the role of those standards and specifications currently being used, or under development, to allow system developers to interchange information about how users can interact with information presentation systems.

For the purposes of this OII Guide interactive media is defined as "media for which the order in which information is presented is under the control of the user". By definition interactive media relies on navigable links between different pieces of information, which may or may not be stored in separate files held on one or more servers.

Most modern standards and specifications for interactive media are based on The Dexter hypertext reference model. This model identifies three layers within an interactive information resource:

  • a run-time layer which defines the user interface used on the client machine
  • a storage layer which defines the file/database structure from which information is to be provided at the server
  • a within-component layer which defines the different types of information object and the way in which these are processed by an application.

Within the Dexter model the relationship between the run-time layer and the storage layer is controlled by the use of presentation specifications, while the relationship between the within-component layer and the storage layer is controlled by the use of anchors. It is the anchors that provide the key to interactive media, as they provide the "glue" that connects networks of content objects.

There are two principal categories of interactive media:

  1. Manually-controlled interactive media
  2. Time-controlled interactive media.

While most current hypertext delivery systems are based on manually-controlled interactive media, the growing need to manage complex sets of multimedia information will lead to an increasing use of time-controlled interactive media over the next few years.

Manually-controlled Interactive Media

If you are reading this document on a computer screen you are already familiar with the basic concepts of manual control of media display. A computer provides users with line/page up/down keys or mouse buttons that can be used to control the speed at which readers move from one piece of information to the next within a predefined sequence. Such facilities do not, however, in themselves result in interactive media.

Interactive media requires the presence of hot spots or buttons that users can select to move from one piece of information to another. Internet users will be familiar with the concept of links which become active when the screen cursor is placed within their boundary. By clicking a mouse button while a link is active the user is taken from the point at which the link has been anchored within the calling resource to a point in the referenced resource containing information that is not sequentially related to the data previously displayed.

There are two main techniques for creating and storing information relating to link anchors:

  1. using internally-stored links that form an integral part of the calling resource
  2. using externally-stored links that are stored in a separately managed file/database which can be associated with the currently displayed file.

Internally-stored links

The most commonly used form of internally-stored link is that provided by the HyperText Markup Language (HTML), which forms the basis of most of the interactive media currently available on the Internet.

The most common form of link used in HTML files is that provided by the anchor (A) element. At its most basic this anchor consists of an element containing some text (or an image) which the user clicks on to activate the link. The resource to which the user is to be taken is identified by a "hypertext reference" (href) attribute on the anchor element's start-tag, whose value is an Internet Uniform Resource Locator (URL). For example, the links from the head of this page to the home page for the OII initiative have the form:

<a href="oii-home.html">OII Home Page</a>  

Other forms of user interaction provided in HTML include MAP elements that can be used to make parts of images active hot spots, and the various data capture, menu and radio button option selection functions and buttons that make up an HTML form.
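For example, a clickable image map can be coded as follows (the image and file names used here are purely illustrative):

<img src="diagram.gif" usemap="#navmap">  
<map name="navmap">  
<area shape="rect" coords="0,0,100,50" href="intro.html">  
<area shape="circle" coords="150,75,25" href="details.html">  
</map>  

Each AREA element defines a hot spot within the image and identifies the resource to be displayed when that hot spot is selected.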

Note: A detailed explanation of the structure and use of HTML forms is beyond the scope of this OII Guide. Details of the techniques used to create HTML forms can be found in Chapter 14 of SGML and HTML Explained.

The main points to note about HTML links are:

  • the link must be predefined in the calling file
  • the link can only take you to a single additional resource
  • the link can only take you to points in the referenced resource that have previously been assigned named anchors by its creator, or to the start of the referenced resource.
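For example, if the creator of the referenced resource has defined a named anchor of the form (the names used here are illustrative):

<a name="dexter">The Dexter Model</a>  

a calling document can link directly to that point by appending the anchor name to the URL in its hypertext reference:

<a href="models.html#dexter">See the Dexter model</a>  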

Externally-stored links

Whilst embedded HTML-style links are adequate for author-controlled linking between documents, they are not suitable for many of the purposes for which electronic documents are used. For example, document reviews and critiques often need to be prepared by people who are not allowed to change the source document. In most such cases the person commenting on the data will not know the "name" assigned by its creator to the object being commented on, and is likely to want to comment on a specific part of the named object rather than its whole contents.

To make it possible for people to create links to documents they do not have write access to, it is necessary to be able to:

  1. Store the link information in a file other than the referenced document.
  2. Reference objects that have not had names previously assigned to them.
  3. Reference specific parts of data storage objects and/or logical components.
  4. Create links to more than one part of a document, or to parts of different documents.
  5. Control the order in which different pieces of linked data are viewed.

Many of the standards currently being developed for the management of interactive media are based on techniques defined in ISO/IEC 10744, which defines the Hypermedia/Time-based Structuring Language (HyTime). HyTime defines a set of separately applicable SGML architectural forms that can be used to:

  • locate component parts of documents based on their name, their position in the document, their properties or their contents (the location module)
  • create links between identified locations (the hyperlinks module)
  • control the sequence in which objects are displayed (the scheduling module)
  • control the way in which objects are presented (the rendition module).

HyTime hyperlinks are elements that reference sets of locators and provide a set of traversal rules that control the sequence in which these locations can be viewed.
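As a simple illustration, an SGML element mapped to the HyTime contextual link (clink) architectural form might take the following shape (the element name and identifier used here are hypothetical):

<comment HyTime="clink" linkend="para42">This passage needs revision</comment>  

where para42 is the unique identifier of a location address element, defined using the location module, that identifies the object being commented on. Because the location address is held separately from the data it points to, the referenced document itself need not be modified.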

The HyTime model forms the basis for the XML Linking Language (XLink) proposal for linking together XML and HTML documents using either internally or externally stored links.
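Under the XLink proposal as initially drafted, a simple link can be created by adding linking attributes to any XML element, along the following lines (the element and file names used here are illustrative):

<review xml:link="simple" href="report.xml#section2">See the full report</review>  

Because the linking information is carried by attributes rather than by a fixed element type, link declarations can be placed in documents other than the one being referenced.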

HyTime also forms the basis for the Interchange Standard for Modifiable Interactive Documents (ISMID) being developed by ISO/IEC JTC1/SC34, the creators of the SGML and HyTime standards. ISMID is designed to standardize the interchange of Interactive Electronic Technical Manuals (IETMs) such as those developed according to the US Department of Defense's MIL-M-GCSFUI specification for Manuals, Interactive Electronic Technical: General content, style, format and user-interaction requirements.

ISMID defines SGML architectural forms that can be used to create, modify or delete containers, content objects and control objects. It also provides a standardized set of flow control objects (if, while and switch). ISMID actions are controlled through stimuli that trigger responses defined within the ISMID document.

Time-controlled Interactive Media

The simplest form of time control is that provided by the use of the HTML META element to control the length of time an HTML document is displayed before moving on to the next page of the display. This is defined in terms of a refresh time, expressed in seconds, and the file to be displayed as a result of the refresh, e.g.:

<META http-equiv="refresh" content="30;URL=page2.htm">  

The display of multimedia data often requires the synchronization of data stored as separate resources. To create interactive multimedia documents you need mechanisms for resynchronizing data sets when a user interaction is responded to.

There are a number of commercial specifications for the creation of synchronized multimedia sets. Perhaps the most popular of these are Apple's QuickTime and Microsoft's Video for Windows specifications. On the Internet RealNetworks' RealPlayer is the most commonly used tool for the delivery of synchronized multimedia.

The most commonly used set of international standards for the delivery of synchronized audiovisual material is the set of standards developed by ISO's Moving Picture Experts Group (MPEG-1 and MPEG-2). Part 6 of the ISO/IEC 13818 specification for MPEG-2 defines Digital Storage Media Command and Control Extensions (DSM-CC). These extensions allow users to browse, select, download and control MPEG-2 conformant data streams from a network.

Management of MPEG and related objects is carried out using the ISO/IEC 13522, Coding of multimedia and hypermedia information, standard developed by ISO's Multimedia and Hypermedia Experts Group (MHEG). Part 5 of the MHEG standard defines Support for base-level interactive applications while Part 6 defines Support for extended interactive applications.

MHEG-5 is designed for use in a limited set of application domains, mainly consisting of interactive video retrieval, manipulation and presentation for Video-on-Demand or Near Video-on-Demand services. MHEG-5 events are restricted to IsAvailable, ContentAvailable, IsRunning, IsStopped and IsDeleted. Part 6 of the standard extends this functionality by allowing for the integration of a Java Virtual Machine that will allow users to interact more fully with the set-top boxes used to control video-on-demand services.

Another specialized set of interaction mechanisms is provided by the Virtual Reality Modeling Language (VRML). This language allows users to interact with computer-generated 3D images. Interaction takes the form of translation and rotation of 3D images, control of lighting and perspective, and changing of textures and other display properties under the control of application-specific user interfaces. The VRML 97 specification introduced the concept of timed events being used to create routes within a 3D view. Route navigation can be controlled using scripts to check the current status of sensors.

For more general-purpose applications the HyTime scheduling module allows the relationships between different information resources to be defined in terms of either relative or absolute times. It can also be used to control the length of time for which data can be displayed, and to set up events that can be triggered by event pulses.

The HyTime rendition module allows the definition of electronic batons that can be used to control the speed of presentation and wands that can be used to modify the characteristics of the presentation (colour space, screen size, etc.).

The W3C Working Group on Synchronized Multimedia (SYMM) published the Synchronized Multimedia Integration Language (SMIL) 1.0 Specification as an approved Recommendation in June 1998. SMIL defines a set of XML elements that can be used to group media object elements for parallel or sequential display. SMIL also defines a set of media object attributes that can be used in conjunction with switch and link elements to control the presentation of media objects. Facilities are provided for controlling the duration of an object's display, for repeating a synchronized set of media objects and for identifying the region of the rendering surface to be assigned to each media object. SMIL links are used to control the presentation of media objects that require the use of plug-in applications for controlling their display. They are defined as an extended form of an HTML anchor. SMIL links define one-to-one relationships between media objects. They are not designed to provide users with control of these objects.
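The following sketch shows how a SMIL 1.0 document might synchronize a video clip with an audio commentary (the region and file names used here are illustrative):

<smil>  
  <head>  
    <layout>  
      <region id="main" left="0" top="0" width="320" height="240"/>  
    </layout>  
  </head>  
  <body>  
    <seq>  
      <img src="title.gif" region="main" dur="5s"/>  
      <par>  
        <video src="clip.mpg" region="main"/>  
        <audio src="commentary.au"/>  
      </par>  
    </seq>  
  </body>  
</smil>  

The par element causes its children to be played in parallel, while the seq element causes its children to be played one after another.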

In September 1998 a W3C Informative Note suggested a mechanism for defining Timed Interactive Multimedia Extensions for HTML (HTML+TIME), subtitled Extending SMIL into the Web Browser. This proposal suggests how SMIL-coded synchronized multimedia objects could be incorporated into HTML documents that provide the navigation features required for interacting across the Internet.

The timing extensions proposed in the HTML+TIME specification would allow authors to specify that a particular HTML element should appear at a given time after the file was initially called, remain for a specified duration, and/or repeat at specified intervals.
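As a sketch of the proposed approach, timing attributes could be added directly to HTML elements along the following lines (the attribute usage shown here illustrates the proposal's general pattern and is not a definitive syntax):

<p begin="5" dur="10">This paragraph appears 5 seconds after the page is loaded and is displayed for 10 seconds.</p>  
<img src="logo.gif" begin="15" repeat="indefinite">  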

The HTML+TIME proposal also shows how SMIL presentations can be interacted with using standardized action methods embedded within scripts and the definition of HTML events (e.g. onclick). The proposal includes an object model for handling timed media which will extend W3C's Document Object Model (DOM).

On 31st August 1998 the Advanced Television Standards Committee Digital TV Applications Software Environments (ATSC/DASE) group published a specification for a Broadcast HyperText Markup Language (BHTML) which extends HTML by adding attributes for standardizing multimedia object descriptions within HTML OBJECT elements, using the SMIL SWITCH option, and introducing an EVENT element to manage the actions to be taken when certain conditions are encountered. The BHTML specification also contains properties for defining 3D effects, controlling the volume of audio presentations, and for clipping and overflowing image areas.

During September 1998 an Advanced Interactive Content Initiative (AICI) was started to bring together the teams working on the MPEG-4, VRML and BHTML specifications for use in interactive set-top boxes. A statement of the AICI Requirements and Architecture for Advanced Interactive Content in Broadcast Decoders was finalized on 1st October. The initiative intends to complete its specifications by the end of 1998.

In October 1998 ISO's Imaging and Graphics Business Team (IGBT) published a consultative report entitled Towards an IT Standard for Interaction. This proposes the development of yet another international standard that will provide facilities for:

  • Navigation through and updating of large information bases
  • Person to person communications
  • Distributed interactive information systems, where the resulting interaction is composed from the interaction facilities of each subsystem and is transparent to the user.

It is hoped that the plethora of new proposals that have been made in this area during the latter half of 1998 will stimulate the currently diverse groups looking into the problems of interactive media to come together to create a single method for the interchange of information relating to permitted user actions within hypermedia information sets.


This information set on OII standards is maintained by Martin Bryan of The SGML Centre and Man-Sze Li of IC Focus on behalf of European Commission DGXIII/E.

File created: December 1998
