The Alliance Access Grid Node
operated by Academic Computing Services
at The University of Kansas

The Access Grid is an Internet-based model for video conferencing developed by the Futures Laboratory (FL) within the Mathematics and Computer Science (MCS) Division of Argonne National Laboratory (ANL). The Access Grid is an extension of the Alliance Computational Grid, a distributed computing environment designed to provide any network user with convenient access to high-performance computer systems.

As described on the Access Grid web site:

"The Access Grid is the ensemble of resources that can be used to support human interaction across the grid. It consists of:

The Access Grid will support large-scale distributed meetings, collaborative work sessions, seminars, lectures, tutorials and training.

The Access Grid design point is group-to-group communication (thus differentiating it from desktop to desktop based tools that are focused on individual communication)."

The Access Grid includes the notion of a "persistent" video conferencing venue, a conferencing site operating continuously and accessible to a wide audience of users on an ad hoc basis.

Basic functionality

An Access Grid site or "node" will usually be a small conference room or auditorium, provisioned with the equipment to participate in a multipoint video conference. The basic functionality provided within the node is:

To achieve this functionality, the Access Grid model relies upon the ability to send and receive Internet Multicast traffic to and from all conference nodes. The Access Grid is based on software (vic and rat) developed as part of the Internet Multicast backbone, or MBONE, which provided multicast services over the unicast Internet backbone (using "tunnels", or "bridges", between multicast nexus sites).

Internet2 has deployed Multicast throughout the Abilene backbone network, and many GigaPoPs and Universities are in the process of deploying Multicast within their campus networks. For example, the Great Plains Network has deployed Multicast within its backbone network, and KU has deployed Multicast to a number of campus buildings as of this writing. As a result, KU campus users can send Multicast traffic to any other site connected to the Abilene network.
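As a concrete illustration of the Multicast requirement, the following minimal Python sketch joins a multicast group and prints whatever datagrams arrive. The group address and port are arbitrary examples, not the addresses of any actual Access Grid venue.

    # Minimal IP multicast receiver sketch (standard Python socket module).
    import socket
    import struct

    GROUP = "224.2.177.155"   # example address in the multicast range
    PORT = 55524              # example port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the kernel to join the multicast group on the default interface.
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        data, sender = sock.recvfrom(65536)
        print(f"received {len(data)} bytes from {sender}")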

Argonne is no longer operating a "bridge," so users must run their own bridges or be connected to Multicast enabled campus networks. If your site is not running IP multicast, you can "roll-your-own", using a Cisco 2621 router which can encapsulate the IP multicast packets into IP unicast. For more information on that option, see the PowerPoint presentation available at http://www.mcs.anl.gov/home/nickless/RollOwnMulticast.ppt

The following infrastructure is recommended:

Software components

The Access Grid model revolves around two pieces of software: vic for video and rat for audio. These applications and the infrastructure software supporting them are described briefly in this section.

vic

VideoConference (vic) was developed by Steve McCanne and Van Jacobson at the Lawrence Berkeley Labs. It is intended to link multiple sites with multiple simultaneous video streams over a multicast infrastructure.

vic CAN perform 2 basic functions:

  1. capture video from local cameras or other sources and transmit it to other sites, and
  2. receive video streams transmitted by other sites and display them locally.

Note that vic may be run in such a way that it only receives video transmissions or only sends them; it is not required to do both at the same time. vic's ability to limit its behavior simplifies partitioning of conferencing functionality across multiple computers, which, in turn, allows commodity computer systems of moderate price to be used to build a node, as shown in the following diagram:

When vic is receiving transmissions, it will display a collection of thumbnail images of each video stream and allow the user to select one or more thumbnails for enlargement. If a computer system running vic is attached to multiple display devices, say a monitor and a video projector, the operator can position the thumbnail window on the monitor and move enlarged images to the video projector for the conference attendees.

vic is based on the Real-time Transport Protocol (RTP), which provides real-time communication support. Note that vic is NOT an H.323-compliant conferencing application: H.323 was designed for desktop-to-desktop conferencing, while vic is designed for group-to-group conferencing.
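For the curious, the fixed portion of an RTP packet header is simple enough to decode by hand. The following Python sketch unpacks the 12-byte header defined by RFC 1889 (the RTP specification vic and rat follow); it is illustrative only and not part of vic itself.

    # Sketch: unpack the fixed 12-byte RTP header (per RFC 1889).
    import struct

    def parse_rtp_header(packet: bytes) -> dict:
        if len(packet) < 12:
            raise ValueError("too short to be an RTP packet")
        b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
        return {
            "version":      b0 >> 6,           # always 2 for RTP
            "padding":      bool(b0 & 0x20),
            "extension":    bool(b0 & 0x10),
            "csrc_count":   b0 & 0x0F,
            "marker":       bool(b1 & 0x80),
            "payload_type": b1 & 0x7F,         # e.g. 31 is the static type for H.261
            "sequence":     seq,
            "timestamp":    timestamp,
            "ssrc":         ssrc,              # identifies the sending source
        }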

For more information about vic see http://www-mice.cs.ucl.ac.uk/multimedia/software/vic/

Many AG nodes use a modified version of vic, called "ddvic", on their video display systems. ddvic uses Microsoft DirectDraw commands to draw its video streams and, as a result, requires much less (around 50% less) CPU than the standard version of vic. (ddvic does require that every graphics card in the system be set to the same color depth, however. This can be a problem for systems configured with heterogeneous graphics cards.)

rat

The Robust Audio Tool (rat) is a descendant of the Visual Audio Tool (vat), which was developed by Steve McCanne and Van Jacobson at Lawrence Berkeley Labs. rat allows multiple users to engage in an audio conference over the Internet in multicast mode. rat can perform 2 basic functions:

  1. capture audio from local microphones (via the attached audio equipment) and transmit it to other sites, and
  2. receive audio streams transmitted by other sites and play them through local speakers.

rat displays a list of connected participants and identifies who is speaking and who is listening at any given time.

Within the Access Grid node, signals from and to attached audio equipment are funneled through an "echo canceller" made by the Gentner Communications Corporation, to eliminate certain kinds of echoes produced during networked conferencing.

In addition, connections to the Gentner echo canceller require a "balanced" signal. Since most PC sound cards and some mics and speakers utilize "unbalanced" inputs/outputs, some kind of "Level Balancer" is required to connect these components. Argonne recommends the RU-LA2D IHF-Pro Interface.

It is probably fair to say that the Gentner echo canceller is the major component of the audio conferencing system (even though this goes against the grain for the "computer centric" members of the community). Networks of Gentners work together to provide useful audio signal exchanges, as shown in the following diagram:

The Gentners can participate in 3 different connectivity infrastructures:

When a Gentner uses a computer network to connect to other Gentners, it connects to the computer just as it would to a simple codec (compression/decompression device). The computer performs the relatively simple function of moving traffic from one Gentner to another. As it happens, rat provides a basic mixing capability as well, but the Gentner retains the major responsibility for providing an audible signal.
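To give a feel for what "basic mixing" amounts to, the following Python sketch sums corresponding 16-bit PCM samples from several streams and clamps the result. It illustrates the concept only; it is not rat's actual implementation.

    # Illustrative sketch: mix several 16-bit PCM streams by summing samples
    # and clamping to the valid range. Not rat's actual code.
    def mix_frames(frames):
        """frames: list of equal-length lists of signed 16-bit samples."""
        mixed = []
        for samples in zip(*frames):
            total = sum(samples)
            mixed.append(max(-32768, min(32767, total)))  # clamp to 16-bit range
        return mixed

    # Example: one frame of audio from each of two sites.
    site_a = [1000, -2000, 30000]
    site_b = [500, -1500, 20000]
    print(mix_frames([site_a, site_b]))   # -> [1500, -3500, 32767]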

For more information about rat see http://www-mice.cs.ucl.ac.uk/multimedia/software/rat and the Access Grid web site.

There are a couple of ways for node operators to listen to the traffic emanating from their own site:

The Distributed PowerPoint software

The Argonne Distributed PowerPoint software allows a single presenter at one node to control PowerPoint applications running on computer systems located at other Access Grid nodes.

For example, a conference speaker can run PowerPoint along with the Distributed PowerPoint master software on her laptop computer at the podium of one of the AG sites. When the speaker changes slides, the master will notify the DPPT server, which will notify DPPT clients running on systems at other nodes which will, in turn, direct their local PowerPoint programs to change slides.

The clients are usually configured to run with PowerPoint on the Access Grid Display machine at each remote node, so the slides will be viewable by the remote audience.
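The following Python sketch illustrates the general shape of a slide-control client of this kind: it connects to a relay server, waits for slide-change messages, and drives the local slide show. The server name, port, and message format are hypothetical placeholders, not the actual Distributed PowerPoint protocol.

    # Conceptual sketch only; the real DPPT protocol is not reproduced here.
    import socket

    SERVER = ("dppt.example.edu", 9999)   # hypothetical server name and port

    def goto_slide(n: int) -> None:
        # Placeholder for the call that advances the local PowerPoint show.
        print(f"advance local PowerPoint to slide {n}")

    with socket.create_connection(SERVER) as conn:
        buf = b""
        while True:
            chunk = conn.recv(1024)
            if not chunk:
                break
            buf += chunk
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                if line.startswith(b"GOTO "):              # e.g. "GOTO 12"
                    goto_slide(int(line.split()[1].decode()))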

Note that this approach requires that some PowerPoint features be removed or disabled prior to presentation, because Distributed PowerPoint cannot deal with them. (See later discussions of VNC and "scan conversion" for alternatives.)

The DPPT clients can operate on PowerPoint slidesets published on a Web server, or on local copies of the slidesets.

The MUD software

Operators at each site involved in an Access Grid conference typically keep in touch by using software originally developed for online "role-playing" games, generically called "Multi-User Dungeon" games, or "MUDs". (MUD functionality is similar to that of Internet Relay Chat operating with access control.)

Argonne runs a MUD server for use by Access Grid operators, who run MUD clients on their desktop systems. tkMOO-lite is currently the recommended MUD client for this purpose, but others, such as TinyFugue in the Unix environment, can be used as well. tkMOO will run on both Windows and Linux systems, so it may be run on any of the AG component systems described below.
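At the protocol level a MUD client is little more than a line-oriented TCP connection. The following Python sketch shows the idea; the host, port, character name, and password are placeholders, and in practice operators simply use tkMOO-lite.

    # Minimal sketch of a MUD/MOO-style client: connect, log in, relay text.
    import socket

    HOST, PORT = "mud.example.org", 7777             # placeholder server
    with socket.create_connection((HOST, PORT)) as s:
        s.sendall(b"connect OperatorName password\n")  # typical MOO login line
        while True:
            data = s.recv(4096)
            if not data:
                break
            print(data.decode("latin-1"), end="")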

The Multicast Beacon

To help diagnose multicast network problems during conferences, Argonne promotes the use of the NLANR multicast "Beacon" monitoring system, which includes three pieces of software: a Beacon client run at each node, a Beacon server, and a Beacon viewer. The Beacon client at each node connects to a Multicast group, collects latency, loss, and packet misordering statistics from all other Beacons connected to that Multicast group, and sends them to the Beacon server. The Beacon viewer displays these traffic statistics as a matrix showing traffic to and from each Beacon attached to the server. There is also a web-based Beacon viewer.

The Beacon software depends on synchronized times, so systems running the Beacon must also run the Network Time Protocol (NTP) and keep their clocks synchronized to an NTP master clock. At KU the Beacon runs on the AG node's video capture system.
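A quick way to check whether a Beacon host's clock is reasonably synchronized is to query an NTP server and look at the reported offset. The following Python sketch does so, assuming the third-party ntplib package is installed; the server name is just an example.

    # Sketch: report the local clock's offset from an NTP server.
    # Requires the third-party "ntplib" package.
    import ntplib

    client = ntplib.NTPClient()
    response = client.request("pool.ntp.org", version=3)   # any reachable NTP server
    print(f"local clock offset: {response.offset:+.3f} seconds")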

The Virtual Venue software

Coordinating multiple group conferences can be complicated. Argonne has developed a collection of web pages and Java applications that can simplify the process.

The Virtual Venue is basically a web-page that lets users select a "conference" to attend. In this context a "conference" is composed of

If your systems are Virtual Venue-enabled, the display system operator can click on a conference room name and the vic, rat and MUD applications running on the video display, video capture and audio processing systems will all be started with target addresses and settings appropriate to the selected conference room.

This coordination is accomplished by running an "event server" and the event controller on the display system, along with "event listeners" on the video capture and audio processing systems.

The following Java applications comprise the Virtual Venue software suite:

Function           Java application            Starting script filename
Event server       ag.EventServerMonitor       start-eventserv.bat
Event controller   ag.DisplayResourceManager   drm.bat
Audio listener     ag.AudioResourceManager     arm-eventlistener
Video listener     ag.VideoResourceManager     vrm-eventlistener

The usual start up procedure is to:

  1. start the arm-eventlistener on the audio system
  2. start the vrm-eventlistener on the video capture system
  3. start the script start-eventserv.bat on the display system
  4. start the script drm.bat on the display system
  5. start Netscape Communicator
  6. go to http://venues.accessgrid.org/AG
  7. log into the Access Grid
  8. choose a Virtual Venue

After these steps you should see a vic window open along with a tkMOO window.
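The display-system portion of this sequence lends itself to a small wrapper script. The following Python sketch launches the two batch files listed in the table above and then opens the Virtual Venue page (collapsing steps 5 and 6 into one); the install directory is a hypothetical placeholder.

    # Sketch: automate the display-system steps of the start-up sequence.
    import subprocess
    import time
    import webbrowser

    AG_DIR = r"C:\AG"   # hypothetical install directory; adjust locally

    # Step 3: start the event server, then give it a moment to come up.
    subprocess.Popen("start-eventserv.bat", cwd=AG_DIR, shell=True)
    time.sleep(5)

    # Step 4: start the event controller (display resource manager).
    subprocess.Popen("drm.bat", cwd=AG_DIR, shell=True)

    # Steps 5 and 6: open a browser on the Virtual Venue page.
    webbrowser.open("http://venues.accessgrid.org/AG")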

Virtual Network Computing (VNC)

VNC allows users to share monitor screens over the Internet in a variety of modes. In the Access Grid environment, VNC allows a speaker to share his/her podium laptop with Access Grid display systems which can then project it at remote nodes. This is useful when a speaker wishes to give real-time demonstrations or present PowerPoint slides that include "fancy" features, such as animations, that cannot be displayed using Distributed PowerPoint.

VNC employs a client-server architecture, and there are clients and servers available for Windows 98/NT/2000 and Unix operating systems. The Unix version of the VNC server allows multiple users to share the same screen, so speakers can easily demonstrate Unix-based applications.

The Windows version of the VNC server does not appear to allow multiple clients, so special steps must be taken to share Windows screens. Eric He of the Chemical and Petroleum Engineering Department at the University of Kansas developed a novel use of VNC components to allow many remote users to share a Windows-based laptop. Eric configured a Unix server to "relay" the Windows-based screen contents to remote Access Grid display systems.

Although not part of the original Access Grid canon, VNC has been employed during several Access Grid conferences, and shows promise for future applications. VNC eliminates the coordination effort required to display Distributed PowerPoint slide sets. (No files need to be downloaded ahead of time and no slide synchronization is required.)

The major drawback is that VNC generates and receives considerable network traffic (in the megabyte range) when it updates a screen image. Simple PowerPoint slides will usually update in a couple of seconds, and simple animations have been successfully displayed, but complex slides took as much as 10 seconds to update during the Alliance Chautauqua 2000. In general, update times are a function of the number of pixels changed and the number of remote viewers (as well as available bandwidth), so VNC will not be appropriate for all applications.
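A back-of-the-envelope calculation shows why updates land in the megabyte range and take seconds to propagate. The numbers in the following Python sketch are illustrative assumptions, not measurements from the KU node.

    # Rough estimate of VNC update cost; all figures are assumptions.
    width, height = 1024, 768        # podium laptop screen resolution
    bytes_per_pixel = 2              # 16-bit color, before VNC encoding
    compression_ratio = 0.25         # assume encoding shrinks raw data to ~25%
    viewers = 8                      # remote display systems pulling updates
    link_mbps = 10                   # effective throughput per viewer, Mbit/s

    raw_bytes = width * height * bytes_per_pixel
    sent_bytes = raw_bytes * compression_ratio * viewers   # unicast: once per viewer
    seconds = (sent_bytes * 8) / (link_mbps * 1_000_000)

    print(f"full-screen update: {sent_bytes / 1e6:.1f} MB total, ~{seconds:.1f} s")
    # With these assumptions: roughly 3.1 MB and ~2.5 s for a full-screen change.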

Instructions for setting up a VNC relay are presented in Using Unix-based VNC to relay other VNC traffic.

Basic system configurations

The AG model uses a collection of commodity components to provide various services. To assure optimal responsiveness, individual functions (video capture, video display, and audio capture and presentation) are placed on separate computer systems.

There is a variety of hardware and software configurations that can provide the required video conferencing functionality. This section shows one such configuration:

Audio capture and presentation computer

The audio capture computer:

Software

Hardware in place or planned for the KU ACS node:

Video capture computer

The video capture computer system converts analog video from cameras and/or VCRs, etc. to digital for transmission by vic over the multicast network.

Software

Hardware installed in the KU ACS node:

Video display computer

The video display computer receives video content over the network and displays it on the PC monitor, as well as on one or more additional monitors and/or video projectors if desired (using the ability of Windows 2000 to display its console screen across multiple video cards).

Software

Hardware in the KU ACS node:

Echo canceller control computer

The audio control computer runs Windows 98 and uses custom Gentner Control Software to control the Gentner mixer/echo canceller. See http://www.gentner.com/ for more details. Within the KU ACS node, this function is provided by a 200MHz Pentium-based PC.

Speaker's podium computer

The speaker's podium computer runs:

Configuration suggested by Argonne: any laptop powerful enough to run PowerPoint

The KU ACS Podium laptop is connected to a "scan converter" that can convert the VGA/SVGA signal generated by the laptop to the NTSC video expected by video capture cards. The CORIOscan Select from TVONE lists for around $495 and can be used to produce a reasonably high-resolution image (1280x860). (Another possible alternative is to use a dual-head video card, such as the GeForce2 MX Pro, to produce a standard NTSC signal to drive either an external TV or a video capture card.)

Alternatives for displaying speaker slidesets

As mentioned earlier, the Access Grid provides several methods for displaying speaker slidesets.
  1. use Distributed PowerPoint. This is the "standard" method and provides high quality representation at every site with very little network traffic. Using DPPT means getting each slide set prior to use, stripping it of special PPT features and publishing it on a Web server for distribution to each remote site.

    This approach may not work well if the speaker relies on special features (such as fancy animations) or launches other applications during the talk.

  2. use a VNC server running on the Podium laptop and a VNC relay (as discussed earlier). This approach provides high quality video, including simple animations and all PowerPoint features, but introduces some update delay, and generates much more network traffic than the other alternatives. (If a version of VNC were produced to employ Multicast for image distribution network traffic would be significantly reduced.)

  3. split the Podium laptop video output (using, for example, an SVGA/VGA amplifier such as the Extron P/2 DA2+), send one channel to a local projector for the local audience, and one to a scan converter for conversion to an NTSC video signal and then to a video capture card for distribution over vic from the video capture machine.

    This will give excellent update speed both locally and remotely, but relatively poor image quality at remote sites. Text smaller than 20 points is usually not legible, but animations and videos present well (as long as high resolution is not necessary). This approach could be a very effective, general solution IF vic could be used with a higher quality codec than the usual H.261. An MPEG-1 codec is apparently under development and should provide a significant improvement.

  4. use a commercial streaming video package. For example, during the Kansas issue of Alliance Chautauqua 2000, Cisco IPTV was employed to present full-motion animations at high resolution. However, setting up for IPTV broadcasts is complex and requires access to an IPTV server, so this alternative will not be available to all.
With respect to the VNC alternative, there are at least 2 usage options:
  1. presenters move their slide sets to the Podium laptop prior to distribution via VNC. This can cause problems if the slide sets use software that is either not installed on the Podium laptop or configured differently there. As a result, all slide sets must be tested before use.

  2. presenters load VNC on their own laptops before they go on. The alien laptops must be attached to the local network, which may mean redefining TCP/IP parameters, and the VNC relay must reestablish connection with the Podium each time a new presenter comes on board using this approach.

Ancillary Servers

You may need to run some of the ancillary servers mentioned earlier on separate computer systems. For example, you may need boxes to run a

Operators

You will need from 1 to 4 operators, depending on how you apportion duties, to run an Access Grid node. With one operator per basic function you will need an operator for:

To some degree there is a trade-off between system costs and operator costs, and the staffing requirements will vary with the complexity of the presentations being offered at a site.

 

Backup server

Imitating Doug Johnson of the Ohio Supercomputer Center, ACS acquired a backup system in case of hardware failure within the ACS node. This system is dual-booted for Linux and Windows 2000, and can be used to replace some combination of display, video capture or audio functions, depending on whether failed components run on Linux or Windows. The backup system serves double-duty as a VNC relay, and is configured as follows:

 

Additional Info

The Access Grid web site:

http://www-fp.mcs.anl.gov/fl/accessgrid/

For a more detailed list of hardware components on the KU node, including sources and prices see

http://www.cc.ku.edu/~grobe/docs/access-grid-node/access-grid-ku.html

Acknowledgments

Some of the material for this web page has been taken from the Argonne Labs web site listed above, or from documents provided via that site.

The plagiarizer is Michael Grobe at grobe@ku.edu, who has freely modified the web site contents and must be held accountable for any errors introduced during that process, among other sins. (November 7, 2000)