Maribeth Back

Welcome! This site presents an overview of my professional work. For the past fifteen years, that's been designing virtual and ubiquitous technologies for real people. I'm especially interested in multimodal forms of human communication, and in how applications of new interface technologies can be informed by social context as well as by physical and perceptual constraints.

mixed and immersive realities

Currently I am a senior research scientist at FX Palo Alto Laboratory (FXPAL), leading the Mixed and Immersive Realities group. We are experimenting with 3D virtual environments and mixed realities, with a particular focus on applying these technologies in the enterprise. We see powerful applications in remote and industrial collaboration, and in process visualization and control. Right now we are building a virtual factory to "mirror" real processes in a real-world factory, importing sensor data from the factory into the virtual environment. We are also tracking the everyday use of a virtual collaboration environment for the office.
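To make the "mirroring" idea concrete, here is a minimal sketch of the general pattern: poll sensor readings from the real factory and push them onto matching objects in the virtual scene. The class names, sensor ids, and properties below are hypothetical illustrations, not our actual system.

    import time

    class VirtualScene:
        """Stand-in for a 3D environment: objects keyed by id, each with named properties."""
        def __init__(self):
            self.objects = {}

        def set_property(self, object_id, name, value):
            self.objects.setdefault(object_id, {})[name] = value
            print(f"{object_id}.{name} = {value}")

    def read_sensors():
        """Stand-in for a real sensor feed; returns (machine_id, property, value) tuples."""
        return [("press_01", "temperature_c", 74.2),
                ("conveyor_02", "speed_m_per_s", 0.8)]

    def mirror_loop(scene, poll_interval_s=1.0, cycles=3):
        """Copy each fresh sensor reading onto the matching virtual object."""
        for _ in range(cycles):
            for machine_id, prop, value in read_sensors():
                scene.set_property(machine_id, prop, value)
            time.sleep(poll_interval_s)

    mirror_loop(VirtualScene())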

usable smart environments

A few years ago, I led FXPAL's Usable Smart Environments group, which addressed smart environments in the workplace, in particular conference rooms - an area with decades of rich research and plenty of interesting implementations, but with a long way to go yet in usability as well as functionality. The Usable Smart Environment project (USE) focused on designing multimodal interfaces for easy-to-use, highly functional next-generation conference rooms. Our first design prototype was a "no wizards" room for an American executive; that is, a room the executive could walk into and use by himself, without help from a technologist. The second generation of the USE technology is currently installed in two multi-screen conference rooms at FXPAL, and is in daily use by the company.

experiments in the future of reading

At Xerox PARC, I was a Senior Member of the Research Staff, working at first in the Computer Science Lab and later in the RED (Research in Experimental Documents) group. We examined emerging technologies for their impact on media and the human relationship to information, and we built working prototypes to illustrate our ideas. My role there combined a number of aspects of research: project initiator and team lead, designer, builder, evangelist, and writer. Much of my research focuses on analysis and design methods for creating an efficient, practical user experience: how to build real-world, socially informed applications for new technologies. Often, this involves creating robust working prototypes and putting them in public spaces. (A busload of ten-year-old kids can be a pretty rigorous reality check.) For example...

We were invited to create a gallery exhibition at Silicon Valley's new Tech Museum of Innovation. We chose to focus on the intersection of reading with digital technology, and to build real working prototypes of some of our ideas. "XFR: Experiments in the Future of Reading" ran from March to September 2000, and then toured other technology and science museums through 2003. I created and was project lead for several of the exhibits. You'll find images and more details about them on the associated sites below, as well as some further embodiments for education and assistive applications.

  • The Listen Reader: an interactive children's storybook featuring a rich, evocative ambient soundtrack. The book uses embedded RFID tags to sense which page is open, and capacitive field sensors to measure the reader's proximity to the pages. Proximity measurements control volume and other expressive parameters of the sounds associated with each page. Here's a preprint of the CHI paper.

  • Speeder Reader: combines the notion of dynamic typography with the notion of the car as interface; or, if you prefer, it's speed-reading with speed-racing controls. A speed-reading technique called RSVP (Rapid Serial Visual Presentation) lets people learn to read at up to 2000 words per minute: words or short phrases are flashed onto the screen at a fixed point in front of you, so you don't have to move your eyes around a page to read. A gas pedal controls your reading speed and a steering wheel navigates between streams of text (a minimal pacing sketch follows this list). Here's a preprint of the Computers and Graphics journal paper.

  • Walk-In Comix: a graphic novel you can literally walk into -- it's printed on the walls, floor, and ceiling of a small, labyrinthine set of rooms. Talk about getting immersed in a book... This project offered a new take on interactive narrative via static presentation -- and it also made us develop some new methods for graphical representation on very large printed surfaces.

  • Here's an article about the show as a whole: XFR: Experiments in the Future of Reading.
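For those curious about the pacing behind Speeder Reader, here is a minimal sketch of the RSVP idea, assuming a simple linear mapping from a "gas pedal" value to words per minute; the exhibit's actual hardware interfaces and display code are not shown.

    import time

    def rsvp(text, pedal, base_wpm=200, max_wpm=2000):
        """Flash one word at a time at a fixed spot; pedal in [0, 1] scales base..max words per minute."""
        wpm = base_wpm + pedal * (max_wpm - base_wpm)
        seconds_per_word = 60.0 / wpm
        for word in text.split():
            print(f"\r{word:^24}", end="", flush=True)  # fixed position: no eye movement needed
            time.sleep(seconds_per_word)
        print()

    rsvp("speed reading with speed racing controls", pedal=0.25)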


physical instruments for digital systems

Recently I've been teaching a couple of courses at UC Berkeley in human-computer interaction, user experience, and design. One of these, a graduate class at the UC Berkeley Institute of Design called "Design Realization: Physical Instruments for Digital Systems," explored forms and functions appropriate for personal instruments that turn complex datasets into working knowledge. The course was project-based and gave equal weight to mechatronics, programming, social aspects of design, and physical design and form. You can see the syllabus here.


assistive and educational technologies

  • The AirBook combines dynamic text (especially RSVP, rapid serial visual presentation) with force-free capacitive field sensors to create a simple, easily controlled assistive reading device. It is designed to assist people with visual impairments (such as dyslexia, macular degeneration, or loss of contrast sensitivity) by giving them more control over font size and contrast. It's also for people with upper-body disabilities, lack of fine muscle control, or severe arthritis, all of which can make handling standard paper books difficult. The force-free sensor system can be adjusted for large-scale motion or for tiny ranges of movement, and requires no pressure or fiddling with physical objects (see the sketch after this list). Here's the CHI short paper on the AirBook.

  • The Health Care Chair was a healthcare informatics research project I worked on as a graduate student at the Harvard Graduate School of Design. The goal was to create a sensor-laden, networked chair for the home that could support medical care and offer communication and education via Web-based services.

  • The Surgical Room of the Future was another project at Harvard, funded by DARPA. We explored the use of smart materials and sensor networks in the support of surgical procedures. We also explored some complete redesigns of the modern operating theater, including a mobile version for the battlefield.
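Here is a minimal sketch of the force-free control idea shared by the AirBook and the Listen Reader: a raw capacitive proximity reading is mapped into a control value (volume, or reading pace) over a calibratable range of motion. The readings and ranges below are invented for illustration.

    def proximity_to_control(reading, near, far, lo=0.0, hi=1.0):
        """Map a raw proximity reading to a control value in [lo, hi].

        near/far set the usable range of motion, so the same mapping works for
        large gestures or for tiny movements; no pressure on any surface is needed.
        """
        span = far - near
        t = (reading - near) / span if span else 0.0
        t = min(max(t, 0.0), 1.0)          # clamp to the calibrated range
        return lo + (1.0 - t) * (hi - lo)  # closer hand -> larger control value

    # e.g., calibrated for a small range of movement (raw readings 120..180):
    volume = proximity_to_control(reading=140, near=120, far=180)
    print(f"volume = {volume:.2f}")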


augmented realities and virtual environments (older work)

  • Audio a-life is a sound environment that breeds itself, creating new sounds in response to a set of initial conditions and environmental experiences (a rough sketch of the breeding idea follows this list). Online you can check out a paper that Maureen Stone and I wrote for the VRML '99 conference on the Kinetic Mandala, one of our test environments.

  • Audio Aura was an experiment in audio augmented reality built on top of the existing Active Badge infrastructure in PARC's Computer Science Lab (CSL). Successive versions of this work were reported at ICAD '97, UIST '97, and CHI '98; here's the CHI paper.
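As a rough illustration of the "breeds itself" idea in audio a-life, here is a generic, textbook-style sketch in which each sound is a small genome of synthesis parameters and new sounds are bred by mixing and mutating existing ones; this is an assumption-laden stand-in, not the algorithm actually used in the Kinetic Mandala work.

    import random

    def breed(parent_a, parent_b, mutation=0.05):
        """Blend two parameter genomes and jitter each gene slightly."""
        child = {}
        for key in parent_a:
            value = random.choice([parent_a[key], parent_b[key]])
            child[key] = value * (1.0 + random.uniform(-mutation, mutation))
        return child

    population = [{"pitch_hz": 220.0, "decay_s": 1.2, "brightness": 0.3},
                  {"pitch_hz": 330.0, "decay_s": 0.6, "brightness": 0.8}]

    for generation in range(5):
        population.append(breed(*random.sample(population, 2)))

    print(population[-1])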


audio engineering and sound design

Audio design and engineering is one of my specialties (I spent many years as a professional audio engineer, for high-end recording studios and for professional theater). Examples include using audio as a feedback/response system for information visualization environments, developing audio augmented realities for exploring peripheral data, and augmenting tactile or gestural controllers with audio tools. I also work in the related disciplines of data sonification and auditory display. I am very interested in how people understand sound and other dynamic processes, and in how good design can take advantage of this understanding.
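As a small example of the sonification side of this work, here is a sketch that maps a stream of data values onto pitch so a listener can track a changing quantity peripherally; the linear mapping and the note-playing stub are illustrative assumptions, not a description of any particular system of mine.

    def value_to_pitch(value, v_min, v_max, low_hz=220.0, high_hz=880.0):
        """Linearly map a data value onto a two-octave pitch range."""
        t = (value - v_min) / (v_max - v_min)
        t = min(max(t, 0.0), 1.0)
        return low_hz + t * (high_hz - low_hz)

    def play(freq_hz, duration_s=0.2):
        """Stub for a call into a synthesizer or MIDI library."""
        print(f"play {freq_hz:.0f} Hz for {duration_s}s")

    cpu_load = [0.10, 0.35, 0.80, 0.55, 0.20]   # any stream of values works
    for sample in cpu_load:
        play(value_to_pitch(sample, v_min=0.0, v_max=1.0))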


If you like, check out my CV.


thanks for stopping by.