
Back to the real world: Tangible interaction for design

Published online by Cambridge University Press: 17 June 2009

Ellen Yi-Luen Do
Affiliation:
ACME Lab, College of Architecture and College of Computing, Georgia Institute of Technology, Atlanta, Georgia, USA
Mark D. Gross
Affiliation:
Computational Design Lab, School of Architecture, Carnegie Mellon University, Pittsburgh, Pennsylvania, USA

Guest Editorial
Copyright © Cambridge University Press 2009

1. INTRODUCTION

1.1. What is tangible interaction? Why should we care?

After several decades in which design computing has been almost exclusively the domain of software, today many investigators are building hybrid systems and tools that in one way or another bridge the divide between physical "real-world" artifacts and computational artifacts. On the one hand, the rise and popularity of mass customization, rapid prototyping, and manufacturing raises questions about the kinds of software systems and tools that will make these hardware technologies useful in designing. On the other hand, advances in microcontroller and communications technologies have led to a wave of embedding computation in physical artifacts and environments. Both trends are described in Gershenfeld's (1999, 2005) popular books.

Tangible interaction is a growing field that draws technology and methods from disciplines as diverse as human–computer interaction (HCI), industrial design, engineering, and psychology. If the idea of ubiquitous computing (Weiser, 1991) is computation integrated seamlessly into the world in different forms for different tasks, tangibility gives physical form and meaning to computational resources and data. The HCI community terms this seamless interaction variously "tangible bits" (Ishii & Ullmer, 1997) or "embodied interaction" (Dourish, 2001). Tangible interaction is simply coupling digital information to physical representations. A tangible user interface (TUI) extends the traditional graphical user interface on screens into the everyday physical world to realize the old goal of "direct manipulation" of data (Shneiderman, 1983).

Aish (1979) and Frazer et al. (1980) were pioneers in tangible interaction for design; both developed instrumented physical objects as input devices for computer-aided design. However, the widespread adoption of the mouse in the first commercial personal computers overshadowed other forms of interacting with computers, such as the pen and TUIs. The development of TUIs based on augmented reality and embedded computing proceeded independently, and it was many years before the HCI community rediscovered these early efforts (Sutphen et al., 2000).

Coupling physical features to digital information can be perceptually mediated by human senses (i.e., sight, touch, smell, etc.), leveraging the affordances of things (blocks are stackable, balls are bouncy) and control mechanisms (squeezing toothpaste out of the tube or turning a knob). Before the days of fast CPUs, programmers could read the changing patterns of lights on the mainframe console (known colloquially as "das Blinkenlights") to help debug their code, an early instance of bringing processes within the machine into the physical realm. More recent tangible interaction research makes virtual objects "graspable" (Fitzmaurice et al., 1995) by using physical "bricks" as handles and letting people "take the digital out" into the real world to manipulate it physically.

Blocks, for example, are a popular physical form for tangible interaction. We rotate Navigational Blocks (Camarata et al., 2002) to query digital content in a tourist spot; flip and rotate blocks to scroll a map on Z-Agon (Matsumoto et al., 2006); stack up Computational Building Blocks (Anderson et al., 2000) to model in three dimensions; or snap together Active Cubes (Watanabe et al., 2004) to interact with the virtual world. Alternatively, consider projection: projection and spatial augmented reality have been employed with multitouch surfaces and computer vision in reacTable (Jordà et al., 2007) for hands-on musical collaboration or in bringing dinosaurs to life in Virtual Showcase (Bimber et al., 2003). Coming full circle is the "rich interaction camera" (Frens, 2006) that applies the form, interaction, and functionality of a conventional 35-mm film camera to improve the usability of its digital counterpart.

Since its first issue, AI EDAM has published the best work at the frontiers of engineering design and computing; today, tangible interaction is one of those frontiers. At first glance tangible interaction might seem a mere conceit: interfaces are inherently superficial. However, new input and output modes are the technological force behind tangible interaction, and in the history of computing new input and output modes have yielded big changes: the move from batch processing to personal computing, high-resolution bitmap graphics, and the mouse. Today, embedded computing, low-cost video cameras and projection, novel sensors, and new responsive materials drive new modes of interaction. As this Special Issue shows, these new modes of interaction have the potential—in conjunction with previous research on artificial intelligence in engineering design, analysis, and manufacture—to change how we practice, teach and learn, and investigate engineering design.

This Special Issue of AI EDAM populates the space of hybrid computational–physical systems with articles specifically about the use of tangible interaction in design. The seven articles accepted for this Special Issue are the design cousins of the other tangible interaction work mentioned above. The articles here fall into three categories: the lay of the land (a survey of the field that serves as a lens for viewing existing work); teaching with tangibles (tangible interaction in design education); and tangible tools (tangible applications and tools for designing).

2. THE LAY OF THE LAND

In the first article, “Framing Tangible Interaction Frameworks,” Mazalek and van den Hoven take a broad look at the field of tangible interaction. In earlier days, individual systems and their particular characteristics defined the field. As the field matured, researchers continued to build interesting and innovative interfaces, but they also began to identify frameworks within which to position their work. Now researchers in the field have many tangible interaction frameworks to choose from. Mazalek and van den Hoven's article orders the diverse array of frameworks that have arisen over the past decade or so of research on TUIs.

3. TEACHING WITH TANGIBLES

The next two articles offer insights and reflections on tangible interaction in design education. In "Tangible Interactions in a Digital Age: Medium and Graphic Visualization in Design Journals," Oehlberg, Lau, and Agogino examine the changing nature of design journals as digital and tangible media begin to mix. The article describes a study of students' design journals in UC Berkeley's Mechanical Engineering Project course. Some students keep traditional paper-and-pencil journals, whereas others keep their journals in a digital online form; still others adopt a hybrid format. In their study, which spans journals from 4 years of the project course, they developed protocols for analyzing the design content in the journals as well as for coding the use of media.

Shaer, Horn, and Jacob's article, “Tangible User Interface Laboratory: Teaching Tangible Interaction Design in Practice,” describes the structure and implementation of an introductory university-level course in tangible interaction, drawing on the authors' experience over several years at Tufts University's Computer Science Department. They describe strategic decisions they took in designing the course and some consequences of these decisions. Their article also includes examples of tangible interaction designs that students in the course produced.

4. TANGIBLE TOOLS

The last four articles report on tangible tools for design. The first article gives a glimpse of how researchers at an industry laboratory engage in prototyping a tangible tool. The second describes a tool that projects video onto physical artifacts to visualize patterns and textures; this facilitates real-time feedback for collaboration and communication among stakeholders. The third looks at the paradigm of projection-based interactive augmented reality to explore opportunities for the use of such tools. The fourth and last article concerns using tangible tools to foster children's storytelling and movie making.

“Prototyping a Tangible Tool for Design: Multimedia E-Article Sticky Notes” by Back, Matsumoto, and Dunnigan presents a tangible sticky note prototype device with display and sensor technology. The idea is that people can use tangible e-paper tools to sort, store, and share information with the combined affordances of physical features and digital handling of visual information. The authors describe the design and development of different interaction modalities (e.g., flipping, sticking, stacking) to control a multimedia memo system and suggest applications in project brainstorming and storyboarding for design. The article also shows how to use today's off-the-shelf technology to prototype behaviors of tomorrow's technology.

In “A Tangible Design Tool for Sketching Materials in Products” Saakes and Stappers describe a tool they built that uses augmented prototyping for ceramics design, projecting two-dimensional imagery onto physical shapes. Their tool enables ceramics designers to quickly and fluidly experiment with the surface features and graphic design of ceramic products such as plates, bowls, and tiles. Their article describes the task the tool supports, the tool itself, and experiences with master designers using the tool in practice.

“Analyzing Opportunities for Using Interactive Augmented Prototyping in Design Practice” by Verlinden and Horvath outlines the prospects for using augmented reality systems and rapid prototyping in industrial design. They examined data from three case studies in different domains (the design of a tractor, an oscilloscope, and a museum interior) from which they derived a set of hints or guidelines for adopting interactive augmented prototyping in industrial design.

In “Play-It-By-Eye! Collect Movies and Improvise Perspectives With Tangible Video Objects,” Vaucelle and Ishii report on a series of iterative design case studies. They describe four video editing systems they built for children that use tangible interaction in various ways, embedding movie making as part of play. For each system they describe the underlying design motivation and the specific implementation, their experience testing the design with children, and the lessons they learned, which informed the next design iteration.

5. THE LARGER LANDSCAPE

These seven articles represent a spectrum of different perspectives, yet they hardly cover the entire landscape of tangible interaction for design. For example, we have no representative from "programmable matter" or modular robotics, research that is sure to have profound implications for tangible interaction and design (Goldstein et al., 2005). Nor do we have a paper from the active community of do-it-yourself engineering, in which the availability of new materials and desktop manufacturing technologies is enabling citizens to design and build quite complex engineered systems (Buechley et al., 2008). Despite Gershenfeld's promise of "things that think" (1999), most work to date on tangible interaction for design has little, if any, artificial intelligence. Perhaps a future issue will include these and other aspects of tangible interaction in design.

The seven articles here were selected from 17 submissions with two rounds of review. In the first round, all submitted articles were blind reviewed by multiple reviewers. (Each article received at least three reviews, and nine articles received five or more reviews.) In the second round, the Guest Editors reviewed and edited the revised manuscripts. We thank all of the authors and reviewers for their diligent work and timely responses. We also thank Editor-in-Chief Professor David C. Brown for giving us the opportunity to edit this Special Issue and Cambridge Senior Project Managing Editor Nancy Briggs Shearer for taking care of logistics.

Ellen Yi-Luen Do is an Associate Professor of design computing and design cognition at Georgia Institute of Technology, with joint appointments in the College of Architecture and the College of Computing. She is also an affiliate faculty member at the Health Systems Institute, GVU Center, and the Center for Music Technology. Dr. Do is interested in building better tools for people, from understanding the human intelligence involved in the design process to improving the interfaces between people and computers. Her research explores new interaction modalities as well as the physical and virtual worlds that push the current boundaries of computing environments for design.

Mark D. Gross is a Professor of computational design in the School of Architecture at Carnegie Mellon University. Dr. Gross has worked on constraint programming environments for design, pen-based interaction, sketch and diagram recognition, tangible and embedded interfaces, and computationally enhanced design construction kits.

REFERENCES

Aish, R. (1979). 3D input for CAAD systems. Computer-Aided Design 11(2), 66–70.
Anderson, D., Frankel, J.M., Marks, J., Agarwala, A., Beardsley, P., Hodgins, J., Leigh, D., Ryall, K., Sullivan, E., & Yedidia, J.S. (2000). Tangible interaction + graphical interpretation: a new approach to 3D modeling. Proc. 27th Conf. Computer Graphics and Interactive Techniques (SIGGRAPH), pp. 393–402.
Bimber, O., Encarnação, L.M., & Schmalstieg, D. (2003). The virtual showcase as a new platform for augmented reality digital storytelling. Proc. EuroGraphics Workshop on Virtual Environments (EGVE), pp. 87–95.
Buechley, L., Eisenberg, M., Catchen, J., & Crockett, A. (2008). The LilyPad Arduino: using computational textiles to investigate engagement, aesthetics, and diversity in computer science education. Proc. 26th Conf. Human Factors in Computing Systems (SIGCHI), pp. 423–432.
Camarata, K., Do, E.Y.-L., Johnson, B.R., & Gross, M.D. (2002). Navigational blocks: navigating information space with tangible media. Proc. 7th Int. Conf. Intelligent User Interfaces (IUI), pp. 31–38.
Dourish, P. (2001). Where the Action Is: The Foundations of Embodied Interaction. Cambridge, MA: MIT Press.
Fitzmaurice, G.W., Ishii, H., & Buxton, W. (1995). Bricks: laying the foundations for graspable user interfaces. Proc. 13th Conf. Human Factors in Computing Systems (SIGCHI), pp. 442–449.
Frazer, J.H., Frazer, J.M., & Frazer, P.A. (1980). New developments in intelligent modeling. Proc. Computer Graphics 80, pp. 139–154.
Frens, J.W. (2006). A rich user interface for a digital camera. Personal and Ubiquitous Computing 10(2–3), 177–180.
Gershenfeld, N. (1999). When Things Start to Think. New York: Henry Holt & Co.
Gershenfeld, N. (2005). FAB: The Coming Revolution on Your Desktop—From Personal Computers to Personal Fabrication. New York: Basic Books.
Goldstein, S.C., Campbell, J.D., & Mowry, T.C. (2005). Programmable matter. IEEE Computer 38(6), 99–101.
Ishii, H., & Ullmer, B. (1997). Tangible bits: towards seamless interfaces between people, bits and atoms. Proc. 15th Conf. Human Factors in Computing Systems (SIGCHI), pp. 22–27.
Jordà, S., Geiger, G., Alonso, M., & Kaltenbrunner, M. (2007). The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. Proc. 1st Int. Conf. Tangible and Embedded Interaction (TEI), pp. 139–146. Accessed at http://mtg.upf.edu/reactable/
Matsumoto, T., Horiguchi, D., Nakashima, S., & Okude, N. (2006). Z-Agon: mobile multi-display browser cube. Proc. Extended Abstracts Human Factors in Computing Systems (SIGCHI), pp. 351–356.
Shneiderman, B. (1983). Direct manipulation: a step beyond programming languages. IEEE Computer 16(8), 57–69.
Sutphen, S., Sharlin, E., Watson, B., & Frazer, J. (2000). Reviving a tangible interface affording 3D spatial interaction. Proc. 11th Western Canadian Computer Graphics Symp., pp. 155–166.
Watanabe, R., Itoh, Y., Asai, M., Kitamura, Y., Kishino, F., & Kikuchi, H. (2004). The soul of ActiveCube—implementing a flexible, multimodal, three-dimensional spatial tangible interface. Proc. Int. Conf. Advances in Computer Entertainment Technology (ACE), pp. 173–180.