2010 Horizon Report Wiki
2010 Short List Gesture-Based Computing
2010 Final Topic and 2010 Short List
Time-to-Adoption: Four to Five Years
New topic in 2010.
Devices that can accept multiple simultaneous inputs (like using two fingers on the Apple iPhone or the Microsoft Surface to zoom in or out) and gesture-based inputs like those used on the Nintendo Wii have begun to change the way we interact with computers. We are seeing a gradual shift towards interfaces that adapt to—or are built for—humans and human gestures. The idea that natural, comfortable movements can be used to control computers is opening the way to a host of input devices that look and feel very different from the keyboard and mouse.
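The two-finger zoom described above reduces to simple geometry: the interface scales content by the ratio of the current to the previous distance between the two touch points. A minimal sketch in Python (function and variable names are illustrative, not taken from any particular touch API):

```python
import math

def pinch_zoom_factor(prev_touches, curr_touches):
    """Compute a zoom factor from two simultaneous touch points.

    Each argument is a pair of (x, y) tuples. The factor is the ratio
    of finger separation now to finger separation before: > 1 means
    the fingers moved apart (zoom in), < 1 means a pinch (zoom out).
    """
    def distance(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    prev = distance(prev_touches)
    if prev == 0:
        return 1.0  # fingers coincide; avoid division by zero
    return distance(curr_touches) / prev

# Fingers start 100 px apart and spread to 200 px: zoom factor of 2.0
factor = pinch_zoom_factor([(0, 0), (100, 0)], [(0, 0), (200, 0)])
```

A real touch interface would apply this factor to the view's scale on every frame of the gesture; the geometry, however, is no more complicated than this.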
Gesture-based computing allows users to engage in virtual activities with motion and movement similar to what they would use in the real world. Content is manipulated intuitively, making it much easier to interact with, particularly for the very young or for those with poor motor control. The intuitive feel of gesture-based computing is leading to new kinds of teaching or training simulations that look, feel, and operate almost exactly like their real-world counterparts. Larger multi-touch displays support collaborative work, allowing multiple users to interact with content simultaneously, unlike a single-user mouse.
Relevance for Teaching, Learning & Creative Expression
Researchers at Georgia Tech have developed gesture-based games designed to help deaf children build language skills at the critical time of language development.
Using existing off-the-shelf technologies, MIT's SixthSense project provides a gesture interface that overlays digital information onto real-world spaces.
After discovering that surgeons-in-training who played with the Wii showed a significant (48%) improvement in dexterity, researchers are developing a set of Wii-based medical training materials.
A number of mobile applications use gestures. Mover lets users flick files from one phone to another; Shut Up, an app from Nokia, silences the phone when the user turns it upside down; nAlertme, an antitheft app, sounds an alarm if the phone isn't shaken in a specific, preset way.
As an assignment, several graduate students at Carnegie Mellon University created a virtual snowball fight using PC software and components of Nintendo's Wii.
Microsoft's new Project Natal, similar to Nintendo's Wii, engages full-body movement.
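Apps like Shut Up and nAlertme work by recognizing motion patterns in the phone's accelerometer stream. A minimal sketch of shake detection (the threshold and sample format are illustrative assumptions, not taken from any of the apps above):

```python
import math

GRAVITY = 9.81           # m/s^2 magnitude a resting phone reads
SHAKE_THRESHOLD = 12.0   # m/s^2 beyond which a reading counts as a jolt (illustrative)
MIN_JOLTS = 3            # how many jolts constitute a shake gesture

def detect_shake(samples):
    """Return True if accelerometer samples contain a shake gesture.

    `samples` is a list of (ax, ay, az) readings in m/s^2. A shake is
    registered when the acceleration magnitude exceeds the threshold
    at least MIN_JOLTS times in the window.
    """
    jolts = sum(
        1 for ax, ay, az in samples
        if math.sqrt(ax * ax + ay * ay + az * az) > SHAKE_THRESHOLD
    )
    return jolts >= MIN_JOLTS

# A phone lying still reads roughly (0, 0, GRAVITY) and triggers nothing;
# vigorous movement pushes the magnitude well past the threshold.
resting = detect_shake([(0.0, 0.0, GRAVITY)] * 10)   # False
shaken = detect_shake([(15.0, 0.0, GRAVITY)] * 5)    # True
```

A production app would also check the *pattern* of jolts (direction, timing) so that, as with nAlertme's preset shake, only the right gesture disarms it.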
For Further Reading
University offers new technology to help students study (1 October 2009.) The Mathewson-IGT Knowledge Center at the University of Nevada in Reno purchased two Microsoft Surfaces. In addition to maps and games, the university added an anatomy study guide.
Why Desktop Touch Screens Don't Really Work Well For Humans (The Washington Post, 12 October 2009.) A desktop touch screen isn't comfortable: a more ergonomic design, like an architect's drafting board, would relieve arm fatigue.
Horizon Project Wiki · Creative Commons License
The New Media Consortium is an international 501(c)(3) not-for-profit consortium of hundreds of learning-focused organizations dedicated to the exploration and use of new media and new technologies.