Gesture-Based Computing


**Time-to-Adoption Horizon: Four to Five Years**
"Thanks in part to the Nintendo Wii, the Apple iPhone and the iPad, many people now have some immediate experience with gesture-based computing as a means for interacting with a computer. The proliferation of games and devices that incorporate easy and intuitive gestural interactions will certainly continue, bringing with it a new era of user interface design that moves well beyond the keyboard and mouse. While the full realization of the potential of gesture-based computing remains several years away, especially in education, its significance cannot be underestimated, especially for a new generation of students accustomed to touching, tapping, swiping, jumping, and moving as a means of engaging with information." (2011 Horizon Report)

According to Wikipedia.org, gesture-based computing derives from gesture recognition, a topic in computer science and language technology that aims to interpret human gestures through mathematical algorithms. Gestures can originate from any bodily motion or state, most commonly from the face or hands, which makes gesture recognition useful for capturing information that humans convey without speech or typing. It enables computers to understand human body language, creating a more direct connection between humans and machines. As the field matures, we may witness the demise of input devices such as the mouse, keyboard, and touchscreen.
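To make the idea of interpreting gestures through mathematical algorithms a bit more concrete, here is a minimal, purely illustrative sketch (not taken from the Horizon Report or Wikipedia) that classifies a swipe direction from a recorded series of 2D pointer or hand positions. The function name and distance threshold are hypothetical, and real gesture-recognition systems use far richer models (for example, hidden Markov models or neural networks over camera or depth-sensor data).

```python
# Illustrative sketch only: classify a swipe gesture from a list of (x, y) positions.
# The threshold and function name are hypothetical, not from any real gesture API.

def classify_swipe(points, min_distance=50.0):
    """Return 'left', 'right', 'up', 'down', or 'none' for a sequence of (x, y) points."""
    if len(points) < 2:
        return "none"

    dx = points[-1][0] - points[0][0]   # total horizontal movement
    dy = points[-1][1] - points[0][1]   # total vertical movement

    # Ignore tiny movements that are probably jitter, not a deliberate gesture.
    if max(abs(dx), abs(dy)) < min_distance:
        return "none"

    # The dominant axis of motion decides the gesture.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"   # screen coordinates: y grows downward


if __name__ == "__main__":
    # A roughly horizontal, rightward drag sampled over time.
    trace = [(10, 100), (40, 102), (90, 98), (160, 101)]
    print(classify_swipe(trace))  # -> "right"
```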

Uses

The list below, from Wikipedia.org, shows various applications of gesture recognition:
  • Sign language recognition
  • Socially assistive robotics
  • Directional indication through pointing
  • Control through facial gestures
  • Alternative computer interfaces
  • Immersive game technology
  • Virtual controllers
  • Affective computing
  • Remote control

Input Devices

The following list from Wikipedia.org includes some input devices used in gesture-based environments:
  • Wired gloves
  • Depth-aware cameras
  • Stereo cameras
  • Controller-based gestures
  • Single camera

The Big Idea in Education


According to the 2011 Horizon Report, gesture-based computing is one of the key trends to watch in education technology. Although the report states that it is still four or five years away from mainstream classroom adoption, its promise is already evident in current devices and gaming systems such as the Xbox Kinect and SixthSense. Gaming companies are already teaming up with reputable children's television programs that support this type of embodied learning, which should lead to further advances in the future. On October 18, Microsoft announced that the Xbox 360 would team up with Sesame Street and Nat Geo Wild to provide "playful learning" by bringing the shows to the Kinect device. Alex Games, educational design director at Microsoft, states, "we can encourage kids to use their motor skills and to learn using their body in immersive experiences." The partnership combines television and play to promote a different level of engagement, with characters interacting "more accurately, gauging for example if a child gets the wrong answer to a question that's posed".

In "A Big Step for Gesture-Based Learning? Kinect Connects with Sesame Street", Audrey Watters writes that Microsoft is now filming interactive TV shows that "seek to inspire kids and their parents to get off the couch and into the action, working cooperatively with their favorite characters to have fun and learn at the same time." This is definitely a win-win situation. As teachers begin to learn more about the opportunities the Kinect offers in the classroom, they will need continued support, which is already available through KinectEDucation. The website includes several articles, guides, and lesson plans as new developments emerge. One of the articles highlights what is needed for Kinect to be successful in the classroom.

The following ideas, shared by teachers in 3 Things Kinect Needs to Be Successful, briefly outline what may work for effective integration:
  1. Reformed classroom model - the traditional classroom can be transformed into an active learning center with simple movement of desks and/or tables
  2. Software: User-generated or commercial - education stakeholders have the opportunity to design the software in order to target standards
  3. Paradigm shift - once Kinect is introduced as an input device rather than just a gaming device, much of the skepticism will disappear

The following video was created by KinectEDucation to show the benefits of the system in the classroom.


Another article from their website shares 5 Benefits of Using Kinect in Education. The benefits are:
  1. Facilitate research-supported learning - active learning will increase the academic and social performance of students
  2. Seamlessly integrate technology - the enriched content will hide most of the actual technology
  3. Embrace cultural diversity - global connections with other students (currently piloting in Africa)
  4. Establish content relevancy - augmented labs will provide more opportunities for those who are unable to access virtual labs and labs at school
  5. Explore new environments - varied content is explored

And lastly, the article 5 Reasons Kinect Will Succeed in Education describes how Kinect will become a technology focal point in education, citing various research. The reasons are as follows:
  1. Content relevancy amplified - watch the video below to eradicate any doubts
  2. Minimize negative influences on the education system - maintaining accountability through alignment with the most effective learning strategies
  3. Engages the learner through movement - students become the controller while engaging in higher-order thinking
  4. Inexpensive and advanced technology - a one-time purchase, with software from the open-source community
  5. Acts as an input device first, gaming device second - the biggest hurdle educators are trying to clear



Watch the video below from social elearning for a glimpse of how gesture-based learning is developing in Second Life via Kinect.



Below is a video created by a student from the NetGen Education Project (another collaboration from the Flat Classroom Project).



Listen to another great TED Talk from John Underkoffler, recorded at TED2010. He shares how gesture-based computing supports inputs and outputs in the same space.


Below is a slide show of screenshots and images related to this topic from the 2010 NMC Horizon Report (http://horizon.nmc.org/), shown as part of the presentation at the release of the report at the EDUCAUSE ELI Conference, January 19, 2010, in Austin.




Next: Blended Learning