Defining the Next Generation of Reality-Based Interfaces
Friday, February 28, 2014
Dr. Michael Poor, Baylor University
With the exponential development of hardware and software over the past few decades, a broad range of new interaction styles has emerged. The majority of these styles diverge from Direct Manipulation or the “window, icon, menu, pointing device” (WIMP) model of interaction. Alternatives such as tangible user interfaces, context-aware interfaces, and other novel systems have been proposed, but none has been adopted as the “next” user interface or even considered a viable alternative to the GUI. Researchers have observed that these proposed paradigms all rely on reality-based interaction techniques. This observation prompted an investigation into whether the inclusion of reality-based interaction techniques affects a user’s experience. I will begin by addressing how next-generation interfaces have already intertwined themselves into our everyday lives and how they will continue to do so in the future. I will then cover the Reality-Based Interaction framework, which holds that the more an interaction style incorporates actions based on users’ knowledge of the real world, the easier it will be for them to interact with a new system; the longer users have known these actions, the more readily they can call upon that knowledge. Finally, I will discuss how researchers are lending weight to this theory by investigating whether the amount of reality-based interaction embedded in a system can improve a subject’s overall performance. The four interaction styles investigated are a GUI, a 3D interactive character, a VR/AR interactive character, and a life-like animatronic.
Monday, March 3, 2014
Serra Hall, Room 210
2:30 p.m. to 3:25 p.m.
Maria Cristina Manabat