hot tech / Gail Balfour
See me, feel me
Touch screens only tap the surface of human-computer interfaces. A better experience is coming.
Large display lab at Queen’s University’s Human Media Laboratory
If you have a tablet computer, you probably remember how quickly it became second nature to interact with it hands on. Soon you were swiping the screen, collapsing and expanding text with the tip of your finger and wondering how you ever got by without it. But this mode of interaction is just the beginning—and despite their raging popularity, touch screens are actually not very sophisticated or natural. Many researchers believe the computers of the near future will be able to respond to your eye movements, gestures and body language, and read your tone of voice for urgency and respond in a way that best fits the situation. They will take on 3-D organic forms and may even change shape to adapt to their environment or task.
Do what comes naturally
“Think about it. If you want to pick up a pen, you just pick it up. You don’t issue a command to pick up a pen, and then have something else pick it up. ‘Tap and hold’ is not something we do in the real world,”
said Chris Harrison, a Ph.D. candidate in the Human-Computer Interaction Institute at Carnegie Mellon University in Pittsburgh. Most interfaces still require a “decoupling” to take place before an action is carried out: clicking on an icon or issuing a command. What people currently do with touch screens is not much more than poking and swiping, he said. “But our hands can do a lot more than poke at things. Think about how difficult it would be to get ready in the morning if you had to poke at your clothes to put them on, or poke at your buttons to do them up.”
Harrison is researching ways in which different parts of the hand, such as knuckles or fingernails, could invoke different types of commands.
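To make that idea concrete, here is a minimal sketch of what such an interaction model might look like: one surface, several command vocabularies, selected by which part of the hand made contact. The type and command names below are hypothetical, and a real prototype would classify the touch from sensor data (such as a tap’s vibration signature) rather than receive it ready-made.

```typescript
// Illustrative sketch only: map each (hypothetical) touch type to its
// own command, so a single surface supports several distinct actions.
type TouchType = "fingertip" | "knuckle" | "fingernail";

const commands: Record<TouchType, () => void> = {
  fingertip:  () => console.log("select"),       // the ordinary tap
  knuckle:    () => console.log("context menu"), // a "right-click" analogue
  fingernail: () => console.log("annotate"),     // a third, distinct action
};

function onTap(touch: TouchType): void {
  // Dispatch on HOW the screen was touched, not just where.
  commands[touch]();
}

onTap("knuckle"); // -> "context menu"
```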
Another area of his research looks at turning a small computing device into a projection computer by using a combination of inputs (including vibration and measurement of gestures via small cameras) to turn any object into an extension of your device. “So one day I might be able to put my phone down at Starbucks and the whole table becomes interactive. And then I’ll pick my phone up and walk away when I’m done.” This project, called OmniTouch, is exploring a wearable system that enables graphical, interactive and multi-touch input on arbitrary, everyday surfaces. You can even project the computer right onto your own skin, so that your hand, arm or leg essentially becomes your device, because “you are not going to walk around carrying a table with you,” Harrison said.
The shift to mobile has already altered how Web sites are designed. The biggest change has been the demand for responsive Web design (RWD), said Scott Christie, principal at Christie Stewart, a Toronto-based marketing design firm. RWD refers to an approach in which a site is crafted to provide an optimal viewing experience on any screen: easy reading and navigation with a minimum of resizing or scrolling. Another big change is that Web sites are getting smaller and simpler: thanks to robust social media sites, rich content can now live elsewhere.
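The article names RWD without showing it, so here is a minimal sketch of the underlying idea, assuming a browser environment and hypothetical class names: the page queries the viewport and adapts its own layout, instead of making the reader pinch, zoom or scroll sideways. Production sites usually express the same breakpoints directly in CSS media queries.

```typescript
// Minimal RWD sketch (assumptions: browser environment, hypothetical
// "single-column" / "multi-column" class names).
const narrow = window.matchMedia("(max-width: 600px)"); // one breakpoint

function applyLayout(mq: MediaQueryList | MediaQueryListEvent): void {
  // Swap between a phone-friendly single column and a wider layout.
  document.body.classList.toggle("single-column", mq.matches);
  document.body.classList.toggle("multi-column", !mq.matches);
}

applyLayout(narrow);                            // set layout on load
narrow.addEventListener("change", applyLayout); // re-apply on resize or rotate
```

The single 600-pixel breakpoint is arbitrary; real responsive designs define several, each restyling typography, navigation and images for the screen at hand.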