The Touch Experience

A few weeks ago, John White interviewed me for a white paper that he is doing on touch. I didn’t think I would have much to say – then he started asking questions… it appears that I do have an opinion on most everything. So when I was thinking of topics for this year’s Southern California User Experience Camp, I contacted John for the interview transcript and developed a few slides to facilitate a discussion for a session. Here are some thoughts that I shared at SoCal UX Camp.

When the mouse was first introduced, people picked it up and placed it on the screen… The basic human instinct was to “touch the thing I want to manipulate.” That’s how we think, right? If I want to manipulate something, I touch it.

Touch is immediate – there's no abstraction. When there's a device (mouse, joystick, roller ball) between you and the object you want to manipulate, that device introduces a level of abstraction you have to overcome.

With touch, there is no mechanical interface to learn… You have been doing things with your fingers for a while. But there are new gestures to learn:

[Image: touch gesture reference]

Source: http://www.lukew.com/ff/entry.asp?1071

And more…

 

[Image: more gesture examples]

Source: http://productmanagement.geoffreyemery.com/?p=15
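For developers, many of these gestures aren't handed to you as ready-made events; basic ones like tap and swipe are typically recognized from lower-level pointer input. Here is a minimal sketch of that idea using the standard browser Pointer Events API in TypeScript – the element id ("touch-area") and the pixel thresholds are hypothetical example values, not recommendations:

  // Illustrative sketch: telling a tap apart from a horizontal swipe
  // using the standard Pointer Events API. Thresholds are example values.
  const SWIPE_DISTANCE = 50;  // px of horizontal travel to count as a swipe
  const TAP_MAX_TRAVEL = 10;  // px of travel still considered a tap

  let startX = 0;
  let startY = 0;

  const target = document.getElementById("touch-area"); // hypothetical element

  target?.addEventListener("pointerdown", (e: PointerEvent) => {
    startX = e.clientX;
    startY = e.clientY;
  });

  target?.addEventListener("pointerup", (e: PointerEvent) => {
    const dx = e.clientX - startX;
    const dy = e.clientY - startY;

    if (Math.abs(dx) >= SWIPE_DISTANCE && Math.abs(dx) > Math.abs(dy)) {
      console.log(dx > 0 ? "swipe right" : "swipe left");
    } else if (Math.abs(dx) <= TAP_MAX_TRAVEL && Math.abs(dy) <= TAP_MAX_TRAVEL) {
      console.log("tap");
    }
  });

The point of the sketch is simply that every gesture in those references is, underneath, a pattern of touches that someone decided to recognize – which is exactly why users have to learn them.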

Apple introduced touch to the phone market in 2007. Touch had existed for many years, but Apple applied it to the phone, and touch changed the phone experience. Prior to touch, people had used a stylus to touch things on the phone screen. With the stylus, you had to learn a tool instead of just using your finger… but you can read all about that in the first chapter of The Customer Experience Revolution.

And touch isn't for everything – clicking on individual cells in a spreadsheet, for example. There are some use cases for which touch may never be right:

  • Accessibility
  • Productivity apps (spreadsheets)
  • Data entry (forms)
  • Keyboard shortcuts

As we move forward, I expect more voice-to-text usage will replace the need for keyboards and touch. And advances in holographic, wearable, and embedded (in us) technology will bring new ways of thinking about these interactions and experiences, too.

And those are my thoughts on touch… What are yours?