My Dad sent me this video today. Apparently it's been doing the rounds since 2009, but I'd not seen it. The video is from the TV show Ukraine's Got Talent and contains eight and a half minutes of astounding 'sand animation' by Kseniya Simonova.
Take a little break and watch the performance of 'The Great Patriotic War' here. Link to the large-size video, which I recommend. It's 8:30 long.
There are three exceptional things about this video. First, it's great art: an enthralling performance, emotional themes, beautiful imagery. Second, the performance itself is technically amazing, yet apparently the artist has only been doing this for one year. Finally, all of this is achieved with some plain sand on a flat (backlit) surface. The tools for art don't get much simpler.
And yet… this is exactly the type of real-time, subtle, organic, sensual and fast art I always imagine computers could be capable of. Unlike many swooshy multitouch demos, this is not art for art's sake; instead the animation covers very human topics: one in every four people in the region died on WWII's Eastern Front. And she's using every last creative aspect of sand, from brushing to finger and palm painting, throwing sand and scraping with the edge of her palm.
Two Hands are Better Than One
So this is how great it can be with some sand. How about some silicon? Matt Gemmell wrote a great piece on iPad application design I enjoyed. On the topic of the iPad's large, multitouch area, he writes…
The important point is that there are other, more obvious ways to accomplish these things; the two-handed input features are conveniences and power-user features. They’re useful and time-saving and possibly discoverable, but they’re not the only way to accomplish those tasks. We’re only just beginning to come to terms with the possibilities of dual-handed input; essential functionality shouldn’t require it yet.
You can see in the video that Kseniya rarely uses two hands. My stopwatch recorded only 1:15 of two-handed use in the eight-and-a-half-minute performance; that is, she uses two hands simultaneously only about 15% of the time. When she does, it's to do something quickly, like clearing an area. She also seems to use two hands when she wants to draw symmetrically, like the hair at 3:43.
The matter is not that simple, though. Many times she switches hands during the performance because she wants to draw on the far left (she appears right-handed), because she wants a particular shape, or because she needs to approach from a particular side.
Sometimes she switches for speed and artistic effect, alternating left and right throws.
Just the Tip(s) of the Iceberg
I love this video because of the richness of the interaction. It's an encyclopaedia of gestures, from single-finger painting to multi-finger dabbing, parallel lines with thumbs and middle finger, French-curve arcs with a palm, broad erasing strokes with the whole hand, and intricate air-brush effects with sand released from above. I agree with Matt: we are at the beginning of this whole wonderful adventure. I'm going to keep Kseniya's performance in mind as something to strive for. This is a great interface.
"It's something that we used to point at objects on a computer screen"
"Just one thing at a time?"
"Yes honey."
"Wow! But how did you do this?"
[She resizes a square with two fingers and then touches the others to propagate the change]
"Well…in the past it was different. First you need to select all the objects you were interested in, by clicking in a space nearby, then dragging an imaginary rubber band around them all. If they weren't next to one another, then you needed to hold down Command on the keyboard while you clicked on each one. Then you would adjust the size of them with a separate control panel at the side of the screen. Or you might size one how you want, then press Command+C to copy, then Command+V to paste the squares…… are you listening?"
"No, sorry Daddy, that's all too technical for me. I don't know how you remembered all that in the old days!"
In the future, our children will all use rich multi-touch devices. They will look at the mouse & keyboard combination in the same way we today look at the Command Line Interface.
UPDATE: I got it wrong. But I think the trend is right.
Tomorrow…
Apple is rumoured to be announcing a new Tablet device. You probably know this. Rumours of it being shiny and thin (which it probably will be). How it will be always connected to the internet, and show you books and newspapers and movies on demand (which it probably can). How it will have some magical new jaw-dropping interface (which it probably will have).
But what excites me most is a possible feature that no one seems to have thought of. It's not sexy, and it's something we use every day on our desktop machines. In fact you probably can't remember computing without it. And yet, I feel it's the key to the future of computing, and without it, the Tablet will not be able to spawn the New Age of Computing. So what's this amazing technology? I'll tell you: mouseOver. You know, the feature whereby links on a page change when you mouse over them, buttons darken and tooltips appear. The subtle interaction that lets you learn more about an interface without committing to anything as serious as a mouse click.
Of course, the Tablet is all about Multitouch (insert choirs of angels) so there's no mouse to be seen. Just a finger or three. So let's call it 'touchOver'. Imagine icons that darken, lighten and pop out as you hover your finger over them, like a tantalising box of fancy chocolates.
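If you're a web developer, you can picture it as a hover state driven by a finger instead of a cursor. Here's a tiny, purely illustrative sketch, assuming hardware and a browser that report a hovering fingertip through the standard Pointer Events API; the selector and class name are invented.

```typescript
// A minimal, hypothetical sketch of 'touchOver' feedback in a web UI.
// Assumes the device reports a hovering fingertip via Pointer Events;
// ".app-icon" and "touch-over" are made-up names for illustration.
const icons = document.querySelectorAll<HTMLElement>(".app-icon");

icons.forEach((icon) => {
  // Highlight while a finger (or mouse) hovers over the icon...
  icon.addEventListener("pointerover", () => icon.classList.add("touch-over"));
  // ...and remove the highlight when it moves away.
  icon.addEventListener("pointerout", () => icon.classList.remove("touch-over"));
});
```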
So, why bother to include an interaction feature from the past?
First, let's look at the existing benefits of mouseOver in desktop and web applications:
Users feel more comfortable with unfamiliar interfaces, exploring without the commitment of clicking
The user has feedback which helps them "aim" their cursor
Both of these are valuable. But in the history of multi-touch interfaces I've seen little mention of support for the touchscreen equivalent of mouseOver. I don't know why. Maybe it has been technically difficult to cleanly detect fingertip positions as they hover over a touch surface. Maybe the interaction design was never solved. Maybe I've been looking in the wrong places. Maybe it wasn't deemed necessary.
[0095]Another potential technique for the determination between "hovering" and "touching" is to temporally model the "shadow" region (e.g., light impeded region of the display). In one embodiment, when the user is typically touching the display then the end of the shadow will typically remain stationary for a period of time, which may be used as a basis, at least in part, of "touching". In another embodiment, the shadow will typically enlarge as the pointing device approaches the display and shrinks as the pointing device recedes from the display, where the general time between enlarging and receding may be used as a basis, at least in part, of "touching"…
…where it seems that Apple now has the technology, the art and the desire to achieve touchOver. Their patent in essence describes an artificially drawn 'shadow' of each fingertip as it hovers over the interface. Here's a very quick mockup I made of how this might look, as applied to the iPhone.
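Setting the visual mockup aside for a moment, the hover-versus-touch logic in the patent excerpt is easy to sketch in code. This is only my reading of the second embodiment (the shadow grows as the finger approaches and shrinks as it recedes, so the turnaround marks the touch); the types, names and threshold below are all invented for illustration.

```typescript
// Rough sketch of a temporal hover-vs-touch heuristic: find the frame where
// the fingertip's "shadow" stops growing and starts shrinking, and treat that
// turnaround as the moment of touch. All names and thresholds are illustrative.
interface ShadowFrame {
  timestampMs: number;
  area: number; // area of the light-impeded region, in sensor pixels
}

function estimateTouchTime(frames: ShadowFrame[], minDelta = 5): number | null {
  for (let i = 1; i < frames.length - 1; i++) {
    const growing = frames[i].area - frames[i - 1].area > minDelta;
    const shrinking = frames[i + 1].area - frames[i].area < -minDelta;
    // A clear grow-then-shrink turnaround suggests the finger met the surface.
    if (growing && shrinking) return frames[i].timestampMs;
  }
  return null; // still hovering, or no decisive turnaround seen yet
}
```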
First, I think this will make the touchscreen user experience even better: fewer mis-tapped buttons, because you have a greater sense of where the device 'thinks' your finger is, and more accurate detection of taps, because the device knows about your finger position even before you tap.
Secondly, and more importantly, it serves as a stepping stone to a multitouch proxy device.
What do I mean by 'proxy device'? Take the mouse, for example. You can see a 'mirror' of the physical mouse on the screen at all times (the cursor) that lets you interact without looking at the physical device.
For a multitouch tablet to replace, or at least augment, the mechanical keyboard and mouse, there needs to be a way to keep your eyes on the screen at all times. I know of at least one device that works this way, the Tactapad by Tactiva (never released commercially).
You can watch a movie of the Tactapad in action here. The Tactapad uses a video camera looking down on the user's hands to generate an artificial silhouette. A sufficiently advanced multitouch trackpad could generate an even more minimalist, cleaner version. Note: I'm not saying Apple would mimic the tool workflow of the Tactapad, simply that they'd share the idea of proxy manipulation.
The end result is the same.
A device that brings all the benefits of a dynamic multitouch interface to the desktop computing experience.
Caveats
"But touchscreens are so finicky!"
Lay your palm down on many touchscreens and it will be incorrectly registered as a touch event. Other Apple patents describe logic to rule this out. In addition, they boast the ability to switch between a 'typing' mode (all fingers down), a 'pointing' mode (one finger down) and a 'drawing' mode (three fingers down, like holding an imaginary pencil). It may be a solvable problem.
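For illustration, that mode-switching idea boils down to something as simple as the hypothetical sketch below; the finger-count mapping follows the description above, and everything else is invented.

```typescript
// Hypothetical sketch of mode switching by fingertip count: all fingers down
// means typing, one means pointing, three means drawing (like holding an
// imaginary pencil). The function and type names are made up for illustration.
type InputMode = "typing" | "pointing" | "drawing" | "idle";

function classifyMode(contactCount: number): InputMode {
  if (contactCount === 0) return "idle";
  if (contactCount === 1) return "pointing";
  if (contactCount === 3) return "drawing";  // like holding an imaginary pencil
  if (contactCount >= 4) return "typing";    // whole hand resting, ready to type
  return "pointing";                         // fall back for ambiguous counts
}
```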
"I'd get tired holding my fingers up all day"
Yes, you wouldn't want to hold your fingers 1cm above the desk all day long. I'm sure there is some solution. See above.
"But what about haptics/force feedback?"
Yes, haptics/force feedback may help you 'feel' your way around an interface without looking. I've been lucky enough to play with some lab-quality (read: $$$) haptic interfaces and agree that it's completely possible to emulate the feel of pressing a physical button or pushing around a lump of clay. But those devices were neither cheap, light nor low-power. I'm looking forward to sophisticated haptics in our everyday devices as much as you are, but in some years' time.
"I'd never give up programming on my trusty IBM mechanical clunkity-clunk keyboard."
Maybe writers and programmers will stick to using mechanical keyboards forever. Maybe we'll always keep a mechanical keyboard handy. But it will get harder to resist the appeal of a device where everything is under your fingertips… imagine, for example, a Swype-like input interface that dynamically changes its dictionary depending on what application, or even what part of a line of code, you're currently typing in. A truly context-aware device, done in a subtle and sensible way.
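To make that concrete, here's a deliberately naive, hypothetical sketch of a context-aware dictionary switch; the application identifiers and word lists are invented, and a real implementation would obviously be far richer.

```typescript
// Hypothetical sketch: pick the word list offered to a Swype-like keyboard
// based on the frontmost application. All identifiers and data are invented.
const dictionaries: Record<string, string[]> = {
  "code-editor": ["function", "return", "const", "interface"],
  "mail": ["regards", "meeting", "attached", "tomorrow"],
  "default": ["the", "and", "hello", "thanks"],
};

function dictionaryFor(appId: string): string[] {
  // Fall back to a general-purpose dictionary for unknown applications.
  return dictionaries[appId] ?? dictionaries["default"];
}
```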
"Why hasn't someone done it before?"
Hehe. They said that to the Wright brothers too. Actually, I'd love to mock this up using something like Keymote for iPhone, but it's very difficult without touchOver-like functionality.
And yes, Apple predictions are folly. But from my perspective it's simply a question of 'when' and 'by whom', and the answers, I think, are 'soon' and 'Apple'.
Past. Present. Future.
Here's the bit where I'd love your help: have you seen any examples of touchscreen interfaces working with touchOver-like capability? How did they work? What other problems do you envision?
Is touchOver essential to a rich desktop multitouch experience? I love the fluidity of interfaces like this multi-touch puppetry (via Bill Buxton) and think touchOver will be essential to bring rich interaction like this to mainstream computing. Let me know. :)
This video demos SpaceClaim's upcoming multitouch-enabled feature for their 3D CAD system. Some of these gestures, like the 'two fingers to anchor, one to control', I first saw in Jeff Han's work. It makes for a great-looking demo, but it would only be sustainable with a tablet PC or something like a Cintiq before your arms fall off. From my perspective, the solution will be a replacement device for the keyboard, where your hands are not interacting directly with the screen but are one abstraction away, like with a mouse, and that's quite the UI design challenge.
Here is an example of a touch table based on open-source software. This kind of interface lets many users interact simultaneously. I like the round form factor of this example.
Here's the expensive Microsoft version, 'Surface', being covered by mainstream news. It could make for a nice, if expensive, coffee table.