Digital Media Boot Camp

A group of people working through a design challenge

On Monday and Tuesday this week I hosted a Digital Media Boot Camp at Canada 3.0, a conference in Stratford, Ontario, on Canada’s digital media future. The goal was to come up with some concrete ideas about how to move forward on the vision of being able to do anything online in Canada by 2017, the country’s sesquicentennial year.

I was struck by a couple of things.

First was the diversity of people that showed up for what was essentially the “general public” stream of Canada 3.0. The range of experiences and backgrounds represented resulted in some great discussions.

The second was a recurring theme: access to the online world remains, in 2010, a real issue in Canada for a variety of reasons. Economic disparity, the urban/rural divide, fear, and inexperience all prevent full participation in Canada’s digital present. Ironically, technological barriers were on display at the conference itself, where both WiFi and mobile access were severely constrained.

(As an aside, there was also the irony, visible in the accompanying photo, of talking about a digital future using decidedly analogue markers and flip charts!)

There’s a lot of work to do to make that vision of the future a reality.

They built a faux iPad and they’re going to use it

Speaking of UI prototyping, have a look at how the folks at Omni approached designing for the iPad without having laid hands on one. Not only did they make great use of paper prototypes, they created a non-functional mockup of an iPad to help get a feel for the interaction on a physical device. This reminds me of Jeff Hawkins, founder of Palm, carrying around a crude wooden prototype of the original Palm Pilot as part of his design research into that product.

UI prototypes help explore, share, and validate a design. Going the extra mile to create a simulation of the device on which a software product will be used undoubtedly contributes to a successful design.

I built a ray gun and I’m going to use it

This month’s upcoming UX Group meeting, a UX ‘show and tell’ for artifacts developed in support of creating a user experience, has me thinking about UI prototypes.

Creating UI prototypes is an important part of the design process. Whether built using pen and paper, dedicated prototyping software tools, image-editing software like Adobe Photoshop, or plain old HTML, a UI prototype makes the design concrete and helps to build a shared understanding of what a product’s user experience will be like.

At one end of the prototyping fidelity scale is a paper prototype, which has the great merits of being inexpensive, easy to create, and eminently disposable.

At the other end of the scale is what I like to call a ray gun. What’s a ray gun? To answer that, I’ll go back to my inspiration for this particular metaphor. One of the first science fiction books that I read when I was young was Tales from the White Hart, a collection of generally humourous short stories by Arthur C. Clarke. I haven’t read it in many years, but I have fond memories of it. (I’m not sure how accurate those memories are, though!)

One of the stories, “Armaments Race”, describes a competition between the makers of rival science fiction television programs to create impressive special effects for weapons. The details of the titular armaments race are quite entertaining as each program’s team unveils increasingly realistic simulations of ray guns. At this point I’ll add a warning for those of you who haven’t yet read “Armaments Race” that the next sentence is a spoiler, albeit one that is crucial to the point of this post! The punch line of the story is, in essence, that an actual functioning ray gun with real destructive power is built in the pursuit of a great simulation.

Metaphorically, then, a ray gun is a UI prototype that crosses a line into a functioning product. I have to admit that I’ve built more than one ray gun as a user experience designer. Depending on who you talk to, that’s either a good or a bad thing.

Abbott and Costello on team communication

Bud Abbott and Lou Costello clarify the baseball team’s lineup

I was talking with a colleague recently about the February UX Group meeting on guerrilla usability, and how one of my strategies has always been to try to work closely with developers and understand their concerns. She reminded me of a meeting that I had once called to reinforce the strong connection between a UX team and a development team.

When I worked at Platform Computing the user experience team had a pretty good relationship with various development teams. Nevertheless, issues occasionally arose where it wasn’t at all clear what the nature of the problem was, and further discussion only muddied the water. When that happened, one of the UX team would smile and interject with the apparent non sequitur “third base”. It was a bit of shorthand we used for situations where communication just wasn’t happening and folks were getting frustrated. Communication problems can make it seem like people or teams aren’t aligned, when really they’re just misunderstanding each other. Explanations to the developers as to what we meant by “third base”, though, weren’t any clearer!

I decided to share more clearly the source of the “third base” comment with the development team that we worked with most often. I called a meeting with our two teams and showed a video clip of Bud Abbott and Lou Costello’s famous Who’s On First comedy routine. (Here’s a version of it on YouTube.) None of the development team members had seen it previously, probably because the routine is many decades old and that whole team was made up of Chinese-Canadians who hadn’t been exposed to that corner of American culture. As it turns out, they all loved it, and remarked that the kind of wordplay on display has parallels in Chinese comedy. Watching the video together reminded us all that mis-communication is natural, and that we can work to overcome it while having fun building products together.

Perils of a gestural UI

The iPhone was Apple’s first product that leapt completely into the world of gestural interface; it was later followed by the similar iPod Touch. The recent iPad looks to build upon the success of those products. While the iPhone isn’t perfect, as I’ve written previously, it’s a wonderful product for me.

The company’s gestural endeavours aren’t confined to new product categories. Apple has also built multi-touch gestural trackpads into various models of MacBook. I’ve never made a lot of use of the extended capabilities of the trackpads — I found two-fingered scrolling to be pretty awkward (though rotation is fine for me).

While I hadn’t previously tried to analyze my response, I recently took a closer look and figured out what has thrown me about scrolling using the trackpad. To paraphrase Inigo Montoya in The Princess Bride, “You keep using that gesture. I do not think it means what you think it means.” Let me explain.

On the iPhone, dragging my finger on the display causes what’s visible on the screen to move in the direction that my finger is moving. A good example of this is seen in Safari, the iPhone’s web browser. If a web page doesn’t fit on the screen I can put my finger on the screen, drag it across the screen, and the web page moves with my finger. It’s as if the page were sitting on a table and I put a finger on the page to move it across the table in a particular direction. If I move my finger towards me, the page moves towards me — scrolling “up” on the screen. If I move my finger away from me, the page moves away from me — scrolling “down” on the screen.

The trackpad on my MacBook is different. Using Safari as an example again, when viewing a web page the entire page may not appear within the browser window. I can scroll the page in a few ways. I can use the cursor to move the scroll bar, or I can use the arrow keys on the keyboard to move the scroll bar. The key in both these cases is that I’m controlling the scroll bar, which in turn scrolls the page. A third way to scroll the page is via two-finger scrolling on the trackpad. Here is where things get interesting. If I move my fingers towards me, the page moves away from me — scrolling “down” on the screen. If I move my fingers away from me, the page moves towards me — scrolling “up” on the screen. These behaviours are the opposite of what’s happening on iPhone. The reason is that two-fingered trackpad scrolling is linked to moving the scroll bars rather than moving the page directly.
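The opposition between the two behaviours boils down to a sign flip in how a vertical gesture maps to the scroll position. Here’s a minimal sketch of that idea (the function and variable names are my own invention, purely for illustration, not Apple’s actual implementation):

```python
def scroll_offset_change(gesture_dy, direct_manipulation):
    """Map a vertical gesture delta to a change in scroll offset.

    gesture_dy > 0 means the finger(s) moved toward the user,
    i.e. "down" on the screen or trackpad. A positive return
    value means the view scrolls down (later content appears).

    direct_manipulation=True models the iPhone: the finger drags
    the page itself, so moving the finger down pulls earlier
    content into view (the view scrolls up).

    direct_manipulation=False models the MacBook trackpad as
    described above: the gesture drives the scroll bar, so moving
    the fingers down advances the document (the view scrolls down).
    """
    if direct_manipulation:
        return -gesture_dy  # the page follows the finger
    else:
        return gesture_dy   # the gesture moves the scroll bar

# The same downward gesture produces opposite results:
# iPhone-style:   scroll_offset_change(10, True)  -> -10 (scroll up)
# Trackpad-style: scroll_offset_change(10, False) ->  10 (scroll down)
```

The entire difference between the two paradigms is that single negation: one convention treats the gesture as grabbing the content, the other as operating the control.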

Moving back and forth between iPhone and Mac made it easier for me to finally identify the source of my trackpad scrolling discomfort.

This really feels like a collision between the historically dominant interaction paradigm as found in Mac OS X and Windows, and a new gestural paradigm as seen on iPhone. For me, when I’m gesturing to scroll I’m moving the page, not the UI control. iPhone supports that model. The MacBook trackpad doesn’t. The question I have is, how many more of these collisions will appear as Apple continues to build on its gestural UI (and, of course, as other companies add their own twists)?

UX guerrillas in our midst

This month’s meeting of the User Experience Group of Waterloo Region is a workshop presented by my friend and colleague Blair Nonnecke. It’s a Guerrilla Usability Workshop, and it will involve group work and plenty of opportunity to learn and share. If you’ve been out before, you’ll already know that the discussions can be quite rewarding. If you haven’t been out, or haven’t attended in a while, come on out and see what you’ve been missing.

It’s this Thursday February 18 at 5:30pm in the friendly confines of the Accelerator Centre. Please be sure to RSVP if you intend to be there, but even if you don’t get a chance to do so, come on out anyway.

The Apple iPad and ubiquitous computing

One of the things that strikes me about the iPad that Apple introduced last week is the name. It’s attracted a fair amount of derision and ridicule, just as the iPhone has drawn flak in some circles for its being such a closed, tightly controlled system.

(I should make it clear at this point that I myself love Apple’s products and have been using them for two decades.)

To me, though, the iPad name (and, by extension, the whole iPad/iPhone family of products) harkens back to work done at Xerox PARC in the late 1980s and early 1990s on ubiquitous computing. That term was coined by the late Mark Weiser and it’s an area that he helped define. Here’s a great prediction from Weiser in 1988:

For thirty years most interface design, and most computer design, has been headed down the path of the “dramatic” machine. Its highest ideal is to make a computer so exciting, so wonderful, so interesting, that we never want to be without it. A less-traveled path I call the “invisible”; its highest ideal is to make a computer so imbedded, so fitting, so natural, that we use it without even thinking about it. (I have also called this notion “Ubiquitous Computing”, and have placed its origins in post-modernism.) I believe that in the next twenty years the second path will come to dominate. But this will not be easy; very little of our current systems infrastructure will survive. We have been building versions of the infrastructure-to-come at PARC for the past four years, in the form of inch-, foot-, and yard-sized computers we call Tabs, Pads, and Boards. Our prototypes have sometimes succeeded, but more often failed to be invisible. From what we have learned, we are now exploring some new directions for ubicomp, including the famous “dangling string” display.

Note that Tabs were envisioned as quite-small computing devices, much like the current iPhone or iPod Touch. Pads were somewhat larger, much like the new iPad. Boards were larger still: wall-mounted smart boards, or perhaps like today’s Microsoft Surface. Twenty-plus years later, Apple is delivering on the vision of ubiquitous computing with an ever-evolving suite of products and services. (Many observers also point back to Apple’s Newton MessagePad of the early 1990s as an ancestor of the current Apple products.)

My take, and it’s not a particularly clever or original one, is that the iPad, like the iPhone before it, isn’t meant to be seen as a general purpose computer. It’s an appliance for which the user doesn’t need to be aware of what’s going on under the hood — the computer is invisible. iPad users just get stuff done anywhere and at any time. Moreover, iPad is part of a larger Apple ecosystem that includes the iPhone and traditional Macs, but also a suite of cloud-hosted services that Apple is long-rumoured to have been working on.

The better these devices can deliver services invisibly and ubiquitously, the better the experience will be for many people. Not all people, of course — many will still opt for more open systems and solutions. There will always be other options available, from companies such as Google, but Apple’s direction is an important one.

Getting in tune with the UX Group

The December meeting of the UX Group was a great event, despite the appalling weather. Much as with last month’s Ignite Waterloo event, the meeting showed that when people talk about a product design that they are passionate about, the results are always illuminating and engaging.

I thought I’d briefly share the products that I brought to the table.

A guitar capo in use

First was a guitar capo. A capo is a device for holding down the strings on a fretted musical instrument, like a guitar, in order to raise the pitch. There are several styles and designs, ranging from a simple bar with an elastic strap, to more complex inventions. I’ve owned several, with designs optimized for cost (the aforementioned elastic strap) and preservation of tuning (though at the usability cost of requiring very precise placement) amongst them. The capo that I showed is made by Kyser, and is optimized for fast, one-handed operation. The easy-to-grab handle makes fast changes a breeze, and it can be easily clamped to the headstock when not in use. Mine works quite well and I’m happy with the results.

A tuner attached to a mandolin headstock

Next up was a compact tuner. Musicians have long lived with the need to tune their instruments. While being able to do so by ear is a great skill to have, not everyone has the ear to do so reliably when first learning to play, and even those that have developed their ear may need to tune in a noisy environment. Electronic tuners have been around for decades now, and they’ve been a great aid for getting an instrument in tune. My first electronic tuner, which I acquired years ago for tuning my guitar and still have, is a large device with a great analogue needle that shows how far off a note is from being in tune. It’s clumsy to use with an acoustic instrument, but it is accurate. The newer tuner that I brought to the event, made by Intelli, is optimized for ease of use with fretted instruments. It clamps onto the headstock of the instrument and detects notes through vibrations transmitted via this direct contact. It swivels to make the display visible, the display is very bright and easy to see, it works with both acoustic and electric instruments, fits all my guitars and my mandolins, and it is small enough to easily fit in an instrument’s case. It’s not perfectly accurate, but it’s great for my needs.

It was fun to share these objects with the group, and I enjoyed the conversations.

Cloudy, with a chance of thoughts

Screen image: a ‘thought cloud’ about Jazz

We have something new at Primal Fusion this week. It’s another Primal Labs release, in this case an interaction prototype that enables you to build what we call a thought cloud to express your thinking on a topic.

We’ve released it into Primal Labs, rather than as a part of our main thought networking service, for a couple of reasons. First, we want to get feedback from our community of users on whether this is a useful way to express your thoughts. Second, it’s not finished, and there are many things that we could do with it. Rather than take it in a particular direction, we want that community feedback as quickly as possible to help us prioritize what we do.

We can certainly see using this in our main service, and we have other ideas about how to use it as well. For now, though, please try it out in the Labs area and let us know what you think.

My November Ignite Waterloo talk

Ignite Waterloo has released videos of 16 talks from the November 25 first event on Vimeo. It’s great to be able to watch the talks again, as it really was a wonderful night. I’m somewhat relieved to discover that my talk, entitled Metaphor in product design: Are you sure that’s an album?, turned out okay. Note that it started life as a blog post here, but the video expands on the post a little and is more fun!