Product Design Across Contexts: Designing For Use As Well As Environment?

Have you ever walked over to someone with headphones in their ears and started talking to them, only to have them wave their hands and point at their headphones, mouthing, “SORRY, I’M ON THE PHONE”? You interrupt your half-finished thought and back slowly away as the person apologizes to whoever is on the other end of the line.

It happens often enough (at least to me) that I can’t tell when someone is on the phone, and it makes me wonder whether the design of the device, and the body language surrounding it, is a bit muddled. It becomes unclear whether someone is sitting and listening to music, talking, watching an important demo, or doing something else entirely, and whether or not it’s a good time to try to get their attention.

A second example: I watched a presenter give a talk while using his phone as a set of notes. Unfortunately, the simple behavior of standing, looking down, holding the phone low, and glancing at it was disconcerting. He fell into that all-too-familiar posture of avoiding eye contact, glancing down, closing off his body, and reading from a notepad. Worse, he actually looked distracted, despite his engaging material.

With the universal spread of small mobile devices, it’s often unclear whether someone is using a device to talk, game, read, write, or for any number of other communication, creation, and connection purposes. The devices are designed almost exclusively for connecting across the phone, to another person beyond the immediate context, and for the user’s own ease of use. But what about designing devices that give clues, in the current context, about what’s going on? (Provided, of course, that the user wants people to know what they’re up to; this is not always the case.)

How can product design in an increasingly mobile and connected world become more sensitive to your immediate location and surrounding environment? Is it possible to create a set of simple cues and clues that signal your current context, making your connection to those alternate realities and worlds simpler for you as a result?

In general, human body language gives clues about what someone is doing: sitting at a desk, holding a telephone. Even the design of the dated telephone booth created a portal that separated a person from the everyday context while also cluing the external environment into what he or she was doing. The telephone booth says: “Don’t come into the box while I’m here; I’m talking.”

On a phone or tablet, this gets harder–what, exactly, are you doing with the small square computer in your hands? What does that mean for the people around you and for your approachability and accessibility?

What if phones and tablets “talked”–externally–and communicated to us some information about what the user was doing?

For example: what if, while you were standing out on a balcony staring off into space, or sitting on a company bus with your headphones in, the exterior of the phone turned bright pink as a sign that says, “Look, I’m on an important call, don’t talk to me”?

Imagine three simple color indicators, perhaps neon hues, that tell a story about what the user is doing, or about the relationship of the user to the people around them:

Neon Purple: Thought and creation space. Don’t even consider disrupting the bubble.
Neon Pink: I’m on a phone call.
Neon Yellow: Earbuds in, feel free to disrupt me.
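As a rough sketch of what that mapping could look like in software, here’s a minimal, hypothetical example in Python. The Activity states and the shell_color_for function are invented purely for illustration; no real phone exposes an API like this.

```python
from enum import Enum

class Activity(Enum):
    """What the user is currently doing with the device."""
    CREATING = "thought and creation space"
    ON_CALL = "on a phone call"
    LISTENING = "earbuds in, interruptible"

# Hypothetical mapping from the user's activity to the color
# shown on the outside of the device for people nearby.
SHELL_COLORS = {
    Activity.CREATING: "neon purple",   # don't even consider disrupting the bubble
    Activity.ON_CALL: "neon pink",      # I'm on an important call, don't talk to me
    Activity.LISTENING: "neon yellow",  # feel free to disrupt me
}

def shell_color_for(activity: Activity) -> str:
    """Return the externally visible shell color for the current activity."""
    return SHELL_COLORS[activity]

print(shell_color_for(Activity.ON_CALL))  # -> neon pink
```

The point of the sketch is only that the signal lives on the outside of the device: the mapping is trivial, but the output is for the people around you, not for you.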

How can we change the design of the phone to communicate what we’re doing in a way that’s externally facing (and useful)?

 