Aug 13

File Under: Nostalgia

I’m a sentimental person by nature, but my nostalgia has cranked up even further with the recent birth of my son: attempting to see everything through his new eyes, reconsidering the things I’ve begun to take for granted, and wondering how his childhood will (and won’t) be like mine.

On a more practical level, I’ve had to make space for him in our 800 square foot house – he has moved into our office / spare room, and seems to accrue more accoutrements with every passing day. In clearing room for his stuff, I’ve been slowly phasing out my extensive paper files and shifting my papers and such online: distributing them across an unholy alliance of Dropbox, Google Drive, and Evernote.

All three systems employ some version of the traditional folder tree hierarchy, but what makes them really useful is their searchability – something noticeably lacking in the bank of steel Bisleys holding up my desk at home. Evernote in particular takes this a step further, allowing me to categorize and tag my files ad hoc – functionality that Apple’s forthcoming Mavericks operating system will be bringing to the desktop format this fall.

Traditional hierarchical file systems have always required significant organizational expertise and investment to maintain: you’ve got to decide on a structure upfront that will remain flexible enough to accommodate unknown or unforeseen future changes, and that structure has to be both kept up and its vagaries communicated to any new user. Individual files are typically housed in a single primary location, and copies must be created to allow for cross-reference or interdepartmental sharing – hence the cliché about governmental forms being filled out “in triplicate”. Traditional hierarchical systems are, more often than not, inflexible, institutionally expensive, and hard (if not impossible) to revise or reform.

Yet we still use these systems, even if by adding search, filter and tagging functionality, we have fundamentally altered the way the underlying information is conceived and related.
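To make the contrast concrete, here’s a minimal sketch of why tagging sidesteps the “in triplicate” problem – the filenames and data model are purely hypothetical, and this is not how Dropbox, Google Drive, or Evernote actually store anything:

```python
# Hierarchical filing: each document lives in exactly one folder, so
# cross-referencing requires a duplicate physical copy in every other folder.
folders = {
    "taxes/2012": ["mortgage-statement.pdf"],
    "house/mortgage": ["mortgage-statement.pdf"],  # a second copy, by necessity
}

# Tag-based filing: the document is stored once and carries multiple labels.
documents = {
    "mortgage-statement.pdf": {"taxes", "2012", "mortgage", "house"},
}

def find(tag):
    """Return every document carrying the given tag -- no copies needed."""
    return sorted(doc for doc, tags in documents.items() if tag in tags)

print(find("taxes"))     # the same single file surfaces under any of its tags
print(find("mortgage"))
```

The point is simply that a tag query cuts across the tree: one artifact can answer to many categories without anyone deciding, upfront, which single drawer it belongs in.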

Skeuomorphism has begun to fade in currency as an aesthetic consideration (or conversation topic), but nostalgia (or fetishism, depending on your perspective) still has an important place in design. Like the brushed aluminum of the old QuickTime interface with its frustrating click-and-drag-and-click-and-drag volume wheel, nostalgia can define and focus even as it obscures and confounds. The designer needs to understand these cultural and physical cues and balance them against the usability and function of the designed solution.

I’m left wondering, then, what benefit – if any – there is in the psychology of hierarchical systems, and why – for reasons beyond technological limitations – they have persisted as long as they have. Did the legacy of growing up with the Dewey Decimal System and thousands of Steelcase filing cabinets leave me at a disadvantage in the emerging economy? Will my son’s productivity be enhanced thanks to a past free of psychologically limiting hierarchical structures? Or will he be more lost in this postmodern sea of infinitely relational databases? Most importantly, what are the grander implications of a shift away from these confined structures?

I don’t assume for a moment that a child born today will come of age in a world without structure, simply because the new Macintosh OS supports tags. But what I do know is that the nature of nostalgia is irrevocably altered when Facebook and Instagram are the dominant photographic repositories of our time, or at minimum that the digital word and image (and sound) have far displaced the physical artifact. As I slowly scan and shred my old documents (or trundle them off to a forgotten corner of the attic), I can’t help but reflect on the finality of the process – a road that, once followed, one is unlikely to backtrack.


Aug 13

A Walkman Amongst Panopticons

I can’t say that I had a particularly happy grade school experience. I’d moved back in with my mother and stepfather at nine years old, back from my inner-city D.C. public school, and found myself in an unfamiliar world filled with unfamiliar faces. The private school kids I was surrounded by had known each other practically from birth, and had (amazingly) already stratified into social groups. As a latecomer, it took years to penetrate those circles, and in the meantime, life was somewhere between physically unpleasant and just lonely. I credit my survival to the walkman.

Anyone beyond a certain age can respect the fact that the portable tape player / radio is referred to not only without its manufacturer (Sony) but also in lower case. For close to a decade, the Sony Walkman was the standard-bearer for personal music devices. Whether you used it for catching up with radio news or tuning out school bus taunts, it was a truly revolutionary device: along with the tape cassette itself, it made music portable in a way that wasn’t controlled by disc jockeys and their broadcast organizations. Mine encouraged and affirmed me, shored up my faltering youthful resolve, and faithfully commiserated during my low moments.

Fast-forward to today, where I am listening to a Spotify recommendation as I write these words. Paid radio services or those supported by intermittent advertising seem like natural market responses to rampant piracy (if it is indeed rampant) and the “cloudifying” of the world’s music; at first blush, social listening feels like a perfectly normal extension of public or group listening. What I’m struck by is my hesitation to pursue long-practiced behaviors (like listening to a single song on repeat) given the stream of tracks I see my Facebook friends listening to. Or, for that matter, the twinge of judgment I sense when I see a friend listening to something I think is lame.

I’ve long wondered about the sort of panopticon our social services construct around us: not only that which we experience socially (“what would my friends think if they saw me listening to Darius Rucker?”) (I didn’t.) but also that which we internalize. Having some intuitive sense of how the recommendation algorithms work – or perhaps more crucially, how our consumption data is bought and sold, and what it means for us in the long run – do we allow our technology to subtly streamline our behavior?

Netflix’s new single-account user profiles recently highlighted a telling (if slightly entertaining) example of this social consumption, albeit a relatively constrained one: with the rollout of multiple user features within a single, shared account, I found my wife trying to move the items she’d put in our legacy queue into her new personal one. Since my profile had inherited our viewing history and queue, she was attempting to separate her foreign films, dramas, and home workouts from my cheesy action movies and sci-fi thrillers. We’d always been pretty good at ascertaining which among us was responsible for a specific film pick, but there was some reasonable overlap in selections. (Perhaps most telling was when I’d come home from a business trip and the Recently Watched category would reveal what both of us decided we should watch on our own.)

By allowing multiple users, Netflix was creating the opportunity for members of a single family to reclaim their filmic identities and reconfigure the Netflix space to align more perfectly with their unique interests. In turn, its users could discard the collaborative definition of that shared space. In my haste to play with the new feature, I didn’t give it much thought.

A story on the TED Radio Hour this weekend sort of encapsulated this conundrum for me: that the vastly expanded opportunities we have to share and communicate may have more impact than appears at face value. Notwithstanding the monetization of these avenues and opportunities (and the potential for abuse or eavesdropping), how are these new tools shaping the way we communicate and the way we create our own identities? Does the ‘publication of everything’ have a chilling effect on our social activity? If so, do those practices have even deeper significance over the long term?

I miss my walkman.

As a note: I read Bill Bishop’s The Big Sort years ago and I still routinely return to it as one of the most interesting pieces of research on what could be tentatively called the “cellularization” of society. The book speaks specifically to political, religious, and social organizational themes, but I think it takes a much broader significance when read in the context of Facebook and other social media or social consumption tools.

Aug 13

In Defense of Nothing

I threw up a little in my mouth the first time I saw the new crop of iOS7 icons. A good friend hammered at me about how much better the new design system was, and I told him I respectfully disagreed. But as I joked to my wife this morning, my “position is evolving” as I grow more familiar with the new system.

The Sturm und Drang among the design community underscores a bigger phenomenon than the OS wars: as software and hardware begin to converge (in the mind of the user), the scaffolding / structure of the software experience (and its visual style) grows ever more important. Uncustomized versions of each OS are now, more than ever, the pattern libraries defining the UI metaphors and behaviors against which independent applications are judged.

In the past few months, I’ve had the opportunity to work on a conceptual program that marks my company’s first new design since the release of the iOS7 beta, and as such, its first attempt to design within that new system. Aside from my own fascination with the flat visuals and typography / content-centric interface (it finally looks like my wireframes!), I’ve had to expeditiously suss out what sort of UX impacts the new patterns have.

To anyone paying any kind of attention to this topic, the answer should come as no surprise (it was, by and large, what Apple told us during WWDC): with consumers’ steepest learning curves behind us, it’s no longer necessary or helpful to focus on old mechanical visual metaphors. Instead, the goal is a common one: losing weight. With basic function and learning techniques established, it was time to reduce not only visual weight but also the accumulated mental “weight” of a decade (or more, depending on whether you include all GUIs) of less capable interfaces.

The thing I felt most in the new OS was not a thing at all: air, negative space. This is a trend consistent with Ive’s ongoing industrial direction elsewhere at Apple – the paper-thin iMac is as impressive for what it is as for what it isn’t – hefty. In keeping with analogies, the unibody construction of devices (read: light but strong) is about as close as any company has yet approached the “interface-less” interfaces of the coming future.

What’s left in all this space is largely physical flourish – accelerometer-based ‘responsiveness’ and cute little wiggles and bubbles and transitions. Having sloughed off the density of the old iOS6-era visual design, there was plenty of room left for unnecessary / seemingly pointless (read: fun) nuance. And, just like in human relationships, while bold action breeds trust and respect, only nuance can really build fondness and friendship.

Designers of all stripes (myself included, for a time) deplored the Duplo-styled color palette and some of the faintness or fragility of the stock application icons. I’ll still side with the haters in that some specific design decisions feel a bit tacked on or rushed (Game Center, in particular, went from bad to bad and weird), but in general the new design has won me over. What I find most interesting now is the implication for us mobile designers and developers.

A colleague of mine referred to the newly minimal interface as “anti-design”, but I think that’s overstating it – it’s not so much a lack of design as it is a lack of visual distraction. I likened it to the lights coming up when the bar’s about to close: there’s nothing left to distract you from the reality of the situation. Applications now have no excuse to hide poor content, poor functionality, or shoddy development practices in visually-overwrought immersive experiences. That isn’t to say they won’t try, but at least now it will be more obvious when they do.

Aug 13

A Very Long And Abstract Post About Process

I’ve been asked on multiple occasions to help define or refine UX process, which tends to result in one of two types of document: a really long and detailed process that has a lot of optional specifics, or a very high-level abstract that probably serves more to start conversation than it does to help non-designers get up to speed.

This post is the latter.

Anyone who knows me has noticed that for someone whose focus is now clearly on mobile software design, I like to read and think and talk about nearly everything else instead. That’s no slight to my chosen profession, nor evidence of disinterest in it; it’s more illustrative of the fact that I imagine the scope of my life’s work to extend well beyond the mobile apps and websites and wine labels and business cards I have produced.

What I think is most interesting about design is that – taken to that abstract level – everyone does it; human existence is about the communication and transmission of elements from one place in space and time to another via direct or indirect interaction. I tried to make that point in my previous post (oh so long ago) and was told that the connections I drew were probably significant enough on their own but a bit of a muddle as a group. Maybe sometime I will try to strip them apart and give them more love.

As a species of designers (and certainly not alone in that respect), it’s instructive to use design terminology and theory to understand the choices humans make. From my distant perspective on the ongoing Egyptian unrest and coup to my local observation of aggressive driving and litter, I am constantly trying to suss out what makes people behave the way they do and what my design experience might provide in terms of altering outcomes.

Without further ado, what follows are my thoughts on design method and rationale. As yet unedited, but better to be read in bad shape than to sit in draft form for another six months or whatever.

People keep asking me about process formalization, and I keep hedging by replying something along the lines of “we don’t have a process, we have a methodology”, distinguishing between a series of formal steps taken in order and a set of optional approaches that can be applied to a given design challenge. Notwithstanding my obvious dodge, I do think that there are some steps that must be a part of every project (or must be documented as assumptions, at great risk to the viability of whatever comes out the business end of the project plan). This post explains my thoughts on those.

In order of performance:

1. understand what the client has to offer;
2. understand what the end user needs / wants to do;
3. model the space in between.

Ideally, these three project components will overlap; understanding what the business has to offer requires getting to know the client and their internal subject matter experts, which nearly always results in hearing their complaints and gripes. Those gripes – what I call “friction” in the design – result from communication problems or needs mismatches. At their root, these mismatches are human problems, born of systems designed to accommodate personal and group decisions. Where there is friction, there are at least two users needing different things.

Unfortunately, it’s difficult for focused and unfocused individuals alike to articulate their needs outside of their personal context. That’s a big part of what we as designers are here to accomplish. Through a mix of trust-building exercises, mediation, contextual shifts, and personal expertise, our goal is to impart some of our methodology to our clients. Not until they see the value in our method can they commit to the steps required to achieve the end product.

Mediation is the careful lubricant applied to communication problems. The root of the term is media, which suggests our role as communicators: a medium is a connection, connective tissue, or some other substance that surrounds distinct elements to hold them together. In practice, mediation is a type of multi-person talk therapy designed to assist individuals both in articulating grievances and determining a common ground or interest upon which a resolution to those grievances can be constructed.

There is an intricate set of interdependent relationships involved in any situation where we design a product for a client other than ourselves, worth a lifetime of study on their own; for now I am going to stick with a few of primary interest to our own industry: client-to-client, designer-to-client, and client-to-user. The first is oft-noticed but rarely addressed, since most of us on this side of the relationship prefer to think that clients should only approach us with all their “shit together”.

The fact is, no one has all their shit together; if they did, they wouldn’t need to hire us to help them design something – build something, maybe, but not design it.

So a big part of every project needs to be the trust-building exercise of listening, repeating, and refining the various struggles and accomplishments communicated to us by the client. By internalizing and reflecting to them their culture and language from the perspective of an outsider, we are already performing a great service to them – likely one they did not realize they were purchasing from us. By showing them that we understand them and that we care to try to know them better – and by illustrating our personal investment by the odd brainstormed solution (not defined, but tossed out as evidence that we are thinking about their challenges) – we prove to them that we are worthy of partnership.

What typically follows these stakeholder conversations is access to subject matter experts. These individuals should be understood to have great focus on whatever their realm happens to be. They are pieces in a puzzle who only rarely provide a holistic perspective. What they can provide is deep insight into the client-side capabilities (technological, service, and personal) and rationale as to the purpose of past design decisions. Subject matter experts can often ‘humanize’ a series of product development decisions that from the outside seem inscrutable. More importantly, these folks can highlight points of friction or tension within the organization that can be as important (or more important) to design for than the end user.

Contextual Shift
The goal of contextual shift is to remove the client from their daily, practiced perspective. There are a number of methods for accomplishing this, but the goal is both crucial and enormously difficult: the ability to both acknowledge and ignore the constraints or limitations they have built their careers learning and understanding. It is hard enough for us to do as outsiders; never forget how stressful and monumentally unusual it can be for an insider entrusted with significant product authority.

Failure to draw the client out of himself – even if just for the duration of a two-day workshop – can mean significant problems for the project down the road. Like hypnotherapy, this shift can only be done in a safe and trusting atmosphere, and it helps to have an idea or image to return to in the future if anxiety creeps back into the client side of the relationship. Experience maps and personae and PowerPoint slides are all good and useful, but none of those or similar artifacts can stand as the shared idea that bore the contextual shift.

The End User
It’s become almost rote to exclaim that solving for the end user is the goal of any design challenge. I won’t counter that except to say that end users are tough to come by; more often than not, we are designing within a context or experience, of which the end users are a component. That said, the end users are the ones most likely to have the most significant interaction with a software or hardware solution, so it is definitely important to focus attention on them. I humbly submit, however, that the designer’s gaze should be constantly expanding into the environment of the experience, even as their focus narrows.

Part of the reason for that is the role of what is typically called strategy or planning: like knowing what you don’t know, strategy is guessing or estimating what may fill in the gaps in what we can quantifiably ‘see’. Crucially, that comes back not to the current but to the future users of a given product (or the environment / context of that use). Strategy may be a specified role in the design organization, but it is everyone’s job. Strategy is built partly on external subject matter expertise, partly on observed behavior, and partly on instinct. All three of those benefit from the designer’s perpetually expanding intellectual horizons; it’s why so many work in agencies – to get access to different types of projects.

All that said, there will be a targeted end user or group of users that provide both high value to the client and research direction for determining the scope and nature of the functionality in a given design. Depending on the type of user research employed, trust can be a larger or more minimal part of the relationship. Mediation is again the most important capability the designer can have: the ability to watch, learn, ask, and understand user behavior and thought processes without leading or in any way directing the user toward specific solutions. Finding friction and understanding the expectations set by the circumstances of an interaction are paramount.

For the most part, contextual shift is less useful during user research. While it can be helpful to gather specific requests (“I wish I could do X while I am standing at the ATM”), the goal of this step in the process is to understand the ‘geography’ of the experience and the internal context of the different individuals acting within it. If you do decide to co-design an interaction with the targeted user, specific constraints or rules need to be established to reduce stress and encourage creativity. Thankfully, the human body and five senses provide a great starting point.

The bulk of the designer’s output is models – interventions set within specific contexts to communicate or otherwise transmit between one actor and another. If that sounds overly complicated, it’s because as designers, our purview should include any aspect of an experience down to the smallest interaction among participants: the hot, dry sensation of an oven door being opened; the reverberating doppler effect of a sound moving through an urban landscape; or the haptic ‘click’ of a button toggling to a selected state.

Models come in every shape and form: from wireframes and experience maps to choreographed videos and space plans. As designers, we should never restrict our focus solely to our area of greatest expertise; our role is problem-solver, not house-painter. As mobile software designers, we can’t ignore the hardware, services, or tangible products that complement or augment so many of our design solutions. Our models function in space and time, so our design methodology needs to accommodate inputs and feedback from well beyond the devices we design for.

Agile and Lean proponents tend to draw the focus of design away from its documentation and artifact to the benefit of the functioning design solution; that makes sense under specific conditions but may be overly dismissive of the function of those artifacts as part of a larger model. It may come as no surprise that I consider the total design process as itself a solution, rather than merely the output of that process. While many Agile devotees write off documentation as needlessly time-consuming, it does perform the critical role of reinforcing the trust that’s been built and illustrating the decision chain that has led to the functioning solution. Viewed as an important element in the proper function of the client-designer relationship, it’s much harder to relegate it to a simple sprint-zero component.

Dec 12

Holistic Strategy: The DSM, Brasilia, and The Fiscal Cliff

Some interesting things have happened in the past few weeks, which I’d like to take a look at through the lens of my profession: the death of Oscar Niemeyer, revisions to the DSM, and the ongoing battle over the fiscal cliff.

What struck me about these events is the spotlight they shine on the connection between high level or holistic planning and ground level details.

Atomization is something we in digital design have become increasingly familiar with. The application development industry has found considerable success (generally) with the modularization of functionality into tinier and tinier components, designed to fit into larger platforms. “Feature creep” seems to have lost importance over the years as the features associated with phones and devices fade into the background relative to the applications that employ them. Operating system developers no longer attempt to control every aspect of their user experience, but rather the broader and more universal ones (though to different degrees). Platform design is essentially strategic and holistic.

Overlaying this kind of critique on the edits to the DSM produces some interesting results. The DSM is a purpose-built reference for assisting medical professionals in their diagnosis and treatment of psychiatric disorders. Concurrently, it is the medical industry’s specification of the human condition; as such, it is essentially (to use interaction design terminology) a functional requirements specification for psychiatric normality – a normative spec document for nominal system function. In plain English, it is a set of logic and law for how the human brain malfunctions – conversely, a list of situations that outlines how ‘normal’ human beings should operate.

Additionally, the document serves to outline areas which do and don’t have existing treatment plans. This directly affects the pharmaceutical and insurance industries, identifying areas where programs or products do and don’t exist to address medical needs. Again in industry parlance, the revised DSM offers a ream of RFPs for new research or product development.

It may sound weird to consider such a document as part of a larger design strategy, but when you observe the ripples emanating from these small decisions, you begin to get a sense of how (as we also see in law) even minute alterations can lead to sweeping changes. Did autism exist before there was a medical classification for it? The question may sound trite or philosophical, but the practical result of the new classification was quite tangible. The bigger question remains: what is the overall goal for human health, and how do these more strict and specific definitions support it? What is the goal of the human health “platform”, and how does this new set of design guidelines help fulfill it?

Niemeyer highlights a different kind of platform and set of design decisions: the city. While not actually responsible for the city’s layout, the Brazilian architect will always be associated with the planned capital of Brazil, Brasilia. Most contemporary urban planners will admit – gladly or begrudgingly – that an individual or office cannot (and should not) attempt to control every aspect of a city plan. Brasilia, constructed from scratch in less than four years and largely a product of modernist social theory, has stood as an example of what happens when you try.

Conceived during the advent of the automobile, Brasilia was designed with no tolerance for future modifications and built with a strict population limit. Simply put, the city of Brasilia was designed and delivered as a finished product; an entire administrative campus as an artifact. Adaptations of programming (who can or does live in which structures, for example) were not accommodated, and growth beyond planned limits was not considered. Today the city is – while still quite beautiful and comprehensible in its architecture and layout – largely considered a sort of modernist dystopia: ringed by planned and unplanned suburbs and favelas, and increasingly subject to the crime that pervades other Brazilian cities.

In digital analogy, Brasilia represents the attempt to control every aspect of the experience, even when that strategy ossifies into dogma in direct conflict with the needs of the end user. At the other end of the spectrum, the favela (in its way) represents the attempt to address every need individually, and ultimately the inability to address any need effectively. The middle ground – a framework or set of experience guidelines, including the requirement that the platform remain as flexible as possible – is the most progressive approach. You can see this at work in the best branding, holistic planning, and Agile methodology. Again, modularization to support execution, but attention to and agreement on the holistic first and foremost.

The notion of a framework brings me to the last event: the fiscal cliff. The term “framework” was used extensively by the Romney-Ryan campaign to explain its economic plans in the run up to the 2012 presidential election. On its face, this seemed to make sense – given that specifics weren’t going to be dictated by either party, why attempt to detail them for purposes of soundbite? The difference in this case was that the US political system has increasingly grown to be a zero-sum contest of two competing frameworks, where neither side has incentive to compromise with the other and the general complement of each framework is already largely understood. Given that reality, what the lack of specificity amounted to was a lack of (again in interaction design terms) user acceptance criteria. Because so many voters were familiar with the philosophical direction of each campaign, the tiny proportion of undecided constituents could be assumed to make decisions based on the specifics, rather than the frameworks.

What’s interesting to me about this last example is that while much attention is paid to the philosophical direction of the country (per one party or the other), very little is paid to the design of the system that drives it. Recent burps from the media about parliamentary or procedural changes in congress or redistricting fights aside, the electorate seems either paralyzed by or uninterested in the grand design process that defines so much of the world they live in. Legal definitions and minute word choices determine the political cycles and power balance among citizens, yet civic participation has become generally relegated to the push of a button – whether a vote or a donation. The process of governance has become so complex that the layperson has very little insight or vision into the paths that connect party philosophy with ground level reality, an obscurity that both parties seem to use to their advantage.

This post has been a bit of a meander, but in summation I would suggest three design insights of particular interest to the interaction designer: don’t neglect the overall user experience goals and strategy for the benefit of the individual components; remember to design systems and strategies that can accommodate future requirements without losing their center; and never, ever forget that the process of executing against a plan is as important as the product that emerges from it – keep your rationale, deliberation, and decisions transparent and well-documented, and keep all your constituents involved as best you can.

Nov 12

The How of Who

Lately I’ve been spending a lot of time trying to explain what I know and what I’ve done (professionally). I’ve been asked to take design tests and write essays and draw things. Potential clients have requested I give marathon presentations and speak to panels. It’s a brave new world of lead generation and applicant vetting. In my first job interview, I was asked to code a table in HTML with just a piece of paper and a pencil. Times have changed. Or have they?

I’m in the process of updating my portfolio at the moment, which – unsurprisingly – has become a quick trip down memory lane. The work is all there (though some of it has been born and died since the last portfolio update), but it’s amazing how flat and insufficient a portfolio can be when you’re trying to describe a third of your adult life.

I’ve had a couple of realizations about this: first, I should take some time to describe some of these projects here on the blog, where I have a bit more space to go long form, and I can indulge myself a little more to get into the fun (or frightening) anecdotes that really make the stories human. Second, I should take advantage of the blog format to fill in the gaps a bit around the more general things I’ve learned. As I reexamine the medium and structure of the portfolio (project vs. client, industry vs. employer, etc.), I’m also reexamining the value that I bring to an engagement and how best to tell that story in different situations.

Ultimately, the goal is to bring a bit of content strategy and interaction design to my own career. While such a public dialogue (monologue? internal dialogue?) probably isn’t the wisest format for cultivating a persona of infallible expertise, I am a firm believer that interaction design is by necessity a team sport. So consider this my first volley, and please do check back soon.

Nov 12

One Year Later

shackletonIt’s been forever since I updated this, and in the interest of full disclosure, I’ll admit that I considered trying to backdate some entries to make it a little less obvious (I didn’t).

So, first things first, let me explain what happened in the past year or so.

I resigned as the UX Director for T3 in June of this year.

There were a few reasons for that, the primary one being that I was simply burnt out on marketing and advertising. In theory, these types of projects are exciting components of the product development cycle; you are, after all, mediating the space between the end customer and your client, crafting the stories and (in our case) much of the interface those customers experience. The challenge comes when the client organization can’t agree on the basic benefits of a given product, or can’t afford to develop a product that meets enough consumer need.

It’s that second issue that gives the advertising industry its bad marks: when clever communication begins to hinge on artifice or sleight of hand maneuvering. For someone feeding himself on user advocacy, the legal word games and distraction techniques can only go on so long before a sense of internal rot sets in. Call it the curse of the empathetic, if not simply the sympathetic. But more on this later.

The other reason was institutional. It’s neither safe nor wise to discredit your past employer in a public venue, so I do hope, dear reader, that you know that I know that you know that I know, and don’t take this the wrong way: I simply didn’t have enough goals and desires in common with T3 leadership to make the relationship a successful one for either party. In more concrete terms, I felt that the resources and investments necessary to transition T3 to a modern model were not forthcoming.

What I’d hoped to do as UX Director was to lead a gradual transformation away from what I would call a “neotraditional” agency model toward a broader institutional perspective on creative problem solving. This puts UX perilously close to the center of an organization’s thinking – perilous, when in the eyes and experience of so many employees, UX had historically been tangential at best and superfluous at worst. Ultimately, I believed (and stated) that the end user had to be the primary focus of the work. But that’s a more complicated pill to swallow than it might appear.

What this means – and apologies for those who have heard this explained a million times over elsewhere – is that the traditional agency perspective has to flip on its head; the starting point needs to be at the consumer level, media-agnostic, and build back to the client goals. This puts a tremendous responsibility on research and planning, and requires a considerable level of trust in the client relationship – two things not easily come by. But it also requires some quashing of the pure “auteur” sensibility that seems so sought-after in the traditional agency model.

Is it possible to do good user-centered design without an agency aligned on this kind of perspective? Maybe. I certainly don’t see it happening much. The way I characterized it was that without this approach, UX is just usability. There is a place for that, I suppose, and pure heuristics can always improve a product. But that’s not what I believe should be the role of a design organization; at least not one where I work.

Since June, I’ve been leading UX for Fangohr, reading, studying, cooking, and doing a lot of thinking. The (ongoing) result of that process is Black Dots, a (so far) one-man consultancy devoted to user-centered product development strategy (until I can come up with better terms for it). The goal is twofold: applying the UCD skills I’ve acquired in identifying, communicating, and ultimately solving tangible problems; and finding the clients prepared to engage an agency that way.

So: that’s what I’ve been up to since last November. Thanks for stopping by; it’s been great to catch up, and please do check back later for more on Black Dots and more general brain sloshery.

Nov 11

Mobile UX and the Art of Good Conversation

andreA colleague of mine asked me to write a post for the company blog where I answered the question “How do increasing mobile access and location-based services affect IA?” Instead, I answered “What kind of impact does mobile have on UX best practices?” Because that’s my “way.”

According to a May 2011 survey by the Pew Research Center, one in three American adults now owns a smartphone. Of those, 87% use it to access the internet – nearly 70% doing so on a daily basis. That’s a considerable audience by any standard, and one that can increase significantly depending on your brand’s demographic target.
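To make those figures concrete, here is a quick back-of-the-envelope sketch of what they imply about total reach. The percentages are the Pew figures quoted above; combining them this way (treating the shares as simple multiplicative fractions of all adults) is my own illustration, not part of the survey itself.

```python
# Rough reach estimate from the Pew figures cited above.
# Assumption: the percentages can be compounded as simple fractions.
smartphone_owners = 1 / 3   # share of American adults owning a smartphone
mobile_internet = 0.87      # share of owners who access the internet on it
daily_use = 0.70            # share of owners doing so daily (approximate)

# Share of ALL American adults in each bucket:
total_mobile_reach = smartphone_owners * mobile_internet  # ≈ 0.29
daily_mobile_reach = smartphone_owners * daily_use        # ≈ 0.23

print(f"{total_mobile_reach:.0%} of adults browse the web on a smartphone")
print(f"{daily_mobile_reach:.0%} do so daily")
```

In other words, something like a quarter of all American adults were already on the mobile web every single day.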

You need to follow your customers – to be where they are – and there is clearly an incredible benefit to being as close to them as their pocket or earpiece. But with myriad devices and their hardware and software idiosyncrasies, what impact do they have on UX practice?

The answer is simple: none.

It’s too easy (and all too common) to regurgitate a bulleted list of usability best practices and repackage that as mobile UX expertise. The fact is that UX has nothing to do with technology and everything to do with the human beings using it. That isn’t to say that usability heuristics have no place in mobile, just that they aren’t significantly different depending on device – legibility, consistency, affordance, etc. What does change from one technology to the next is content and context. But excellent UX is no harder (or easier) than excellent conversation.

Great conversations develop from shared interests: both participants have a need that the other can fulfill. Content is both relevant to the topic and tailored to fit the context. Language is structured and vocabulary is common. Space is deliberately included to allow for dialogue and exchange (and breathing!). Tone and pace are consistent and appropriate to the topic. Sounds a lot more familiar now, doesn’t it?

The novel human experience that mobile provides is a point of contact that the end user willfully invites into their daily routine – a piece of technology that has become so central to human life that it has begun to affect the concept of privacy itself. It’s used to communicate, to remember, to explore, and to find your way home. It’s deeply embedded in your customers’ personal lives, but only so far as it answers a pre-existing, primary need – in a meaningful, intuitive, and delightful way.

Before we roll out the heuristics and usability audits, we ask ourselves: what need is it that we are solving, and how do we do it better? To make certain the conversation maintained with the customer is truly two-way, we research them on their own terms and in their own space – adding, removing, or refining features according to their feedback. Desktop site contents and functionality can’t be forced or cajoled into the mobile experience – too often it won’t fit, and it may not be appropriate to the viewing context anyway. Only by watching, asking, and listening to the end user can we really know what kind of experience makes the most sense. We don’t bait and switch with functionality that over-promises and content that under-delivers, and we are vigilant in sussing out the customers’ physical limitations. We don’t expect them to know (or want to know) the core product or service the way we do.

Mobile UX isn’t special or specialized, it’s just good UX; and good UX is exercising the utmost respect for the individual we are designing for.

Oct 11

Open Letters: Exploring UX

ps3In another in the open letters series, a friend considering pursuing interaction design poses some questions to me. If you are new to the industry or exploring it on your own, this might provide some insight. I also expect this to work as a good segue into an ongoing series of posts chronicling my development of an interaction design department and curriculum. As always, questions and comments are invited!

Can you describe your path to interaction design/information architecture? How did you become interested in these subjects? Do you consider this a conventional route?

My path technically started way back in the early days of Photoshop, which I used on the yearbook committee in high school. I fancied myself a writer and photographer, but it wasn’t until a college friend showed me his Promise Ring fan site that I realized the potential of the web as an opportunity to self-publish – complete with the notion that someone out there in the world might discover me.

So while I kept up with photography and writing and digital manipulation of images, I started teaching myself basic HTML back in ’96. It was that, more than anything, that got my career going. In ’99 I graduated and drove out to NYC with no idea what I was going to do and no job prospects. I can’t remember who mentioned it, but someone pointed out that lots of people were hiring for web design and so I threw my awful resume onto the pile. I went from building HTML for a big company on Long Island to producing the same for a teen community site startup, and finally started turning my hand to visual design at an honest-to-goodness design studio.

I struggled through a few years in Chicago, eking out a living doing full-spectrum web design (concept through deployment) until very nearly burning out on the web entirely. I took two years off to attend grad school for urban planning in Philly and found myself – yet again – back in front-end development. A good friend referred me to another friend, who asked me if I’d be interested in interviewing for a new kind of position called ‘interaction design’. I had no idea what it was, but I managed to get through the interview anyway.

So that’s kind of a long answer, but you might be surprised to find that it’s not a terribly uncommon path to take. There are a growing number of master’s programs in the field now, but the longer one has been in the industry, the less likely they are to have started in one of those.

What I’ve learned about folks who end up in this role (and do well in it) is that they have a keen sense of what I’d call ‘applied psychology’ – paying attention to what makes people do the things they do, and the kind of creative bent to both accommodate and manipulate those things. Usability, for example, often boils down to a detailed approach to heuristic analysis – really just being anal about large and small details. But then UX is more than usability, just like delicious food is more than just edible. But that’s the kind of baseline for getting the job done.

In my brief research, interaction design is compared to information architecture, yet a distinction remains. Based on your experience, how would you compare the responsibilities of each–interaction designer versus information architect?

When I talk about UX, the term is a bit of a misnomer – UX involves much more (typically) than an interaction designer has control of. But the way I describe the breakdown is this: interaction design is design for task completion. Interaction design consists of two elements: information architecture, which relates to the content; and interface design, which relates to the function. When it comes to web or digital design, it’s nearly impossible to conceive of a project that doesn’t involve both elements; but the practical context of both is illustrated in the kinds of tasks the user is trying to complete.

That said, I wouldn’t suggest there is any real difference between the two, since both competencies are required. In practice, however, I have noticed that there is a subtle difference implied by the two terms. Information architect tends to imply less expertise in the functional side, while interaction designer tends to reflect a more central role in the project. In my experience, companies who define the role as IA conceive of it in a much more limited way – usually because they assume their visual designers are capable of handling interface design. But sometimes there’s no reason at all for the distinction, and interaction design has become so buzzworthy that you’ll see all kinds of variations on the title nowadays.

What is a typical day of work like for you?

Right now (and for the past two years, almost) every day is novel. There’s still a lot of confusion in the organization as to how to use our tiny group, and the range of projects my company has is broad enough that what is needed of us on any given day can run the gamut from accessibility consultation to site maps to detailed wireframes. Most often, though, I get asked for user flows.

Which, incidentally, kind of drives me insane, since what that usually means is that they want some kind (any kind) of documentation to illustrate how everything is supposed to work. Which isn’t necessarily a user flow.

But it might be more helpful to talk about some other places I’ve worked. A typical day for an ID might be spent wiring out a user flow – providing a typically low-fidelity, page-by-page illustration of how a user moves through a specific task (and variations on that task). Or it might be spent in user testing, moderating one-on-one interviews with test subjects against a set of prototypes. It could be what is known as ‘requirements gathering’ – the process of interviews or research / analytics to suss out all the mandatory functional specifications of whatever the final product is supposed to do. Or building annotated wireframes or comps so that the dev team knows how the final product should be built.

Generally speaking, we must be the last word on any details of the project relating to user tasks. We end up being the folks everyone turns to whenever someone asks ‘how is this supposed to work?’ or ‘can we make it do x / y instead?’

What are the most challenging parts or work, presently? When you were first starting out?

The biggest challenge has been (and most likely will continue to be) balancing the needs and desires of the client against the needs and desires of the end user. As the user’s primary advocate, we have to fight the battles – sometimes with our own team or account people – when the design starts to get corrupted. And trust me, there is not a day that goes by that someone somewhere on the project does not make a decision that will corrupt it. As the keepers of much of the logic of the design, we have to remind everyone why and when and how the design decisions have been made. And when it comes to clients, we also have to practice our best statecraft when mitigating often ridiculous or contrary requests.

Last but not least, we have to remember to always test our own assumptions – and be mindful of both the overt and subtle ways that those tests can be corrupted themselves. It’s really part visionary, part community organizer, part cop.

What keeps you engaged in your work (what do you love about it)?

Until very recently, what kept me engaged was just that notion of being the sort of ‘user sentinel’ – the defender of the voiceless majority. It was easier to feel like that because I had worked in so many places where the role of UX was respected and understood, and central to the design process. What I’ve realized since coming to this new company is that, as a designer, I don’t simply design for the user – I design for the team and the client, too. Sounds kind of trite, but in my experience it’s all too easily forgotten.

Without getting too zen or whatever, spending years thinking about this kind of thing starts to make you project it onto everything in your life. I think that’s typical for most professions, but I have begun to see it far more than I used to (personally). You begin to notice the subtle and unsubtle ways that people design their interactions – the color and tone and structure that combine to create different meanings or nuance; the intricate dialogue both between people and between people and their environment. It’s absolutely fascinating, and I can say that if I try hard enough, I can use every bit of it in my daily work life.

Recommendations, re: skills, education/classes, experiences. Is it necessary to have formal training to be in your field? What kind of skills/experiences makes one most competitive? Do most people begin as developers/graphic designers and then segue into information architecture?

Considering that I don’t have any formal training, I would say that it’s absolutely not required. If I had it to do over again… well, I probably wouldn’t change a thing. Even so, if you had the time and money, a few grad programs are supposed to be outstanding and I bet they would be a ton of fun too. The big three in NYC were the Parsons program, NYU’s ITP program, and the new kid on the block, SVA’s master’s in interaction design. But there are more all the time.

Practically, it helps to know how the technology is structured, since that is one of the largest constraints to the design work we do. But while I think it’s useful to have a grasp of programming, I don’t think you need to know how to do it as long as you know how it works and what it is capable of. There’s a lot of this job that revolves around knowing what you know and what you don’t know (and knowing who to ask for help).

Ultimately, though, your instincts are your best friend in this field: a natural curiosity about how things work and why they do and under what circumstances; an attention to detail without losing the larger perspective; the intellectual freedom to do ridiculous things and think ridiculous thoughts – since that’s where a lot of innovative ideas come from. Above all, an interest in improving the world, even if it is just making it easier to set the time on a VCR or sort your email or hit the right button on the ATM. Because while a lot of us work on micro sites or purchase path refinements or site logic, the bigger skills are just plain old design skills and are infinitely extensible.

Oct 11

Open Letters: Focus Groups

fightA good friend of mine emailed recently to ask me if I had any advice regarding the moderation of focus groups. As fate would have it, I do. Here, in the first of a series, is what I told him.

I have sat in on a few focus groups… they are many-headed beasts, and I think oft-misused. I’ve never moderated one myself, and I don’t envy you for it. Here’s what I do know:

First: it’s tough keeping people on track while still giving them enough latitude to answer freely. If you can affect the format at all, try to split them into one-on-one interviews; if not, try to limit the amount a single person can respond and make sure you throw extra time at the quieter subjects in the group.

Second: focus groups magnify unhelpful tendencies. A group just large enough to make moderation tenable tends to unleash extroverted behavior in individuals who would feel more constrained in typical social situations. It’s as if, when you put them in a lab and tell them you want their opinion, some switch gets flipped in their brains that says they no longer need inhibitions. They end up speaking beyond what they actually think, so you tend to get ‘false positives’ from them – simply because they are enjoying the freedom to throw ideas around and talk loudly.

Conversely, introverts often move in the opposite direction. In a small, unfamiliar, and ‘focused’ group setting, introverts will become quieter, more introspective, and more acquiescent to what they perceive as the group consensus – much of which, incidentally, will be what the extroverts are blurting out at every question. What’s worse is that the quieter an introvert gets, the more license the extrovert seems to detect to direct the conversation.

Balancing these two types of subject is probably the hardest thing to do in a focus group, but the most crucial. Short of one-on-one interviews, there are a couple of tactics you can use. If you are familiar with managing procrastination, then you are probably familiar with what I’m about to suggest.

Imagine you are incredibly distracted, and think about whatever works best for you to rein that attention back in – you’ll likely come up with some ideas that will work for the group participants, too. Take frequent breaks. Prepare very specific questions and repeat them (without sounding antagonistic). Print up posters with the questions you need answered (or put them on whiteboards) and physically direct attention to them to keep people on track. Even if you can’t accommodate one-on-ones, split people into like groups (extroverts with extroverts, etc.) for task-oriented activities. Make people write things down and make their own bulleted lists or draw pictures. Just generally establish rules and boundaries or ask people to establish their own ‘proxy’ boundaries – literally even asking people to sort cards or draw boxes to put things in will subtly reinforce this kind of stuff.

I don’t know specifically what the goals are from these groups, but those tactics (and whatever you can think of to build off of them) will help. And whatever you do, don’t forget that the honorarium is a payment for services – these people are guests, but they are also employees. If you see someone derailing the group or muddying the process, don’t hesitate to be firm early on. It beats the hell out of spending an hour or more in a stuffy room only to end up with 4-6 flavors of one person’s opinion.

Sep 11

A Note And A Quick Announcement

It’s been a little while since I touched the blog, and not for lack of interest. By way of explanation, if not apology: I’ve been sort of busy. In the past six months, I’ve embarked on a new design ethics project with Mr. Florian Fangohr, begun redesigning the blog, and gotten married, among other things. But for what it’s worth, the time off has given me some space to meditate on the purpose of this real estate and the nature of the content I fill it with, and I’ve realized there are some changes I need to make.

A mix of ambition and obsessive attention to detail has derailed the greater strategy of the space: talking about the how and why of design. In an overwrought attempt to look glossy and corporate, I’ve spent far too much time and effort on polished design – both in content and visuals. Consider this my attempt to bring the narrative back to Earth. There’s just too much to talk about to spend so much time trying to make it marketable.

Which leads me to the quick announcement.

The past (nearly) two years have been a learning experience – both in management skills and in interaction design. I gave up an attempted return to NYC and an exciting opportunity at Huge to join an online advertising agency back in bucolic downtown Austin. I spent nearly two years consulting broadly for said agency, trying to convince them to establish Interaction Design as a discipline (complete with departmental status and a director to lead it). I worked deeply to mentor whoever was close enough to listen and considerate enough to hear me out.

Last week, they took my advice – and handed me the reins.

Now that the parade floats are gone and the confetti has all been swept up, it’s on my shoulders to design, develop, disseminate, and evangelize. It’s an incredible opportunity, and one that makes me excited to get to work every day. But it’s as yet unprecedented here (Erin Lynn Young‘s impressive efforts notwithstanding), and it is not being introduced into a vacuum. The agency has a long history and a considerable legacy: workflow management, organically grown methodology, and (like any good agency) no lack of strong opinions.

So starting this week, I am going to attempt to chronicle the thinking, objectives, and tactics I’m using to develop my small group. Hopefully the trials and tribulations of accommodating “big UX” to the idiosyncrasies of integrated marketing will serve as some inspiration (or warning) to those of you facing some of the same challenges. Questions, contrary opinions, comments, and rants are all welcome.

Good luck and godspeed to all of us, and stay tuned for more.

Mar 11

Put Your Process On a Diet: Adapting Lean UX to Agency Process

I recently attended a talk by Jeff Gothelf on what could best be characterized as a variant of Agile methodology tentatively titled “lean UX”; the ‘lean’ here referring to a reduction in the ‘fat’ or ‘weight’ of static deliverables and the culture of long feedback loops and silos that surround them. That talk dealt specifically with internal product development teams, but made brief reference to the issues associated with adapting the model to agency life. This article is my stab at that adaptation. [Edit: I’ve since been reminded that I read a similar idea elsewhere before attending Mr. Gothelf’s talk, so this may read like a rehash. Either way, it’s something I was thinking about before I came across that piece – so take this with the attribution it deserves, but know that I share the concept.]

Don’t Forget Why We’re All Here
The point is well taken that deliverables weigh on a design process. These usually codify as much about the process itself as they do about the product being designed; but a client isn’t paying for process, they’re paying for what the process delivers. While it’s true that process deliverables tend to build trust, if your client doesn’t trust that you’re working toward something (or if you don’t trust your own team without these milestones) then you probably have bigger problems than this article deals with.

So how does it work? The first step is one you should be familiar with: agreeing on the goals.

The Project Begins: Don’t Sweat the Details
It’s imperative to clearly define broad goals early on so that the team can move immediately into creating the user stories that will be articulated in design. This is usually covered by a project brief, which is typically the product of an account team (often with some input from discipline leads). The lean approach differs mainly in that more team members are involved, or the goals are kept less specific. What we’re attempting to avoid here is specificity to a level that constrains creative exploration.

The User Story
User stories derive plain-English objectives from the project goals that the design must deliver on / communicate. User stories are technology / channel independent. (I realize that this is a major feasibility issue, given client-side budgeting or organization. Bear with me, this is as close as I can get to an ideal adaptation.) User stories define objectives from the perspective of the end user, most often oriented toward task completion. User stories differ from use cases in that the latter are groups of tasks combined to create a total experience. Use cases are specific to personas, which are fictional representations of target user groups or distinct sets of constituents. It’s important to note that user stories are not user flows – that is, prescriptive task-completion steps – but simply ‘discrete tasks users want to accomplish’ with this product.
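The distinctions above can be sketched in code. This is purely an illustration of the vocabulary – the names, fields, and example content are my own invention, not a prescribed schema or any tool’s API:

```python
# A minimal model of the terms defined above; everything here is
# illustrative, not a standard data format.
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """A discrete, technology/channel-independent task a user wants
    to accomplish, stated from the user's perspective."""
    objective: str

@dataclass
class Persona:
    """A fictional representation of a target user group."""
    name: str
    traits: list[str] = field(default_factory=list)

@dataclass
class UseCase:
    """A group of tasks combined to create a total experience,
    specific to a persona."""
    persona: Persona
    stories: list[UserStory] = field(default_factory=list)

# Example: the stories stay task-level; the use case strings them
# together for one persona. (A user FLOW would instead prescribe the
# step-by-step path to completing one of these tasks.)
commuter = Persona("Busy commuter", ["mobile-first", "time-constrained"])
checkout = UseCase(commuter, [
    UserStory("Find a product I've bought before"),
    UserStory("Pay without re-entering my details"),
])
print(len(checkout.stories))  # 2
```

Note what is deliberately absent: no screens, no steps, no technology. That absence is the point of keeping stories channel-independent.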

The Lay of the Land: Discovery
Discovery follows the creation of user stories, and is just that: exploration and observation with the goal of achieving a particular outcome. How do users accomplish these tasks using other products? How do they accomplish these things without the benefit of a mediating product? Who uses the current product (if one exists) and how do they use it? What are the bottlenecks or pain points? What gaps are there between what users currently do and what they want to do? Are there analogs to the proposed functionality that we can learn from? What do we know about the user groups and their other constraints / limitations / attributes? It’s important to note that these users include admins – just as this discovery period includes content audits / matrices, inventory of existing assets, and editorial calendars.

The UX Brief is Your Best Friend
Fighting deliverable inflation doesn’t mean abandoning them completely; an excellent example is the UX brief. The UX brief codifies the research inputs, user stories, observations, and other tangibles from the discovery process and presents them as validations of the organizing principles of what will be designed. Any insights synthesized from internal discipline experience or research are distilled to define the techniques that will be employed in the following design phase. The document itself is set up in such a way as to be a constant reference for the duration of the project and a touchstone against which any further artifacts are measured.

The Internet Doesn’t Forget: Take It Online
Tools like group blogs or content sharing sites can be great not only for capturing the moment-to-moment insights of the design process, but also for tagging and inventory of these. Using simple structures like tags and categories, the timeline of user stories’ evolution can be more easily visualized and communicated to the client. Sharing both successful and unsuccessful sketches / prototypes / ideas ensures that no accumulated wisdom is lost, and facilitates ongoing client engagement, not to mention serving as an early-warning device for client impasses.
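As a sketch of how little machinery the tagging idea requires: given dated posts from a shared blog, each user story’s timeline falls out of a simple group-by on tags. The post structure here is an assumption for illustration, not any particular tool’s format:

```python
# Reconstruct per-story timelines from tagged, dated posts.
# The dict structure is assumed for illustration.
from collections import defaultdict
from datetime import date

posts = [
    {"date": date(2011, 3, 1), "tags": ["checkout-story"], "title": "First sketch"},
    {"date": date(2011, 3, 8), "tags": ["checkout-story", "rejected"], "title": "One-page variant"},
    {"date": date(2011, 3, 15), "tags": ["search-story"], "title": "Search prototype"},
]

# Group post titles by tag, in chronological order.
timelines = defaultdict(list)
for post in sorted(posts, key=lambda p: p["date"]):
    for tag in post["tags"]:
        timelines[tag].append(post["title"])

print(timelines["checkout-story"])  # ['First sketch', 'One-page variant']
```

Because rejected ideas are tagged rather than deleted, the “unsuccessful sketches” mentioned above stay queryable alongside the winners.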

Behind the Curtain: Design In Action
Concurrent to sketching, whiteboarding, mood boards, and copy exploration, physical interactions are explored and documented by the creation of rapid prototypes. Copy volume can be explored in these prototypes as required, but their main goal is to define interactions. These rapid prototypes are the clearest and most useful expressions of what will be the final product. By deliberately avoiding visual design and brand expression, these functional prototypes solve serious business problems without losing valuable time in conversations about superficial nuance. This approach also avoids marrying or selling in a specific visual concept that may ultimately work to the detriment of functional requirements.

The benefit of rapid prototyping can’t be overstated: it illustrates the pros and cons of a specific user interface in ways no other artifact can – both to the team and the client. It integrates dev into the design process, encouraging their ownership and facilitating knowledge share, and the resulting product is immediately available stimulus for formal or informal user testing. Without a true rapid prototype to test theories against, everything we do amounts to an educated guess. Why guess when you can know?

Built intelligently, rapid prototypes become reusable code that jumpstarts the creation of a final product. Technical dependencies are identified earlier in the project cycle, allowing more time to solve for them or design around them. Design isn’t driven by a static visual concept, but rather a real, human interaction – a much more compelling and tactile theme. And since the visual team is involved throughout, they are never relegated to applying ‘lipstick’ or ‘painting wires,’ as can happen so often in the development of complex interactive experiences. Everybody wins.

The process in sum is designed to move away from traditional deliverables. The brief prepared by account is almost always a static document lobbed back and forth over cubicle partitions. The wires or user flows prepared by UX are too often misunderstood by clients. The visual is rarely defined by the message that must be communicated or the interaction / transaction most crucial to the success of the project. Copy decks are unwieldy and disconnected from the shifting form of the interaction and end up unread. Rather than retaining these siloed perspectives, this process requires active and ongoing participation from all team members – not simply according to their strengths, but also according to their general discipline and industry knowledge. The whole team claims ownership of the product, and the client gets to see real progress and return on their investment.

A Roadblock: Is Your Team Ready?
Teams capable of carrying out this kind of process are low-ego, participatory, actively engaged, and open with ideas. Some concepts may originate with more experienced team members, but each member owns the final product as much as the process; failure belongs to everyone just as much as success. No one can free-ride, just like no one can sit back in their chair, arms folded, to spectate. Knowledge or practical experience doesn’t have to be entirely level, but engagement absolutely does. Everyone takes a stand at the whiteboard, and everyone contributes whatever they can.

A Bigger Roadblock: Is Your Client Ready?
The undoing of this method is the client whose internal process is decrepit. It is possible to attempt to achieve some semblance of this quasi-Agile process amid old-school waterfall client reviews, but these defeat the purpose. At the very least, a client must be prepared to be an active participant in their reviews – the week-long circulation / rumination period for feedback is an inertia-killer and wastes valuable time. A client can be educated upfront about how the process will unfold, but without serious and sustained adherence to the core process by both client and project team, the entire product can be compromised. That sounds excessively alarmist, but I have seen it happen.

Summing It Up
So ends this back-of-the-envelope adaptation of lean UX to the agency process – consider it, perhaps, a really large envelope. To be sure, this is a potentially brutal re-education and reconfiguration of internal organization and process; you may not have the political clout to effect it, and it probably doesn’t work exceptionally well in all situations anyway. It made perfect sense in the case of a huge, scratch-built e-commerce design, but may need to be further refined (or mostly ignored) for that upcoming banner campaign. Ultimately, it’s not intended to supplant what currently works for you – only to call into question some problematic, endemic practices I’ve observed. Please hit the comments with any questions or refinements.

Mar 11

The Real Design Problem In Your Design Problem

I throw myself into my work. I throw myself into my process and methodology – my effort, my heart, and my soul. I’ve never been a company man, but I’ve always deeply identified with the quality of the product I’m involved with. Which is why I take it so terribly personally when I see shit work.

It doesn’t just apply to mine, either. Your shitty mobile campaign or half-assed dishwasher interface is offensive to me to an extent I find hard to articulate. Like a living creature found tortured. A profoundly wasted opportunity at beauty and meaning, and an artifact of an individual so foreign to me in morality as to possibly be extraterrestrial. Who could do such a thing?

Yeah, a bit heavy-handed. But true, nonetheless; while the former is an important comment on the meaning of interaction (and the meaning of life), there is a very real fear in the latter. Who was the person who thought this acceptable? How could they think that? How could there be someone else doing all the things that I do, waking up to this career and vocation every morning, who felt like this was an adequate level of quality?

(I realize how much of a hateful, impossibly serious person I must appear; it’s not so dire. I think that in order to survive these constant impulses, one has to develop an incredibly strong sense of humor. Dark or dry though it may be, it’s the most personable coping mechanism I can think of.)

The problem with most design problems is that their roots run deeper than what might show up in an RFP. The crap final artifact is almost never the fault of the designer alone, though they do tend to be enablers. What I know from working with large corporations is that the real, critical problems tend to long predate the resulting design disasters. Accountability, internal communication, effective user research, employee morale – these are the client-side killers of good design, and most certainly much more expensive problems to address than a new logo or styleguide.

Too often, design firms step in to mediate conversations that corporations should be capable of facilitating on their own. Does your marketing team know what kinds of changes are slated for the product lineup? Has your C-suite ever really seen how your customers use your product? Did your IT department have a seat at the table when you drafted that RFP? If the answer is no, then a good design firm will suss it out. By that point, however, it might be too late.

It’s naive and idealistic to suggest that design firms become management consultants or that they solve all of a company’s internal issues. But, there. I said it anyway. Here’s why: design firms can’t polish a product that refuses to shine; they can only bring more eyes to its blemishes. A good design firm can help you pinpoint the issues that bind your internal machinery, and propose ways for you to address those – we designers ought to be experts on the subject, since it’s our job to be observers, communicators, facilitators, and problem-solvers. Regrettably, too few of us excel at all these. And too many clients select us because we reflect their own failures.

I know we can’t always solve our clients’ problems, even if we can sometimes solve the problems with their products. I don’t mean to bitch out the entire design industry. Consider it ‘tough love,’ coming from a human being who cannot see his place anywhere else – and one who truly believes that designers of any stripe practice an art that will one day solve the world’s biggest, most intractable problems. I’m not just a hater. I’m a lover, too.

So what is a firm to do? Perhaps the best we can expect is to solve our own workflow problems, and shore up our own communication streams. The next time you want to bitch about a difficult client or a terrible product, take a long look in the mirror and ask yourself why they chose you to represent them. Was it because you showed them a vision of what they wish they could be? That’s a great first step. Maybe it’s because you really can solve their problems – the large and the small ones. If so, that’s excellent – carry on. Maybe you were just the cheapest option. But maybe you just reflected a process or culture that they identified with. And if so, might it not be educational to consider observing your own internal machinery before you type up that next case study?

Mar 11

SXSWi 2011: A Recap


It’s tempting to suggest that 2011 was the year SXSW Interactive ate itself; devoid of significant launches and crammed with overwrought advertising, what was once an eccentric geek meetup had certainly found a trough. Like the technology that underpins it, mainstream acceptance / interest has tempered the conference’s edge. But as with any movement, wide accessibility and maturity benefit as much as they detract.

As the novelty surrounding social media and always-on location-based services fades, a quiet introspection has surfaced. Alongside group-texting arms races and corporate-sponsored mashups were well-attended panels on “flexible morality” and the inherent social engineering in user-centered design. Debates ensued on the ethics of “brand journalism.” Academic expositions on both design nomenclature and methodology drew top billing. Panelists and presenters addressed what appeared to be a growing interest in humanizing the technology and processes we use to design.

In short: a significant subset of those not seeking jobs or shilling products seemed to be searching for meaning in the tools so many of us have come to take for granted.

It’s too soon to suggest that a revolution is taking place, as exciting as that might be. More likely, an industry long-fueled by youthful exuberance and characterized by neoteny has begun to mature – inspiring some serious navel-gazing alongside its monetization schemes. Whether that is a symptom of an economic downturn, age, or industry expansion isn’t entirely clear.

What is clear, though, is that as corporations seek to be perceived as individuals and individuals begin to brand themselves using corporate marketing techniques, a unique convergence is occurring. As the tools and vocabulary of product development grow ever more accessible to the layperson, our expectations rise; audiences once relatively passive have begun to expect the same reputation and accountability from the corporations whose products and services they consume as they do their friends and family. As marketing has begun to pervade daily life, consumers have begun to seek more and deeper meaning in the transactions they initiate.

I hesitate to use heady language after a week of open-bar and free-grilled-cheese fueled panel-chasing, but a cultural inflection point may be upon us. As empowered producer-consumers, the digital audience is only just beginning to grasp the enormity of the opportunity that has fallen into its hands. From this point on, I forecast more and sophisticated dialogue on the means, methodology, and – most importantly – meaning in digital media. If we’re lucky, an age of public introspection is in ascendancy.

Jul 10

Of Cows, Tweets, and Bureaucrats

The other day, I read an article about tweeting dairy cows. Forgive the mental leap, but I couldn’t help but think of state and federal workers.

When I worked on the TheStreet.com overhaul a few years ago, it became clear that one of the primary challenges would be establishing reputation; as a financial advice and opinion site, the ability of columnists and contributors to make good calls was an implied baseline success metric of the entire site / product. One of the ways this was achieved was through columnist blogging; the blogs’ archives became reference material against which individual advisors could be judged. The problem was that the time-sensitivity of the most active readers precluded the kind of investment involved in effectively vetting a columnist.

Enter the reputation engine – the community experience that invited thorough and varied qualitative engagement but still maintained a sort of abstracted set of performance metrics for each user.

It’s not exactly groundbreaking at this point, but “likes” or “diggs” or “thumbs up” or “recommends” are easy examples of reputation engines in action. The concept resurfaced in the Nike.com redesign, specifically to be implemented in the couture environment of Nike Sportswear as a means for creating a sort of game mechanics for community members to increase individual standing and influence. With the Verizon Wireless Customer Council, we tagged ‘super users’ and invited them to a special community with product-related perks as a CRM. And again, the concept was integrated into recommendations for Southwest Airlines’ travel community. Your reputation was an abstracted and persistent graph of your engagement (and the community’s opinion of it).
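Stripped of product specifics, a reputation engine of this kind reduces to a very small core: persist each user’s engagement events, weight them by type (and by how the community received them), and expose the aggregate as a score. Here’s a minimal sketch in Python – the class, event types, and weights are all hypothetical illustrations, not taken from any of the projects mentioned above:

```python
from collections import defaultdict

# Hypothetical weights per engagement type; a real system would tune these.
EVENT_WEIGHTS = {
    "post": 5,
    "comment": 2,
    "recommend_received": 3,     # community approval raises standing
    "thumbs_down_received": -2,  # community disapproval lowers it
}

class ReputationEngine:
    """Tracks a persistent, abstracted score per user from raw engagement events."""

    def __init__(self):
        self.events = defaultdict(list)  # user -> list of event type strings

    def record(self, user, event_type):
        if event_type not in EVENT_WEIGHTS:
            raise ValueError(f"unknown event type: {event_type}")
        self.events[user].append(event_type)

    def score(self, user):
        # The aggregate is what the community sees; the raw event log
        # remains available as the qualitative record behind the number.
        return sum(EVENT_WEIGHTS[e] for e in self.events[user])

engine = ReputationEngine()
engine.record("columnist_a", "post")
engine.record("columnist_a", "recommend_received")
engine.record("columnist_a", "thumbs_down_received")
print(engine.score("columnist_a"))  # 5 + 3 - 2 = 6
```

The point of keeping the raw events alongside the score is exactly the trade described above: casual readers get the abstracted metric, while anyone willing to invest time can still audit the full history behind it.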


The cows in the above story have been equipped with RFID collars that communicate with their milking machines to track a set of typical metrics – milk produced, for example. The machines record the data and use the RFID to identify the specific data set’s owner. What makes this more interesting is that these data snippets are then transmitted automatically as tweets. Apparently they have these tweeting collars for dogs and cats, too.
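Mechanically, the collar-to-tweet pipeline is nothing exotic: the RFID tag keys a reading to its owner, and the reading is serialized into a short status message. A toy sketch of that last step (the tag IDs, field names, and message format here are invented for illustration, not drawn from the actual product):

```python
# Map RFID tags to animal names; in the real system this association
# would live with the milking machine's data store.
HERD = {"rfid-042": "Bessie", "rfid-107": "Clover"}

def reading_to_tweet(tag, liters, timestamp):
    """Serialize one milking-machine reading into a short status post."""
    name = HERD.get(tag, "unknown animal")
    return f"{name}: produced {liters:.1f} L of milk at {timestamp}"

msg = reading_to_tweet("rfid-042", 12.5, "2010-07-14 06:30")
print(msg)  # Bessie: produced 12.5 L of milk at 2010-07-14 06:30
```

Substitute "employee ID" for the tag and "cases processed" for liters of milk, and the same three-line template is the bureaucratic reporting system imagined below.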

So why not donkeys and elephants?


Living in Texas exposes a blue heart to all sorts of red state horrors, but one of the simplest and least subjective dialogues that occurs concerns ‘big-government vs. small-government.’ Setting aside for a moment the ‘states vs. federal’ aspect of it, a more accurate interpretation of this debate might be ‘big bureaucracy vs. small bureaucracy.’ In either case, bureaucracy is a pariah and common evil.

Generally-speaking, bureaucracies are required to process and maintain most non-emergency legislative directives – anything that involves regulatory government oversight or the spending of federal tax receipts. Whether large or small, these bureaucracies are necessary in some form to support the laws that govern civil society. But as anyone who has ever visited a DMV can attest, there are inherent organizational deficiencies; most notably, a lack of internalized quality control or competitive drive.

What does stand to enforce competitive drive in these organizations (at any level – whether municipal, state, or federal), is public oversight. And while general audiences may not be entirely interested in the vigilance required of watchdog groups, it should be a democratic imperative to provide better oversight tools. Enter the tweeting cows.


Obviously milk production and biometrics are fairly straightforward when compared to the vagaries of civil service; however, that doesn’t mean bureaucratic performance metrics should be given up as a lost cause. And obvious, too, is the need to carefully vet the metrics used, so as to avoid scofflaws or other potential abusers of this system. Regardless of the deficiencies, imagine the kind of beautiful chaos and disarray created by a legislative requirement that government agencies draft, submit, then account for quantitative performance goals – better still, imagine a system that ties goals back to specific individuals. And then tweets a daily or weekly (or activity-based) report of each employee’s performance. Sounds pretty wild.

Wild though it may be, could it really be argued against? Sure, it smacks of micromanagement-by-majority, or worse still, invasion of privacy. But when it comes to maintaining accountability and efficiency of public servants – even (especially?) when government duties are contracted out to private suppliers – who would be audacious enough to suggest these individuals are beyond simple monitoring?

This is the basis of an idea I will continue to develop, both in form and content. Enhanced governance, I believe, should be more than paying your parking tickets online. We have the tools at our disposal to revolutionize the way our particular strain of democracy functions – not simply in representation and action, but in maintenance and oversight. These tweeting cows and bureaucrats – while decidedly unglamorous – may well stand to be the next major advance in civil society.

Jun 10

Electroshock, Automated Phone Systems, and the Barrier Principle


“If it is not ethical, it cannot be beautiful.” – Yves Behar

I’ve spent the last hour or so reading through some of Jon Kolko’s presentations, which is where the above quote was cherry-picked. Whether one agrees with the premise or not, it poses an interesting question: what makes something beautiful? Given the context, I’d refine that further: what makes a design good?

My impulsive reply is “intention and outcome,” although brief reflection is all it takes to call those factors into question, especially when related to each other; intention is rarely the final arbiter of application, and even well-intentioned designs create unforeseen negative externalities. So if not the designer’s imperative, is the relative good or bad of a design left to fate or human whim to decide?

I recently spoke with an old friend from graduate school. She, like many of my fellow urban planning students, pursued a career in economic development – hers was specifically in development consulting for distressed or low-income neighborhoods. She worked with community groups and local planning organizations to help craft development plans and improve conditions.

She is, in effect, one of the purest practitioners of user-centered design.


Abstracted far enough, both our jobs and our non-working lives all begin to appear as interaction design practice, all with the user at center; each interaction we have with another user is an opportunity to iterate and refine. What struck me as interesting about this is the difference our mental models make in the form our interaction designs take.

Seth Godin points out the interesting social engineering that went on in ancient Jewish prohibitions on usury: while it was illegal to charge interest on money lent within one’s community or family, it was entirely accepted when the debt belonged to an outsider. The rules of interaction were designed in such a way as to define community boundaries and strengthen the bonds within; the lending of money between community members created a shared responsibility for a successful outcome.

Juxtapose this to a large corporate organization, and the contrast is stark.


There is a critical point in the population of a corporate entity beyond which the ‘typical’ employee is no longer within reach of the end user. As far as my anecdotal research goes, I’d place that at somewhere around 200-400 employees, though I grant that certain industries (especially those whose organizing groups are small consultant teams) may be exceptions. Generally-speaking, personal access to end users (and vice versa) creates shared responsibility in successful outcome – service or product. The thickness of an intermediate layer breaks or stymies that interactive component, and the results are (albeit often subtly) catastrophic.

What makes the problem more complex is the realization that the confounding layer can take all kinds of forms – automated phone answering systems, excessive levels of management, unintended employee disincentives, etc. Calling a customer service department or visiting a post office are easy examples, but one of the more pernicious ones can simply be organizational imperative.


That verbiage may sound unfamiliar outside of the design business, but the implied assertion of personal agency seems universal. The realization that your client views the final design as a vehicle for marketing blandishments and brand guidelines is like hearing that your signature dish is a vehicle for grocery store condiments. What it signals, though, is that the internal relationships between marketing and management have become stronger than those between the company as a whole and the users who support it.

The Milgram experiment was intended to study conscience and obedience to authority, but I’d suggest there’s something to be learned there about the intermediary layer and human relationship – especially where it comes to designing human interaction. In that scenario, the ‘designer’ was rationally separated from the ‘user’ by both a physical barrier and the implied authority of the researcher. Milgram explained the ‘unwitting’ sociopathic behavior as evidence of the theory of conformism and agentic state theory: the first, a manifestation of greater connection between group members and their hierarchy; the second, a state of viewing oneself as the instrument of another and therefore bearing no responsibility for outcome.

It may be a bit of a stretch to invoke Milgram and controversial psychological research. But if it was that easy for two-thirds of participants to ignore their empathic and moral concerns in the face of organizational imperative, then there may be some truth to this ‘barrier principle.’


Any interaction designer worth his salt will tout the clarifying power of user research; even the most self-concerned will admit it is great insurance. What makes it so important is its implicit requirement of actual human interaction – the fundamental practice that underlies everything we design in the first place. More so, it is human interaction under the designer’s lens – exploring not only the how but also the why and what for. Only by being there can we really know what is good, and why it is beautiful.

May 10

What We Mean When We Talk About Platforms


A couple years back, I worked with R/GA to develop a new design for Nike’s divisional websites that would allow a greater focus on content production and the ability to foster cross-regional resource-sharing among employees. Not long after, I put together a strategy with Razorfish for localizing Dell’s sales efforts (and web presence) in emerging markets, while still hewing to an overall corporate brand standard. Just recently, I offered Chase a concept for in-market testing and optimization of their credit card product messaging. All of these were based on the concept of the website as platform. So what does that term mean?

The platform at its most basic is a structure. It’s been difficult to find a suitable metaphor, but a few have been tested: foundation, chassis, pizza crust, wiki, marketplace. Fundamentally, the platform is just a place where things happen, constrained by a simple set of rules that define limitations and initiate interaction. With this shared language, the heaviest organizational lifting is taken care of, freeing participants to engage each other and the medium as they see fit.


Abstractly – and this is hinted at by the “marketplace” metaphor – we experience this concept on a daily basis: countries have laws and languages that serve as boundaries to behavior and communication; games have rules to similar effect. A balance must be struck between the constraining power of the rules and the chaos of lawlessness; too much on one side or the other will stifle or smother. The same is true at any scale, but the equation changes when the scope of interaction is focused.


What the three client examples had in common was a need for a single, brandable structure that could handle change over time. All three clients needed consistent navigation and page structure, with varying degrees of customization – Dell needed customization of content by locale (in addition to language considerations); Nike required customization of visual design by component brand (basketball vs. football vs. soccer, etc.); Chase, the opportunity to customize content depending on user data. The platform concept addressed all these needs, and provided further benefit.

In a nutshell, this is what makes a platform work:

  • Standardized structure / template – a shared language and structure, comprehensive enough to define boundaries but simple enough to accommodate a multiplicity of components
  • Modular components – not simply functional widgets or tools, but content as well
  • Responsive logic – the ability of the structure to react to supplied data; whether that be user, medium, site activity, or others
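Those three ingredients can be illustrated in a few lines of code: a fixed template defines the shared structure, interchangeable functions act as modular components, and a rule selects among them based on supplied data. This is a deliberately toy Python sketch – every name in it is invented, and it stands in for what would really be a CMS or templating system:

```python
# Modular components: interchangeable content / function units.
def nav(ctx): return "[global nav]"
def hero_en(ctx): return "Welcome"
def hero_es(ctx): return "Bienvenido"

# Responsive logic: pick a component based on supplied data (here, locale).
def choose_hero(ctx):
    return hero_es if ctx.get("locale") == "es" else hero_en

# Standardized structure: the template defines the boundaries;
# its slots are filled by whichever modules the logic selects.
def render_page(ctx):
    slots = [nav, choose_hero(ctx)]
    return " | ".join(component(ctx) for component in slots)

print(render_page({"locale": "es"}))  # [global nav] | Bienvenido
print(render_page({"locale": "en"}))  # [global nav] | Welcome
```

The structure never changes between renders – only the components swapped into it do, which is precisely what let one platform serve Dell’s locales, Nike’s component brands, and Chase’s user-driven content.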


In future posts, I’ll detail how the platform concept / strategy worked in particular case studies – including the three mentioned above. When I do, I’ll update this post to link to those.

May 10

Who Stole the Soul?


Design, like the individual, is composed of two fundamental elements: the physical body and the soul. The overlap of these components is absolutely crucial in any situation. Products we use and appreciate appeal to us because we sense ourselves in relationship to them, and we sense the humanity of the individual who created them. This is why parents plaster refrigerators with their kids’ kindergarten fingerpaintings – if only for a very limited audience, these small fragments overflow with the soul and presence of their children.

There is a clear understanding of this phenomenon in the marketplace, too: do a search for wooden furniture and see how often you turn up the phrase “hand-rubbed.” It has nothing to do with the finish and everything to do with the soul imparted to the product by the creator.

Current design theory acknowledges this too – albeit in sort of backhanded way – in the concept of “human-centered design.” The idea is that instead of designing a face to a machine, we design a machine to support a human need. By doing so, we speak to the human user as an implicit participant in the design process. “We made this just for you, because we’re just like you.” It’s a great idea, and an appealing recollection of an age when consumers felt a greater spiritual connection to the products they consumed.


Too often, though, human-centered design is mistaken for a trope. Adding a pseudo-scientific layer to the concept, the design business refers to it as “user-centered,” effectively removing the human from the user. A human isn’t a key performance indicator, or a ‘unique,’ or any other set of correlated statistics. A human has a soul, and any good design has to speak to that soul.

Too many times in my career have I been set to make a concept work, or develop a fake soul to stand in for a real one – to imbue a product with depth and humanity. Products don’t have souls, and it’s impossible to pretend they do. Attempting to do so triggers our defenses, and from that point on, the design struggles with its own inherent antagonism.


In the last six months, I’ve spent most of my time working on the online and mobile marketing of credit card products. These products are usually very similar from one company to the next. Due to the level of competition in the marketplace (and in some cases, lack thereof), each product seeks to derive the maximum amount of usage and revenue from its cardholder while introducing just enough variety to distinguish itself from the rest of the field. These are business instruments, designed around their underlying accounting and financial structures.

For that reason, distinctions between products are hard to make. Advertising companies split their time two ways: attempting to craft stories and sentiment around the products, to imbue human presence; and attempting to streamline delivery to increase awareness and simplify purchase (conversion). The heart and the head; alternately, the gut and the wallet.

Consumers see straight through this.


My typical response, then, is to inform first and tell stories last, if at all. Make the text big, make the page comfortable, make the copy informative and simple, and reduce the number of choices the end user is required to make. Bare it all, expose the guts. Apply only as much visual design flourish as required to build trust in the product, and not a touch more.

You’d be surprised how poorly this advice is received. Show-hides. Animation. Asterisks. Stock photography. Eight different font sizes. Separate designs to receive separate campaign drivers. Promo spaces. Unsubstantiated claims and underdeveloped ‘ingredient brands.’ ‘Benefit creep.’

Suddenly we’re in the middle of the Hong Kong of credit cards, with neon messages flashing at us from all directions, none of which we can parse or understand. This isn’t human-centered design, and this has no soul. We proceed out of dire need or curiosity only. It’s an unfortunate way to attempt to win the trust of a customer.


I’ll continue to explore this in future posts, but what I see in all this meaningless storytelling is a failure of soul. To approximate soul with artifice – whether that be humor, touching stock photography, soothing music, warm colors, sentimental storytelling, or any other marketing trick – is to fundamentally disrespect the human in both your user and your product.

As is becoming the norm in these entries, I will end this with a sort of apology: I will probably continue to refer to myself as a User Experience professional, since that’s part of the current lexicon. I will probably continue to speak the language of user-centered design to clients, since that is what they understand, and because educating a shift in thought and speech like this is a long-term undertaking and I have to make a living in the meantime. What I will add, however, is that as practitioners, we will all benefit from the appreciation of and attention to the human within us and the soul we impart to our products. When we ask ourselves to neglect that, everybody loses.

May 10

The Best (Practice) Is the Enemy of the Good


La vertu s’avilit à se justifier.
– Voltaire

Virtue debases itself in justifying itself.

We had a debate in the office recently about technology – specifically, about whether or not to standardize the UX team’s wires in Visio or InDesign or whatever. The argument for this – admittedly, a good one – was that we would be better able to share resources and a central repository of accumulated knowledge in the form of a pattern library. I’ve found myself arguing for this same thing in past engagements, but this time I was on the other side. Eschewing my natural instinct to document and organize (sort of important in information architecture), I was advocating dispersal of knowledge and pluralism of platforms.

I struggled for a while to articulate the benefit of such a contrarian concept. It just didn’t feel like a valid use of time and resources, since a couple of us were on Visio, I was on Illustrator / InDesign, and we all seemed to have different skill sets. From the perspective of ROI, from the perspective of knowledge sharing and the reduction in redundant work (cf. Visio stencils), from the perspective of intraoffice standardization – it made total sense. But what, if any, would be the effect on creativity?


Jakob Nielsen is an ass. He writes like an ass and though I’ve never seen him in person, it’s not a stretch to imagine that he’s the guy that everyone has to invite to the party and say hello to, but no one wants to hang out with. How many links should I allow? How do I convince my client to let me design a scrollable page? Remind me again what the generally-accepted heuristics are? Just ask Jakob. He knows you don’t remember or you’re lazy and you want to cite him, and he’s never going to let you forget it. That’s pretty sad. But the point of this isn’t to knock Jakob Nielsen, only to question the ‘best’ in ‘best practices.’

Don’t get me wrong – books are great, libraries are great. Pattern libraries, just like regular libraries, revolutionized both the gathering and dissemination of individual knowledge. But (both in history and in design), documenting past performance will only indicate past performance. A man ignorant of his past may be doomed to repeat it, but reading a history book will not turn you into a fortuneteller. Just because a user expects a certain outcome on click, doesn’t mean it will always be so, nor should it.

Back in college, they told me that this is called linear history – the idea that present circumstances can be causally related through a chain of events and thereby traced all the way back to the beginning of time. It reflects a strong tendency in humans to seek out patterns and universal truths where there may be neither; it also reflects the fact that history is never written in the moment, or even concurrent to the events it describes – histories are always compiled and distilled as documentation of past events from a future perspective, in such a way as to give both past and present meaning. With such an understanding, future events can be conjectured and measured against as they proceed. Activity – events, milestones, data points – is arrayed like dots on a scatter plot where a single path can be traced as a trend.

This makes perfect sense, except when it doesn’t. Malcolm Gladwell wrote a whole book on how randomness causes massive shifts in these clean trend lines. Whether you believe his theory or not, nature at least partially supports it – even if there is a larger pattern that could be computed and documented, its scope is likely beyond our natural perspective.


The argument I’m approaching here is that while what is called ‘best practices’ serves a worthwhile business purpose, it also serves a dubious social one – assisting (and in many cases prescribing) conformity – the tyranny of the majority. Will it kill my creativity to copy and paste a drop-down menu? Probably not. Am I guaranteed to use a uniform form element size across all my documents? Definitely. Is there a tiny, slight, ever-so-minimally-nefarious reduction in my need to exercise professional judgment about what I am doing? Perhaps. But might I never really have reason to think about it again?


I’ll continue to use best practices to make arguments supporting the decisions I make in design – at this point, it’s a common language we speak with our clients concerning the viability of our ideas and the risk involved in their investments therein. But even in reproducing the same design elements over and over whenever necessary, I allow myself the opportunity to reconsider them in every case – I require myself to focus on them, even if just for a moment.

As design practitioners, we count on our individual experience to inform our work. By documenting our processes and studying the processes of others (and their conclusions), we incrementally expand our understanding beyond what we experience on a daily basis. Of course this is useful, and I don’t disparage it. But when we come to rely on these ideas at the expense of our own exploration, I have to wonder if it doesn’t result in some accumulated ossification of our creative faculties.

May 10

The New Incapacity


User testing is always interesting. No matter how much time and energy and effort you spend designing and vetting a site, you will – without fail – find someone who is completely stymied performing even the simplest of tasks.

Obviously part of this is just the luck of the draw, especially when that draw is from a pool of individuals with nothing better to do at 2PM on a Wednesday. Some of it, too, comes from the fact that no matter how comforting and friendly the moderator is, nearly every subject is prone to some kind of performance anxiety – the kind that makes you forget your wife’s name when making introductions at a party. Not that I’d know, of course, but I’ve heard about it.

What strikes me is how much we as designers accommodate this kind of behavior.

After a recent testing session, I had the luxury of a five hour flight back to Austin and the good fortune of having finished all my books on the inbound trip. Poking around in the airport gift shop, I came across a book I’d been meaning to buy for weeks: Shop Class As Soulcraft: An Inquiry Into the Value of Work.

The author – a Washington think tank refugee with a PhD in political thought from the University of Chicago, turned motorcycle mechanic – contends that there’s been a sort of ethical erosion accompanying society’s shift to white collar primacy and away from the objective reality of trade work. I’m not done with the book yet so I’ll withhold judgment on that particular conclusion, but I did note another of my own: as fewer of us interact meaningfully with our environments, has there grown a corresponding lack of attention and focus in parsing design?


That’s a mouthful, and a very long sentence. But here’s what I mean: as designers, we reduce and simplify products because – in large part, anyway – there is so much competition for attention. In an arms race for the attention of an increasingly multi-tasking, texting-while-driving, watching-television-through-dinner population, our designs compete with a thousand potential distractions. It used to take a week to get a letter from Albany to New York City. Now, even five seconds is a lifetime when waiting for Gmail to load. So it came to pass. But what is the trend leading toward?

I’m the first person in line to jump onto the next sketchy, geolocating, privacy-violating, credit-card-data-losing social network, so please don’t confuse me with a Luddite apologist. But watching test subjects struggle to explain concepts that are written in plain English on a monitor right in front of them – choosing instead to randomly click around a screen with no recall of their intentions or expectations in doing so – is, frankly, chilling. And I have to wonder if we, as designers, aren’t complicit – enabling this kind of blindness.

These individuals drive next to you on the highway, they process your Medicare benefits forms, they manage your employees. And yet simple tasks like parsing an unfamiliar environment or finding their way around a page render them completely incapacitated. It’s nothing short of horrifying, when you comprehend the scope of it.

What kind of dystopian future can we anticipate after this inexorable march toward touch-surface, mobile, video-wall everything? First, let us consider our complicated history with the basic button and knob.


Mechanical controls illustrated humankind’s first level of tangible abstraction – push a button, and the sluice gates go up or the bulb glows. Somewhere in the casing of the machine there were cogs and gears and wires, but you couldn’t see them, so over time there was no reason to consider them anymore. With solid-state electronics came impenetrable sophistication; I can conceivably tune a carburetor, but I can’t pretend to understand the secret life of the microchip that governs my fuel injector. It’s under the driver’s seat somewhere, from what I’ve heard.

Keyboards and mice were the next step: now we navigate virtual environments that use (in many cases) old mechanical visual cues to help us understand what’s going on – the blinking cursor, the file folder tabs, even the pulsing LED that ‘breathes’ to indicate that a computer is ‘asleep’. Another level of abstraction.

Touch may now require capacitive or resistive surfaces, but the emerging lexicon of abstract gestures (although some are, admittedly, based on tangible mechanics) will probably resolve to surface-less environments and augmented reality – a future of people waving fingers and hands in the air like schizophrenics (or more attractively, like Tom Cruise in Minority Report). Will anyone remember how to change their oil by then? Does anyone really remember how to do it now?

Regardless of the outcome, the trend stands as immediately measurable. And as designers, our job security depends on our ability to play the game per the rules on the field. But in a culture built on mass consumption, the product designer is a social engineer. And when digital media is a significant proportion of ‘consumed product,’ competitive reductionism – in both content and structure – presents a very clear danger, both psychological and physical.

May 10

Why I Don't Work in Advertising


I’d pretend to be reticent about admitting this, but I think everyone I know has already heard me say it: I hate advertising. The quintessential advertising archetype is the used car salesman – brash, disingenuous, pushy. Untrustworthy. Advertising has a bad rep because so much of it is polishing turds – masking product deficiencies by way of parlor tricks and sleight of hand. In an age where consumers have ever more perfect access to information, this creates a race to the bottom – the winner will be the most compelling liar. A truly dubious honor.

The future rests in the hands of those who can create real value on a personal level.

Last week I sat through a couple days’ worth of user testing for a credit card product. What struck me (beyond a startling inability of the test subjects to focus on simple tasks) was the heightened distrust of corporate communication displayed across the board. More simply, these folks’ bullshit-o-meters were redlining – they were exceptionally aware of any whiff of product embellishment or flowery prose. They came into the situation expecting to be lied to and manipulated.

I know this isn’t exactly news. But I couldn’t help but ponder what it means for my industry.


There seem to be a lot of design studios / agencies who position themselves as interactive elements of the greater advertising industry. I have to assume that this arises from both a communication of capabilities to potential clients and a reflection of where most project budgets come from; since most online work supports tangible products, client budgets are most often coming out of marketing communications or advertising. That makes sense, and I’m not arguing against that.

What I wonder is how this affects the perspective of a studio or agency with regard to its own development. As we begin as an industry to embrace social media and word-of-mouth marketing, the efficacy of pure advertising – the manipulation and management of brand / product / service perception – starts to break down. As consumers are empowered and engaged to proactively endorse (or malign) products in their own language and on their own terms, the relevance of advertising erodes to simple communication: here’s our product and here’s what it does and here’s where you can get it.

I don’t pretend to assume that the social web will truly level the playing field and create some sort of free market utopia, but what I do expect is that the value of advertising services (versus, say, public relations) will increasingly be called into question.


So what’s left for us? For starters, I’d expect at least another decade of ad industry domination – both in control of message delivery and in the hearts of digital agencies everywhere (if not just on the budget allocations of their clients). What smart agencies will do, however, is learn how to create real value – the content, strategy, and product / service that can be objectively evaluated and capitalized on.

As the cost of maintaining distribution networks decreases, and the use of micropayments increases, I suspect that product development will slip back into the hands of boutiques and individuals. Furthermore, as content and data standards achieve wider adoption, products themselves will require less of the corporate-style refinement they have needed in the past – further freeing product developers to pursue innovative or unique solutions.

So what does all this have to do with advertising? The smart designer / studio / agency will stop thinking in terms of advertising campaigns and marketing pushes, and will start thinking of themselves as product developers and inventors. I have no doubt that most client work will continue to follow traditional cycles and be voiced in the traditional nomenclature of advertising. What’s important is that we as designers don’t lose sight of the horizon. Clients will come around; they always do.

Feb 10

Social Media Survival Guide

(I put this together as an internal document to illustrate how a risk-averse client might be persuaded to accept a social media strategy that allowed actual user interaction, rather than one-way communication. As far as I know, the client accepted it; so there may well be something of value here. Enjoy.)


Like any social gathering, participating in a social media service like Facebook can invite disaster as quickly as it can success. A little preparation goes a long way.

The social lifestream (in this case, a news feed) can be a fast-running current if it is sufficiently popular. Just like a real current, it works to your advantage to paddle with it rather than struggle against it.

The typical social media crisis for a product or brand is the viral complaint: it only takes one user to start a snowball effect of ill-will that – while typically fleeting – can be painfully memorable. The remedy, however, is simple: respond.

Nasty comments, adolescent remarks, foul language, and other tangential nuisances are easily dealt with – establish your policy early and clearly, and make sure your moderator reminds the thread of that policy whenever an action is taken.

But what of the valid and cleanly-worded complaint? In this case, it is an incredible asset to have a human moderator – humanized with personality, some humor, some compassion – rather than a standard set of boilerplate marketing speak. Some common complaints and their appropriate responses:

“I have a problem with your product.”
How do you feel we could make it better?
We appreciate your help in improving our product. I have forwarded your input to [name and info] in our product development team.
What about the rest of our fans – how do you feel about this product / feature?

The key is to be open and work with the fan to suss out the root of the problem. Ultimately, you can always give a complainant the contact info of a customer representative. Most complainants will either take you up on it or let it drop. If they take you up on it, you can follow up in the thread later to note that – either way it works in favor of your reputation for active and responsive customer service.

“I have a problem with your company.” or “Why did you do [x] to me?”
I’m sorry that you feel that way – on behalf of my company, I want you to know that [company name] makes every effort to address its customers’ complaints. But in order to help you, can you be more specific about what is wrong? Is it a problem with [product], [policy], or [etc.]?

Discretion counts when responding to a comment like this. Was it a pot shot? You’re likely fine to let it go without responding; the community is fairly good about self-regulating and will recognize a cheap or unsubstantiated comment and let it fade away quickly.

But what if the comment was substantiated, or what if others have jumped into the conversation? Be prepared to challenge the commenters (always politely and professionally) to specify details. A disproportionate number of complaints of this type are vaguely worded and ill-researched rants, neither accurate nor prescriptive. In cases like this, focus on sussing out details – if the commenters don’t offer any, be ready to offer your own defensible ones. Forcing abrupt and angry people to be detailed and specific tends to defuse their arguments quickly.

“Why won’t your [product] do [x]?”
That’s an interesting suggestion, how do you see that working? I’ll definitely forward that on to [product manager] for their review. [Follow up with comments from a product manager, even if they are only appreciative.]
What is it you’re trying to do exactly? You may be trying to do [x], in which case you might want to check out [feature]. Here’s how it works / more information on it… [description / link away to company site].

Your community manager must be (or have access to) a product expert in cases where specific product questions are asked, but this is perhaps the best possible use of the space – keeping the focus on the product and on the product’s benefits.

“Hey, check out my [company / website / Facebook page, etc.]! Here’s a link!”

If it’s pure spam, just delete it and remind people of your policy (which should cover this type of abuse). If it seems valid – if you can see that the commenter is a customer – feel free to engage them, always keeping the conversation focused generally on the topic area of your product and product features, and steering it back toward how your product and initiatives may have helped them or could help them.
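The branching logic running through the complaint types above can be sketched in code. This is purely my own illustration – the category names and the `triage` function are hypothetical, not part of any real moderation tool:

```python
# Hypothetical sketch of the complaint-triage logic described above.
# The categories and strategy names are illustrative only.

def triage(comment: dict) -> str:
    """Return a coarse response strategy for an incoming wall comment."""
    if comment.get("spam"):
        # Pure spam: delete and remind the thread of the posted policy.
        return "delete-and-restate-policy"
    if comment.get("abusive"):
        # Foul language or adolescent remarks: act per the stated policy.
        return "moderate-per-policy"
    if comment.get("specific"):
        # A valid, cleanly-worded complaint deserves a human reply and,
        # if needed, escalation to a named customer representative.
        return "engage-and-escalate"
    # Vague pot shots can usually be left to the community to self-regulate;
    # watch whether others join the thread before deciding to respond.
    return "monitor-or-ask-for-specifics"

print(triage({"spam": True}))      # delete-and-restate-policy
print(triage({"specific": True}))  # engage-and-escalate
print(triage({}))                  # monitor-or-ask-for-specifics
```

The point of the sketch is the ordering: the cheap-to-handle cases (spam, abuse) are filtered first, so that human judgment is reserved for the substantive complaints where it actually matters.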

Feb 10

The Future of User Experience is Burnt Toast


The present mobile web experience involves carrying a tiny version of the internet around with you wherever you go. The web itself is just a series of data feeds, requiring some kind of visualization engine to make any sense to the human eye. But will it always? And if not, will it even require a human eye? After all, we have more senses, and communicate in more nuanced and primordial ways. Intriguing, for sure. But while some functions of the web will remain the stuff of viewports and text and pictures, there is a very real trend beginning that won’t be browser-based. In the future (cue Thus Spake Zarathustra), the computers won’t just be there wherever you go – they’ll be in everything, wherever you go.

It’s already happening, to some extent. GPS-supported mapping in our cars was a gimme, but look a little farther: Fitbit follows you like a health-based LoJack, talking to computers (and your friends) about everything you do. Foursquare (and now Yelp) tell your friends where you are and connect you with total strangers, not to mention highly-relevant promotions or deals. Blippy reports your credit card usage and purchases to a cadre of your (hopefully) closest confidants. Nike+ reports your runs to Facebook. And so on, ad infinitum. Human experience has become the lifestream. Slap a data visualization on it, and suddenly life imitates art.

So what’s next? Forrester points to an expansion of mobile web in the near term, and recommends the development of multiplatform personas and, of course, more focus on the mobile experience. Sure. But we have to look at a broader picture – will the future of mobile web still be confined to the legacy of rectangular screens?

To answer that, I’m going to look to another Forrester report, “The Future of Online Customer Experience.” In the report, they suggest an acronym to explain their thesis: CARS – Customized, Aggregated, Relevant, Social. For anyone paying attention, this shouldn’t be a surprise. But while they go on to illustrate these concepts in terms of existing technology, one tenet in particular caught my eye: Aggregation.

The Forrester recommendation was to atomize content and functionality. I wholeheartedly agree – I’ve been advocating design modularity for years; most recently where it came to global localization strategy, but to lesser extent while reluctantly jumping on the Agile bandwagon. But what’s emerging from this atomization is the non-standard interface – the Nike+ sportband and Ambient Orb come to mind – and that’s what I find so intriguing.

At this point, everyone has seen the Perceptive Pixel multitouch walls and the MIT Media Lab SixthSense projected interfaces. They are exceptionally interesting and innovative technologies, and I hope to spend more time working with them. But they’re still new adaptations of legacy technologies – the irony of projecting GUI ‘buttons’ onto your own hand to dial a phone should be lost on no one. What will be the next real advancement in interface design?

Consider the basic analog breakfast interface: the kettle that whistles when boiling, or the toast whose smoke makes us jump up to turn off the toaster – even when we can only smell it. Temperature, movement, smell, sound – non-visual cues on a timeline – how do we as practitioners anticipate and design for these interfaces?

I think the big idea, anticlimactically, is to simply remain flexible and observant; these types of physical interactions are probably some of the most universal and culturally-agnostic, and so provide a great platform for accessibility. Conversely, they are the least sophisticated and may require substantial effort if they are to be the basis of a new interaction language. At least it is reasonable to assume that we are all starting with the same alphabet.

While I’ll continue to make my recommendations based on existing design “paradigms,” I’ll have my eye on the future and continue to explore how our now traditional screen-based systems can be augmented or ditched altogether. Because when the mobile web actually becomes just a bunch of stuff – the internet of things, as I’ve heard it referred to – we will need to learn how to speak its language.

Nov 09

Isolation in Overshare: the State of Privacy

The other day I was watching a National Geographic documentary on prisons which went into great detail about both the daily life of the prisoners and the oppressive architecture (intellectual and physical) of a maximum security prison.

What struck me most was that within this device – painstakingly designed to contain and defuse dangerous criminals – there still existed an impressive degree of individual agency. In the most forcibly public of venues, inmates managed to create and police their own society: crafting weapons; marketing and consuming drugs and other contraband; creating, maintaining, and growing gang ‘communities’. In the least private place on earth, these unwilling participants managed to create their own privacy.

You know, like Facebook.

It’s a mental leap, but bear with me.

At an office meeting the other day, someone brought up the topic of oversharing. As an early adopter of many online social networks (Friendster? Makeout Club? I’m not bragging, but let’s just say I go back to green screens and Telnet), I’ve had a personal interest in this ongoing conversation for some time. In most cases, the privacy proponents (“pro-restrainers”) strike a sentimental chord, seeming to invoke a simpler era when oversharing involved excessive drinking at the office holiday party or answering the door in your underwear. The implication is one of moral degradation.

To wit: “overshare” was Webster’s 2008 Word of the Year. And there is good reason for concern: like many of human civilization’s major technological advances, this one is accompanied by societal shifts, a redistribution of political control, and the ever-present specter of good ol’ abuse-for-profit.

Your credit card company watches where you spend and what you buy to determine your creditworthiness. A potential future employer finds compromising photos of you on Facebook. A prowler stalks you through your copious geolocated status updates on Loopt, Latitude, Foursquare, or Gowalla. Danger abounds. Eyes are everywhere. Big brothers (and sisters, nephews, second cousins) are watching you.

To quote the back window of a Camaro I saw recently, I “ain’t skeered.” Why? Ironically, I thank the prison documentary. What if we can invent new ways of being private? What if our conception of private is shifting? What if our level of public comfort is simply an evolutionary variable? In my mind, the California penal system is as much an evolutionary laboratory as the Biosphere, if not more so. The evidence here points to an intellectual adaptation.

The immediate example is industrialization: the industrial revolution was a technological one that had a profound, wrenching impact on human civilization. Not only did it forge the modern city, but it was also the most fundamental restructuring of social interaction since the development of agriculture – massing humans in heretofore unheard-of densities. Social constructs had to evolve. If the suburban shift of the 1950s in this country was a decrease in population density, the current sharp reduction in personal ‘space’ may simply be a reversal. Ask an Indian living in Dharavi (Asia’s largest slum, with a population density of nearly a million per square mile) what their definition of “overshare” is. Or, for that matter, a factory worker in Shanghai.

I’ll continue to watch the situation play out – as though I have a choice! I’ll still be measured in my own online ‘sharing’ (by my own standards, at least), and I’ll still advise my younger siblings to be circumspect (and they’ll continue to ignore me). I’ll reserve judgment on the final social product, but like most pendulum swings, I’ll rest (reasonably) assured that we’ll probably end up somewhere in the middle.

Feb 09

Usability Testing: an Introduction

The past few years have seen a meteoric rise in the reputation of usability testing and a concurrent interest in it on the client side. Despite its popularity, however, there remain pervasive misconceptions about what it is, what it’s used for, who should do it, and how it should be done. While no single article could be entirely comprehensive, this one attempts to explain the term, set some expectations around it, and provide a few tips and tricks in how to go about it.

What Is It, and What Is It For?

It may be easier to start with what isn’t usability testing: it’s not a focus group (but it can be an oblique way to learn what your users want), it’s not a marketing tool (but its findings can be parlayed into marketing insights), and it’s not QA (though it can certainly clue you into poor or missing functionality). With all those caveats, it’s obvious why someone might be confused as to the true purpose of usability testing.

Instead, usability testing should be considered a tool to evaluate fundamental design decisions and distinct, discrete packages of functionality. Where a focus group or marketing research can vet a concept (which itself may be tied to a specific tool), usability testing can inform the structure and delivery of that concept. And while the potential for hands-on, qualitative user input on the larger concept may be tempting (and the user may be more than willing to provide it), I’d argue that folding it into the testing dilutes the results and calls the legitimacy of the findings into question.

The basic unit of a usability test is a user flow. At its most elemental, a user flow can be a single step, but it is more often some combination of steps. For example: paying your phone bill online. While there may be more than one means of initiating the flow and a number of options for completing it, the usability test would likely treat each as a separate flow if it included significantly distinct interactions. A “script” strings together a series of these flows and defines what specific questions need to be asked and what steps they are concerned with.
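To make the flow/script relationship concrete, here is one way a test script might be structured as data. This is a hypothetical sketch – the class names (`Step`, `UserFlow`, `Script`) and the example tasks are my own, not drawn from any real testing tool:

```python
# Hypothetical sketch: a usability test "script" as a series of user flows,
# each flow being a combination of steps with optional follow-up questions.
from dataclasses import dataclass, field

@dataclass
class Step:
    action: str               # what the participant is asked to do
    question: str = ""        # optional follow-up question for the moderator

@dataclass
class UserFlow:
    name: str
    steps: list = field(default_factory=list)

@dataclass
class Script:
    flows: list = field(default_factory=list)

    def task_count(self) -> int:
        return sum(len(flow.steps) for flow in self.flows)

# Two distinct ways of paying a bill are treated as two separate flows,
# since the interactions differ significantly.
script = Script(flows=[
    UserFlow("Pay bill from dashboard", [
        Step("Find the payment control on the dashboard"),
        Step("Complete the payment", "Was the confirmation what you expected?"),
    ]),
    UserFlow("Pay bill from email reminder", [
        Step("Open the reminder email and follow its link"),
        Step("Complete the payment", "How did this compare to the other path?"),
    ]),
])

print(script.task_count())  # 4
```

The structure makes the earlier point mechanical: the same end goal (a paid bill) appears twice because the interactions differ, and each flow carries its own moderator questions rather than one generic questionnaire at the end.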

Testing is preceded by the creation of a screener, a document used to screen potential testing participants. Screeners can be used to target specific audiences or maintain a broad testing base, and can function as a brief introduction to the participants when necessary.

The final product of testing is typically some kind of brief, broken into “topline” or general findings, followed by more detailed findings which specify the method used, any statistics that emerged, the areas studied, and the participants’ backgrounds.

Who Should Do It?

Ideally, usability testing should be handled by a detached third party. I have personally witnessed clients wander wittingly and unwittingly into dangerous territory through uninformed choices in their test administrators. A good rule of thumb is that no one administering or moderating the test should have any incentive in the outcome. But does that preclude a digital agency from doing its own testing?

For obvious reasons, a digital agency has a vested interest in how its product tests with users. And as design professionals, we ought – ethically – to be capable of taking our lumps alongside our credits. As usability professionals, many of us have invaluable experience in devising, moderating, compiling, and communicating the results of such testing. The balance, I think, is exposed and illustrated in the structure of the testing itself.

How Should It Be Done?

To be truly useful, usability testing has to be relentlessly specific. Too often, lazy moderation techniques cloud results with meaningless metrics. It’s important to push users to explain their reactions, difficulties, and expectations, but special care has to be given to the framing of questions and the moderation of answers. A (sadly) classic example: “Rate how easy the following tasks are on a scale from one to ten.” What is one and what is ten? Easy or hard compared with what? Another example: “I want you to perform the following task by doing… [explains task]. Was that what you expected?”

Without over-dramatizing, it is safe to say that quality usability testing is something of an art. The first example above foists an unstructured and unspecific task onto the user. The second sets their expectations before asking what they are – essentially testing the user’s memory, or making any honest answer feel like personal criticism of the moderator. In one case, the moderator has given the user too much leeway and is almost guaranteed a result that can’t be applied; in the other, the user has been given almost no leeway whatsoever to honestly answer the question, and the moderator has come perilously close to leading the user toward a specific result. Not just the type of question, but the tone, the wording, even the delivery can affect the outcome.

General Rules

Plan ahead, and don’t be afraid to shift midstream. From the screener to the user selection to the script, know your test front to back. The more you know about each user, the clearer your understanding of them will be. The better you know your script, the less you’ll have to rely on it and the more you can pay attention to your user. And the better you know what informed your script, the more capable you’ll be in adjusting it midstream if necessary. Being able to shift tactics on the fly can mean the difference between actionable results and wasted time.

Be specific: what you test, who you test it on, and how you test it. Qualitative data can be just as useful as quantitative, but keeping it contextualized and structured enough to be meaningful can be tricky. Keep questions and test paths as simple and straightforward as possible. Set expectations up front without creating a stifling or overly-controlled atmosphere. Watch your user closely and provide as many opportunities as possible for their feedback. Be aware that different users may have wildly varying abilities to respond to your questions. If they can’t answer your question, pause and ask it from a different direction or with a different emphasis. If more than two people don’t provide useful feedback on a given task, redesign it as best you can and try again.

Be your own devil’s advocate. Personal investment in a product or design is not only human, it’s the mark of a great designer. That said, a great product is never developed in a vacuum, and is only as great as it is usable. From a philosophical standpoint, usability testing offers us the chance to self-edit and correct our own mistakes; in the end, the product will probably only be as good as the process that created it. One trick is to pretend that someone else designed the product, and to assume that none of it works—ask the questions you’d need to answer to make it work.

Never work alone. A user in testing may never give you the verbal answers you need, but may speak volumes with their body language and gestures. Moderating requires as close attention to yourself as it does to the user—a task difficult enough without taking notes. Whenever possible, use multiple feedback devices: microphones, video cameras, testing software (such as Morae)—even eye-tracking is possible. And last but not least, double or triple-up with colleagues. They can keep you honest, keep you on track, take notes, remind you of questions or paths you may have missed, and suggest deviations from the script when necessary.

Summing It Up

Usability testing can be an invaluable tool in crafting a good user experience, but it must be performed with care and close attention. In practice, a little poorly-moderated usability testing poses a far greater risk to a project or product than a lot of well-moderated research. The less specific the data that emerges, the more likely it is to be misused or misinterpreted. Wherever possible, keep your testing objectives simple and focused; if it’s market research you need, be explicit about it and create a separate work track for it. Take the time to plan ahead, document as much as possible, always have a second opinion, and never go into testing with your mind made up.