Objectifying Objectivity

“Science is a social phenomenon…It progresses by hunch, vision, and intuition. Much of its change through time is not a closer approach to absolute truth, but the alteration of cultural contexts that influence it. Facts are not pure information; culture also influences what we see and how we see it. Theories are not inexorable deductions from facts; most rely on imagination, which is cultural.” Gould, 1981

Business people often like to think of themselves as scientists of sorts – their science is practical and applied, but first and foremost it is grounded in objectivity and hypothesis testing, the hallmarks of scientific reasoning. Scientists seek concepts and principles, not subjective perspectives. They seek laws, truths and testable, verifiable data. And we as a society, be it the business person or the designer, simply accept objectivity as a fact of life. Thus, we cling to a myth of objectivity: that direct, objective knowledge of the world is obtainable, that our preconceived notions or expectations do not bias this knowledge, and that this knowledge rests on an objective weighing of all relevant data on the balance of critical scientific evaluation. And here is where I will no doubt irritate some and flat out piss off others – objectivity is a myth. So from the outset, let’s be clear. I am not implying that objectivity is a fallacy in and of itself. That would be absolutist. Rather, like all myths, objectivity is an ideal for which we strive. The search for objectivity is an intrinsically worthwhile quest, but it should not get in the way of an insight, which frequently happens. Too often, if an insight can’t be quantified, it is dismissed as worthless. And that is a terrible, terrible thing.

In most business situations the fact of the matter is that we choose which events, numbers, etc. we want to place value on and which we want to dismiss. This is occasionally conscious, but more often it is the product of our worldview, what we hope to personally gain from the data we employ (e.g. a promotion), or simply how tired we are when we sit in on our 300th interview at the end of a long day. Our beliefs and expectations exert a profound control on perceptions. In other words, we see what we expect to see, and we remember what we want to remember. If we believe that moms are the primary decision makers when it comes to buying groceries, we overlook the roles of other family members in the process, roles that may in fact be more important. So, while people misrepresent themselves in most traditional research (itself another topic of discussion for a later date), we in fact twist reality one turn further. Out of all the occurrences going on in the environment, we select those that have some significance for us from our own egocentric position.

What all this means is that the first problem with obtaining objectivity is that perception strengthens opinions, and perception is biased in favor of expectations. The second is that our involvement by definition alters the situation. In 1927, Werner Heisenberg, in examining the implications of quantum mechanics, developed the principle of indeterminacy, more commonly known as “the Heisenberg uncertainty principle.” He showed that indeterminacy is unavoidable, because the process of observation invariably changes the observed object. Whether we run a focus group or ask someone to fill out 20 questions in a survey, we are altering “normal” behavior and therefore how an idea, a product or a brand would play out in real life. What this means is that probability has replaced determinism, and that scientific certainty is an illusion.
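For reference, the principle is usually stated as the inequality below – a textbook formulation, not anything specific to the research context discussed here. The product of the uncertainties in a particle’s position and momentum has a hard lower bound, so sharpening one necessarily blurs the other; the analogy to research is loose but instructive, in that the act of measuring sets a floor on how undisturbed the measured thing can be.

$$\sigma_x \, \sigma_p \;\geq\; \frac{\hbar}{2}$$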

So what are we to do? How can we reconcile the profound success of the scientific method with the conclusion that perception and process make objectivity an unobtainable ideal? Well, we accept a few things and move on. Science depends less on complete objectivity than most of us imagine. Business even less so, especially as it pertains to things like advertising and branding. Admitting that allows us to use a biased balance to weigh and evaluate data, experiences and good old-fashioned gut reactions. If we’re aware of the limitations by which we assess and measure our area of study, be it cereal shopping habits or car purchase decisions, we can use those biases effectively. To improve the accuracy of a balance, we must know its sources of error.

Pitfalls of subjectivity abound. Some can be avoided entirely; some can only be reduced. The trick is to know when and how they arise so you can still get at a real insight. Some of the more common pitfalls are:

  • Ignoring relevant variables: We tend to ignore those variables that we consider irrelevant, even if others have suggested that these variables are significant. We ignore variables if we know of no way to remove them, because considering them forces us to admit that the experiment has ambiguities. If two variables may be responsible for an effect, we concentrate on the dominant one and ignore the other. The point is, we cherry-pick, and doing so leads to flawed conclusions.
  • Confirmation bias: During the time spent doing our initial research (that stuff we used to call a Lit Review), we may preferentially seek and find evidence that confirms our beliefs or preferred hypothesis. Thus, we select the experiment most likely to support our beliefs. This insidiously frequent pitfall allows us to maintain the illusion of objectivity (for us as well as for others) by carrying out a rigorous experiment, while nevertheless obtaining a result that is comfortably consistent with expectations and desires.
  • Biased sampling: Subjective sampling that unconsciously favors the desired outcome is easily avoided by randomization (see the sketch after this list). Too often, we fail to consider the relevance of this problem during research design, leading to suspect insights.
  • Missing important background characteristics: Research can be affected by a bias of human senses, which are more sensitive to detecting change than to noticing constant detail. In the midst of collecting data, however you choose to think of it, it is easy to miss subtle changes in context. That, unfortunately, often leads to overlooking interrelationships between people, events, etc. In other words, it means you overlook important information because you can’t tear yourself away from what you perceive to be important.
  • Confirmation bias in data interpretation: Data interpretation is subjective, and it can be dominated by prior belief. We should separate the interpretation of new data from the comparison of these data to prior results.
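On the biased-sampling point above, randomization is cheap to build in. Here is a minimal sketch in Python, with an invented participant pool; the point is simply that letting a seeded random draw pick whom you talk to, rather than hand-picking “good” respondents, removes one opportunity for the desired outcome to creep in.

```python
import random

# Hypothetical recruiting pool; in practice this would be the full
# screener list, not a hand-picked subset of "articulate" respondents.
pool = [f"respondent_{i:03d}" for i in range(200)]

# A fixed seed makes the draw reproducible and auditable -- anyone
# can verify the sample wasn't quietly curated after the fact.
rng = random.Random(42)
sample = rng.sample(pool, k=12)

print(sample)
```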

Ultimately, there is nothing wrong with embracing our subjective side, our interpretative side, our artistic side. This doesn’t necessarily mean rejecting the search for objectivity (although sometimes that is in fact the best course of action), but it does mean that when a client starts freaking out about our research results and, more importantly, our insights, we should be prepared to address it head on rather than trying to defend ourselves as “objective observers”. After all, I’ll be the first to say that I love mythology. That said, I don’t believe life sprang from the body of Ymir (look it up), but I do believe we can learn quite a bit from the story about our humanity. Similarly, if we embrace the realities of a subjective, or at least causal, world, we produce better thinking, better insights and better results.


Divorce

Package it, slap a label on it and sell it for $4.99 a pound. It’s as simple as that when you’re selling groceries, right? Hardly. Food, meat in particular, is tied to cultural sensibilities about production, cleanliness, family values and a host of other topics.
Meat, like Norman Rockwell images of the American farm, is myth. We’ve been conditioned to turn away from the origins of our food and respond to blood and death with repulsion. Or have we?
With wealth comes the desire to learn about where our food comes from, how it’s produced and what exactly is in it. The point is that shopping for food is an increasingly complex process that has less to do with securing calories than it does with symbols and meaning.

Context and the Changing Mobile Landscape

Marketers increasingly think about consumers in complex ways. It is understood that in a changing digital landscape, the context in which consumers learn and shop influences what messages we deliver and how we deliver them. But we rarely define “context”. It is one thing to design a usable app that conforms to human factors and cognitive requirements; it is quite another to design for a stage on which innumerable semi-autonomous devices mediate a swirl of information.

Physical Context

Physical context refers to the notion of infusing devices with a sense of “place.” In other words, devices can distinguish the environments in which they “live” and react to them. But this is difficult. Mapping out longitude and latitude is one thing, but reacting to features (political, natural, social, etc.) is much more problematic. Getting beyond the boundaries of identifiable borders and structures means coming to grips with “place”.
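To make that gap concrete, here is a minimal sketch in Python; the place names, coordinates, radii and tags are all invented for illustration. The coordinate math is the easy half. Everything the essay means by “place” lives in the hand-curated tags, which is exactly the part that does not scale.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical gazetteer: the coordinates are easy, the semantic tags are not.
PLACES = [
    {"name": "Westfield Mall", "lat": 41.8827, "lon": -87.6233,
     "radius_m": 250, "tags": ["retail", "social", "indoor"]},
    {"name": "Riverside Park", "lat": 41.8900, "lon": -87.6300,
     "radius_m": 400, "tags": ["natural", "recreation", "outdoor"]},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

def sense_of_place(lat, lon):
    """Return the named places (with their tags) containing this coordinate."""
    return [p for p in PLACES
            if haversine_m(lat, lon, p["lat"], p["lon"]) <= p["radius_m"]]

print([p["name"] for p in sense_of_place(41.8830, -87.6240)])
```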

Think of a mall. There are hundreds of stores, each with hundreds of devices. The device now has to decode what information is relevant and how it will deliver that information. Which competing retailer apps get precedence over others? When you receive an offer, will the device “tell” other retailers in order to generate real-time counter offers? The digital landscape is continuous at all points throughout the day, and getting design right means understanding the systems in which people operate.

Device Context

Just as various kinds of sensory apparatus (GPS receivers, proximity sensors, etc.) are the means by which mobile devices will become geographically aware, another class of sensors makes it possible for devices to become aware of each other. This presents a series of problems different from those of physical context.

Technology is on the verge of existing in a world with zero-infrastructure networks that can spring up anywhere, anytime. Devices will exist in a constant state of discovery. Returning to the mall, imagine that you are with a friend whose device is communicating with yours. In the mall are a couple of thousand devices, all of which are discovering each other. What happens now? Assuming we’ve dealt with the problem of one friend’s device communicating with the other’s while blocking out the remaining 2,000 devices, you still have several thousand potential “identities” that may have useful information. How is it decided what to manage without devoting significant time to setting up hundreds of variables by hand?
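A minimal sketch of that triage problem, again in Python with invented device identities: a small explicit trust set, a default policy for everything else, and no per-device configuration. Real discovery protocols are far messier; this only illustrates the shape of the decision.

```python
from dataclasses import dataclass

@dataclass
class Identity:
    device_id: str
    kind: str          # "friend", "retailer", "unknown"
    offers_info: bool  # does it advertise something potentially useful?

TRUSTED = {"friend-ana-phone"}  # explicitly paired devices

def triage(discovered):
    """Sort a discovery sweep into connect / consider / ignore buckets."""
    buckets = {"connect": [], "consider": [], "ignore": []}
    for ident in discovered:
        if ident.device_id in TRUSTED:
            buckets["connect"].append(ident)   # the friend's device
        elif ident.kind == "retailer" and ident.offers_info:
            buckets["consider"].append(ident)  # maybe useful; defer to policy
        else:
            buckets["ignore"].append(ident)    # the other ~2,000 devices
    return buckets

sweep = [Identity("friend-ana-phone", "friend", False),
         Identity("store-117-beacon", "retailer", True),
         Identity("stranger-4411", "unknown", False)]
print({k: len(v) for k, v in triage(sweep).items()})
```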

Information Context

This is the realm of information architecture. Data no longer resides “in” our computers. Devices are extensions of the cloud and exist as something akin to perceptual prostheses. They exist to manipulate data in the same way a joystick allows us to handle the arms of a robot in a factory. This reflects a shift in how we use information, because all information is transitory.

Storage issues are essentially removed from the equation. Content can leap from place to place and device to device in an instant. Content will be customizable and will reflect the human-application interaction rather than shaping it. Devices will find themselves in the fourth kind of context, social interaction, with all its contingencies. Just as behavior is shaped by the moment, so too will the apps and information need to adapt.

Socio-Cultural Context

Each person is a unique blend of contrasting cultures, tongues, traditions and worldviews. A cultural context may exist on levels as diverse as a workplace, a family, a building, a county, a continent, a hemisphere. Cultural context provides a framework for what “works” for each consumer in the world.

It is at this point that a better perspective is gained on what will and will not be accepted in the mobile universe. Take a beer pouring app that mimics the pouring of a beer when the device is tilted. It serves no direct function, and yet it has been successful because of the cultural needs to which it speaks – workplace breaks, male-to-male bonding, etc. But move it to another setting, say Saudi Arabia, and the context shifts. Success lies in understanding the reasons behind the consumer’s beliefs and actions in these symbolic exchanges, and in the ability to code and decode those exchanges. Marketing mishaps come from a lack of comprehension.

So What?

Our great technological leaps forward have also produced more complexity, leading to a greater need to make sense of insights. Without a means to categorize context, marketers will miss identifying trends that matter most. What to do?

  • Rethink the problem. Frequently, “the problem” is a facet of something else. For example, when researching an eBook, the problem to be solved isn’t technology; it is understanding why people read different material in different contexts. It may be about displaying books as a means of gaining status. The point is that the problem as presented may not be the problem at all.
  • Define the contexts. Defining the contexts helps articulate the range of possibilities for observation. For example, if the consumer behavior is drinking beer, all contexts in which beer is purchased and consumed need to be articulated.
  • Think through the sample. Whom is the marketing targeting? What are the social circles that will shape the event? It isn’t enough to define a demographic sample; you need to think in terms of cultural systems.
  • Make a plan that involves experiential information gathering, not just statistics. Develop a guide to navigate the data collection and a method for managing the data (everything is data). Don’t just think about the questions to ask, but also include opportunities for observation and participation.
  • Head into the field. This is the heart of the process. Meaningful insights and moments of “truth” are slow to get at. Low-hanging fruit will be easy to spot, but the goal should be to find those deeper meanings. Because everything is data, from attitudes to artifacts, it is important to capture as much as possible.
  • Do the analysis. Analysis is the most difficult, but also the most rewarding. The goal is to bring a deep understanding of cultural behavior to the analysis process. This goes beyond casual observation and gets to the underlying structures of why people do what they do.

The process is more time consuming than traditional approaches, but it ultimately yields greater insight and reduces time and costs on the back end. The end result is that you create greater value for the client and for the company.

Anthropology and Usability: Getting Dirty

There are significant methodological and philosophical differences between ethnographic processes and laboratory-based processes in the product development cycle. All too frequently, proponents of these data collection methods are set at odds, with members on both sides pointing fingers and declaring the shortcomings of the methods in question. Methodological purity, ownership and expertise are debated, with both ends of the spectrum becoming so engrossed in justifying themselves that the fundamental issue of product development is compromised: namely, will the product work in the broadest sense of the term. One side throws out accusations of a lack of measures and scientific rigor. The other side levels accusations about the irrelevance of a sterile, contextually detached laboratory environment. At the end of the day, both sides make valid points, and the truth, such as it is, lies somewhere between the two extremes. As such, we suggest that rather than treating usability and exploratory work as separate projects, a mixed approach be used.

So why bridge methodological boundaries? Too frequently, final interface design and product planning begin only after testing in a laboratory setting has yielded reliable, measurable data. The results often prove or disprove the functionality of a product and catalog any errors that take place during task execution. Error and success rates are tabulated and tweaks are made to the system in the hopes of increasing performance and/or rooting out major problems that might delay product or site release and undermine user satisfaction. The problem is that while copious amounts of data are produced and legitimate design changes ensue, they do not necessarily yield data that are valid in a real-life context. The data are reliable in a controlled situation, but may not be valid when seen in context. It is perfectly possible to obtain perfect reliability with no validity when testing. But perfect validity would assure perfect reliability, because every test observation would yield the complete and exact truth. Unfortunately, neither perfection nor quantifiable truth exists in the real world, at least as it relates to human performance. Reliable data must be supported with valid data, which is best found through field research.

Increasingly, people have turned to field observations as an effective way of checking validity. Often, an anthropologist or someone using the moniker of “ethnographer” enters the field and spends enough time with potential users to understand how environment and culture shape what they do. Ideally, these observations lead to product innovation and improved design. At this point, unfortunately, the field expert is dropped from the equation and the product or website moves forward with little cross-functional interaction. The experts in UI take over, and the “scientists” take charge of ensuring the product meets measures that are often somewhat arbitrary. The “scientists” and the “humanists” do not work hand in hand to ensure the product works as it should in the hands of users going about their daily lives.

Often the divide stems from the argument that the lack of a controlled environment destroys the “scientific value” of research (a similar argument is made over the often small sample sizes), but by its very nature qualitative research always has a degree of subjectivity. And to be fair, small performance changes are sometimes given statistical relevance they should not have. In fact, any and all research involves degrees of subjectivity and personal bias. We’re not usually taught this epistemological reality by our professors when we learn our respective trades, but it is true nonetheless. Indeed, the history of science offers countless examples of hypothesis testing and discovery that would, if we applied the rules of scientific method used by most people, be considered less than scientifically ideal. James Lind’s discovery of the cure for scurvy and Henri Becquerel’s discovery of radioactivity serve as two such examples. Bad science from the standpoint of sample size and environmental control; brilliant science if you’re one of the millions of people to have benefited from these discoveries. The underlying problem is the assumption that testing can exist in a pure state and that testing should be pristine. Unfortunately, if we miss the context, we usually overlook the real problem. A product may conform to every aspect of anthropometrics, ergonomics, and established principles of interface design. It may meet every requirement and have every feature potential consumers asked for or commented on during the various testing phases. You may get an improvement of a second in reaction time in a lab, but what if someone using the interface is chest deep in mud while bullets fly overhead? Suddenly something that was well designed in a lab becomes useless because no one accounted for shaking hands, the decrease in computational skills under physical and psychological stress, or the fact that someone is lying on their belly as they work with the interface. Context, and how it impacts performance with a web application, software application, or any kind of UI, now becomes of supreme importance, and knowing the right question to ask and the right action to measure become central to usability.

So what do we do? We combine elements of ethnography and means-based testing, of course, documenting performance and the independent variables as part of the evaluation process. This means detaching ourselves from a fixation with controlled environments and the subconscious (sometimes conscious) belief that our job is to yield the same sorts of material that would be used in designing, say, the structural integrity of the Space Shuttle. The reality is that most of what we design is more dependent on context and environment than on increasing performance speed by 1%. Consequently, for field usability to work, the first step is being honest about what we can do. A willingness to adapt to new or unfamiliar methodologies is one of the principal requirements for testing in the field, and one of the primary considerations when determining whether a team member should be directly involved.

The process begins with identifying the various contexts in which a product or UI will be put to use. This may involve taking the product into users’ homes and having them use it with all the external stresses going on around them. It may mean performing tasks as bullets fly overhead and sleep deprivation sets in. The point is to define the settings where use will take place, catalog stresses and distractions, then learn how these stresses impact performance, cognition, memory, etc. For example, if you’re testing an electronic reading device, such as the Kindle, it would make sense to test it on the subway or when people are lying in bed (and thus at an odd angle), because those are the situations in which most people read – external variables are included in the final analysis and recommendations. Does the position in bed influence necessary lumens or button size? Do people physically shrink in on themselves when using public transportation, and how does this impact use? The idea is simply to test the product under the lived conditions in which it will find use. Years ago I did testing on an interface to be used in combat. It worked well in the lab, but under combat conditions the interface was essentially useless. Seemingly minor issues dramatically changed the look, feel, and logic of the site. Is it possible to document every variable and context in which a product or application will see use? No. However, the bulk of these situations will be uncovered. And those which remain unaddressed frequently produce the same physiological and cognitive responses as the ones that were uncovered. Of course, we do not suggest foregoing measurement of success and failure, time on task, click path or anything else. These are still fundamental to usability. We are simply advocating understanding how the situation shapes usability and designing with those variables in mind.

Once the initial test is done, we usually leave the product with the participant for about two weeks, then come back and run a different series of tests.  This allows the testing team to measure learnability as well as providing test participants time to catalog their experience with the product or application.  During this time, participants are asked to document everything they can about not only their interaction with the product, but also what is going on in the environment.  Once the research team returns, participants walk us through behavioral changes that have been the result of the product or interface.  There are times when a client gets everything right in terms of usability, but the user still rejects the product because it is too disruptive to their normal activities (or simply isn’t relevant to their condition).  In that case, you have to rethink what the product does and why.

Finally, there is the issue of delivery of the data.  Nine times out of ten the reader is looking for information that is quite literal and instructional.  Ambiguity and/or involved anecdotal descriptions are usually rejected in favor of what is more concrete. The struggle is how to provide this experience-near information.  It means doing more than providing numbers.  Information should be broken down into a structure such that each “theme” is easily identifiable within the first sentence.  More often than not, specific recommendations are preferred to implications and must be presented to the audience in concrete, usable ways.  Contextual data and its impact on use need the same approach.

A product or UI design’s usability is only relevant when taken outside the lab.  Rather than separating exploratory and testing processes into two activities that have minimal influence on each other, a mixed field method should be used in most testing.  In the final analysis, innovation and great design do not stem from one methodological process, but a combination of the two.

Semiotics and Brand Development

A brand is more than one iconic symbol; it’s a system of interconnected images, actions and signs that create a response in your consumers. While it is often reduced to something as simple as logo design (which is anything but simple, in fact), identity and branding work extends beyond the creation of a company logo or trademark. The identity of any particular corporation, product or service encompasses a variety of materials, including business cards, marketing materials, staff uniforms, advertisements, commercials, web presence, etc. All of this is created to establish an identity that the consumer comes to value beyond the direct benefits of the company.

As part of establishing the company brand, identity work is important in conveying the principles, ideas and standards of the organization for which it is developed. Designers work together with strategists, copywriters, marketing directors and a host of other professionals to ensure that a brand identity is communicated effectively and efficiently from the client to the consumer. And in an age of social media and assumed shared interests, that communication is increasingly a multi-faceted conversation.

Most design firms and agencies create branding and identity work for their clients on some level; others specialize in identity and branding exclusively. In any case, brand development involves deep thinking and a commitment to understanding the symbolic interconnectedness of the parties engaged with the brand. This is the art and science of semiotics. But why bother? There are a number of simple reasons.

Understanding

Semiotics can help you dig into the underlying meanings in communication and establish a richer connection with consumers. On a practical level, a semiotic approach allows you to determine why an ad, a web page or a new product’s design is or isn’t working. It allows you to isolate components, but it also allows you to determine how they work or don’t work in relation to other elements.

Renovation

Over time symbols change, and without constant care brands fall apart. A brand can keep making small changes, but ultimately this process stops working. Eventually you have to strip right back to the bare bones and rebuild the brand completely. Semiotics can be used to deconstruct brands and categories, exposing truths that can be used to reconstruct them and make them stronger.

Articulation

Semiotics can help articulate the problem you actually have, as opposed to the symptom you are trying to address. The approach allows you to move beyond intuition and get to the deeper issues behind what is happening with your brand.

Research

A semiotic approach can help you improve your qualitative work by helping you redevelop your line of questioning or listen for different things. Rather than focusing on traditional needs-based questioning and observation, a semiotic approach uncovers deeper issues and subconscious triggers that strengthen the meaning behind the brand. There is a strong tradition in ethnographic research specifically of employing a semiotic approach. Both methods are observational and interpretive. Ethnographic research aims to understand what consumers do and why they do it, rather than what they say. In other words, it assumes that human behavior is more complex than what people tell you. Similarly, semiotics assumes that how human beings interact with and understand the world is more than what they tell you.

Briefs

Ultimately, semiotics creates richer, deeper briefs and platforms that creative teams can actually work from. Rather than simply providing data, it provides avenues of expression that the creative team can build upon and use to explore a range of opportunities for communication. It can provide platforms from which to strengthen your communication, be that advertising or design.

As Halloween Approaches (Even in September)

Halloween is more than two months away, but already I’ve seen products and displays going up in a few places. For better or worse, the holidays creep further and further out from their actual dates as retailers see opportunities to sell their goods. And to add to the impending spookiness that awaits us, I spent part of my Friday night watching a scary movie with my children, fully aware that it would necessitate cramming four people into a single bed somewhere around midnight – I was, of course, proven right. All of this has me reflecting on the socio-cultural significance of Halloween as a reflection of cultural transformation, even if it is a single night. Yes, even the simplest things start the mind wandering.

A few years back, an associate professor of human development and family studies at Penn State’s Delaware County Campus noted that parents need to realize that scaring our kids isn’t necessarily a way to mitigate their fears of death and other frightening things. Rightfully, she contended that Halloween is a time when we expose kids to behavior that is not the norm and that children connect the holiday with death. The argument goes that we, regardless of who “we” are, typically distance ourselves from death and shield children from it, but in this case young children encounter their fears when they face decorations of skeletons and tombstones. This can be scarring. This, of course, is bad. Or is it? Is it even accurate?

First, we expose our children to death regularly.  What we shield our kids from is pollution associated with decay.  In the case of Halloween, we are presenting our children with a sanitized, safe form of death that has none of the associations with contamination.  Second, children are exposed to death when they play video games, tune in to the TV or deal with the loss of a grandparent.  We may try to lessen the pain or deflect the underlying causality, but death itself is indeed part of a child’s upbringing, though it may not be as overt as it is at Halloween.  I will concede that we expose our children to death less than we perhaps did in the past, when people worked the farm together and were accustomed to things like slaughter, but to assume children are shielded from death is fantasy. We’ve simply changed the medium.

And should we even be shielding kids in the first place?  We often work under the assumption that it is somehow our duty as parents to protect children from any and all discomfort, but there is nothing out there to prove that doing so benefits the child. Fear teaches, particularly when it is safe.  Discomfort teaches, particularly when it isn’t overwhelming.  Children are, I would contend, smarter than we often think.  To assume they can’t make the leap between the literal and the symbolic is a bit obtuse.  While Halloween teaches children about death, it also teaches them about the nature of symbolism, rules of reciprocity, a sense of self-reliance, creativity and a host of other positive elements of personhood.

As my oldest daughter walked from house to house last Halloween with her friend from Egypt, getting treats from homes made up of people from a wide range of nations (our neighborhood happens to have large South Asian and Middle Eastern populations), it struck me how important this holiday is, because it is so public and because it is wrapped up in a universal need to deflect the fear of death. It is a holiday that encourages parents and kids of other cultures to join in the fun and feel like they are welcome and integral parts of the adopted culture. It exposes the children and parents of the adoptive culture to people and worldviews they may not have otherwise interacted with. The experience can be thought of as enculturation, the process by which a person learns the requirements of the surrounding culture and acquires the values and behaviors that are appropriate or necessary in it. This has often been conceived of as a unidimensional, zero-sum cultural conflict in which the minority’s culture is diminished by the dominant group’s culture, but it’s not that simple. There is an exchange of sorts going on. There are a couple of ways a person learns a culture. Direct teaching of a culture happens, mostly through the parents, when a person is told to do something because it is right or told not to do something because it is bad. For example, when children ask for something, they are constantly asked “What do you say?” and the child is expected to remember to say “please.” A second, conscious way a person learns a culture is to watch others around them and emulate their behavior. But in doing so, they often alter elements of it and reshape the culture – culture isn’t fixed, after all; it is a matter of practice, negotiation and shared invention.

What this means is that Halloween becomes a way of learning and exchanging.  Day of the Dead decorations find new uses, costumes come to reflect the sensibilities of the minority population and new ways of defining and interacting with the world emerge.  And there are very real, very meaningful results.  Businesses alter their merchandise, retailers decorate differently and new modes of shopping arise.  People develop new interests and curiosity about their world.  So, yes, Halloween may indeed scare the children, but the benefits of being scared outweigh a night of belly aches and spooky dreams.

Getting Over Ourselves: Make research meaningful

The other day I was privy to a discussion by a researcher who was decidedly upset about having to “dumb down” the research report he had completed. The client was impressed by the depth of the work, but equally frustrated with the seemingly academic depth of the language of the report and the use of jargon that was, realistically, more appropriate to anthropological circles than to a business environment. The researcher was upset by the client’s request to strip out discussions of agency, systems design theory, identity formation, etc., and stated something along the lines of “I had to learn this sort of thing in grad school, so they should take the time to do the same”. And while I think it would be lovely (and perhaps beneficial) if clients took such an interest in what we as researchers study, I have to say my views on the matter are very different. Making what we learn useful and meaningful to the client isn’t “dumbing it down”, it’s performing the task for which we were hired. We do not receive grants and write peer-reviewed articles when businesses hire us. Indeed, we may not write at all. What we do is produce insights and information that they can use, from their design team to their CEO. If they aren’t asking us to become expert in supply chain models or accounting, then asking them to embrace often daunting concepts in socio-cultural theory is both unrealistic and, frankly, arrogant.

In general, companies hire ethnographers (anthropologists, sociologists, etc.) for a simple reason: to uncover new ways to achieve competitive advantage and make more money. This translates, most often, into research to understand new product opportunities, brand positioning, or salient marketing messages. Unfortunately, our clients often have no idea what to do with the research. But more often than not, the fault lies with ethnographers, not the client, and can be overcome if we apply ourselves just a bit.

Usefulness means being a guide, not a lecturer. So why are we so often disinclined to make what we do useful to business people? Part of it, I believe, stems from an unwillingness to address our own biases openly and honestly. There is a tendency among many of us coming out of what have traditionally been academic disciplines to ridicule or react negatively to people in the business world. To be honest, it’s why we chose, say, an anthropology program over a business program in college. We often, consciously or subconsciously, hold these people in contempt and believe that it is they who should bend, not us, as if we are purveyors of secret knowledge and indeed a higher order of life than they. We resent the idea that these lesser minds would have the audacity to ask us to curb our genius. And yet, there’s nothing new in making complex ideas useful, simple, or intelligible to people without advanced training in the social sciences. Look at any Anthro 101 course and you realize we’ve been doing this for a very long time already. The fact of the matter is that in order to be relevant, to get the client excited about what we do and to have them value the thinking behind our work, we have to remember that not everyone wants to be an expert in social science any more than they want to be physicians or painters – they want us to be the experts and to know what we’re doing, including crafting what we learn into something they can grasp and apply even as they try to balance their own workload. Balancing jargon with meaning is, or should be, the goal.

Another sticking point, I think, stems from how many of us were trained. Traditionally, the researcher is either left to work alone or as part of a very small team. The findings are analyzed, compiled and shared with a small group of like-minded individuals. (We would like to believe that the number of people who care about what we write is larger, but the truth is most of us don’t give the work of our colleagues the attention it deserves, or at least the attention they would like to believe it deserves.) Our careers are built on proving our intelligence, which means making an intellectual case that addresses every possible theoretical angle in great detail. But in the business context, to whom are we proving our intelligence? And do they care? They hire us precisely because we are the experts, not to prove how smart we are. This isn’t to say that we can or should forego the rigor good ethnographic research demands, but it is to say that, whether we like it or not, most of the theoretical models we use should end up in the appendix, not in what the client sees, hears or reads. Not only does theory overcomplicate our findings, it often comes across as either arrogant or needy, neither quality being something the client finds particularly enticing or reassuring.

The fact is that we do ourselves and the discipline a disservice by not learning the language and needs of business people. We complain that untrained people are slowly “taking over” ethnography, but it’s our own doing nine times out of ten. It isn’t enough to have a better grasp of the complexities of the human condition, we have to learn to translate our work and come to terms with the fact that the people hiring us have a very real, practical need for our findings. If it cannot be translated into something that can be grasped in the first two minutes, then in their way of seeing the world, it is money wasted.

Are we there to educate or inform? Our work is frequently deemed too academic. So what does it mean when a client says, “It’s too academic”? It means that they didn’t hire you to teach a class on anthropological theory and method. It means they don’t want to sit through a 100-page PowerPoint presentation before getting to the heart of the matter. They are in business and have neither the time nor the interests of a scholar or student. Again, this doesn’t mean you don’t do the work or fail to set up the points you are trying to make, but it does mean being cognizant of the fact that the audience hired you to improve their business and products, not to teach a course on anthropological methods. And indeed, some concepts are simply too complex to turn into a couple of bullet points. But that doesn’t mean we cannot try, particularly if we hope to get more work from the client.

The people with the luxury of sitting through a lengthy presentation or who have the time to discuss the intricacies of social theory rarely have a significant amount of authority in the decision-making process, and they rarely hold the purse strings.  This isn’t to say that those two hours of research findings we present aren’t meaningful, but rather that presentations need to be tailored to the needs of the people buying your service (research) and product (recommendations). For the business community, the product is not knowledge, but intelligence.  In other words, the product is knowledge that is actionable and useful. And to be fair, it’s worth noting that the client is the one who pays for our work. If the idea of providing them with the service and product they need is unpalatable, then I would argue that the ethnographer needs to quit complaining and start exploring a different line of work, plain and simple.

The researcher, research team, creative team, client, and everyone invested in the project need to work toward turning information into something they can act upon. When the time comes to sit down with the client and explain what you learned, the ethnographer must be prepared to also explain what to do with it next in a simple, clear way.


Translating culture and opening markets

Success translates well into narrative. Who hasn’t heard those wonderful stories of marketing campaigns gone astray when introduced into a global setting? Remember when Puffs started marketing their tissues in Germany and it didn’t go so well because “Puff” means “brothel” in German? Or when Bacardi launched a fruit drink named Pavian to evoke French chic, only to find that in German the same word means “baboon”?

We’ve all heard these mistakes and we all get a chuckle, but the business ramifications of not doing your cultural homework are tremendous. And this goes well beyond something as superficial as a mistranslation. We are prone to imposing our way of seeing the world on others, but what we in the developed world may see as universal may be significantly different in developing countries. Culture shapes how we use, interpret and shop for goods, and what US shoppers may see as simply, say, buying chicken for dinner may mean much more in another part of the world. In other words, retailers and manufacturers need to understand what matters and why it matters according to different cultural perceptions.

Returning to our example of purchasing chicken at the grocery in the US, take concepts of cleanliness and food safety. As a population that has had easy access to meat for longer than most of us can remember, our concerns revolve around the promotion of “health” as a means of reducing fat in the diet. Increasingly, we make decisions based on the sanitary conditions of the farms where chickens are raised and the ethical treatment of the animals.  We increasingly associate “healthy” with being “green” (another wonderfully loaded and vague word). That has led to a push for reduced packaging as proof of sustainability and healthy living.

Now, take China. In a place where access to meat was – until fairly recently – limited, chicken is associated with status and upward mobility. In the past, the source of the meat itself was often suspect because it may have been purchased in less than uniform locations. Consequently, what we would see as excessive packaging is understood differently – the factory setting implies progress, wealth and modernity, which in turn imply good “health.” Meat is something you want to show off to your friends and family because it is associated with status, which is associated with good health. Add to that the fact that people in much of the world (unlike the US) have traditionally seen the chicken as something other than a pure commodity. Indeed, there are many poems written about chickens (He Crows the Morning by Hsieh Ling-Yun or The Most Noble Fowl by Mohammad Ibn Sina). The result is that if you position chicken in the developing world as you might in the US – as a low-fat, easy-to-prepare alternative source of protein – it won’t correspond to the local worldview and your brand won’t gain traction. You will invest a lot of money and may get very little in return. And China is only one example; expand this to the BRIC nations or the Middle East.

Of course, this is only one example, but the idea cuts across all categories. Don’t believe it? Tropicana initially failed when it pushed orange juice in South America as a breakfast drink, which in South America it frequently is not – our beloved breakfast icon is something for the afternoon, a treat and a snack. Papa John’s, on the other hand, is doing wonderfully in Egypt by maintaining its “American” mystique while incorporating toppings and product names that reflect local tastes.

Understanding what it means to shop on a global, national and local level is central to developing successful new products, sales channels and marketing campaigns. That means going beyond the product or retail environment and asking bigger questions:

Question: How does shopping convey status and wealth?

Answer: Pabst Blue Ribbon is a premium brand in China and signifies wealth because it has been positioned as a classic American Lager rather than a hipster yard beer. In China, it conveys a sense of worldliness, refinement and cultivated taste.

Question: What cultural norms shape how people interact with your brand and your store?

Answer: Victoria’s Secret can’t be promoted in Riyadh or Bangalore the way it is in London. Attitudes outside the West about sexuality, exposure of the human body and gender roles are radically different, shaping everything from marketing content to store displays.

And this could go on and on. So what does it mean for marketing your brand in the developing world (in fact, what does it mean for marketing your brand in Alabama vs. LA)? It means that before you decide to launch or even reposition a brand or product around the world, you need to spend some time digging and learning why people live the way they do and how your brand can fit into that complex system of practices and beliefs. It isn’t enough to make sure the language is translated correctly or the color palette makes sense. You have to come to understand the population the way you understand your neighbor. That’s where you find new opportunities and that’s where you find growth, both in terms of brand equity and the bottom line.

Bricks, Clicks and the “New” Retail Paradigm

Since the emergence of internet shopping, companies have tended to structure their way of thinking about shopping channels in silos that reflect their operations. Shopping behavior is segmented according to the channel and the shopper is relegated to a specific trajectory. Shopping is usually thought of in terms of work – procuring goods, meeting needs, etc.  Shopping is seen first as a function and secondarily as something that serves emotional and social needs. Even as we talk about retail therapy, we revert in marketing to discussions about seemingly rational behavior.  But it isn’t so simple anymore. Unfortunately, with the ubiquity of internet access, be it from a fixed location or via a mobile device, the truth is those lines between the off-line and online experience have become so blurred as to be meaningless.  Rather than individual silos, shopping processes function as part of a complex, adaptive system that is increasingly driven by social interaction and socio-cultural needs, not transactional needs.

If a company is to grow its brand (and thereby its bottom line), it is wise to think about how this system emerges and understand how the act of shopping has fundamentally changed at a deep cultural level. What this means for shopper marketing is that the best retail experiences, those with the highest degrees of loyalty and sales, are those that project a story and invite the shopper into the narrative.

Bricks

Fifty years ago, the retail space was the only real way to interact with customers. Yes, there was the option of the catalog, but it was, and is, a one-way conversation. The retail space was primarily a transactional space, and advertising was simply a list, though cleverly done, of the goods available. As shopping has become more convenient and the transactional element has been driven into new realms, the retail spaces and brands that everyone admires have begun to touch shoppers on a more visceral level.

Shopping is about more than getting more stuff. Brick and mortar shopping as it is practiced today in particular straddles the line between a functional/transactional and a social/symbolic experience. Shopping is as much about entertainment, establishing cultural roles and teaching cultural norms (or rebelling against them) as it is about anything else. Often, the decision to enter one retail space over another is about experiential elements more than price or convenience. Because experience is rooted increasingly in dialog between members of social groups (e.g. moms, bicyclists, rockabilly fans, etc.), the retail experience actually begins well before we set foot in the store, in the conversations wherever people congregate.

Clicks

Digital shopping (online or with a mobile device) is a highly personal, portable and increasingly participatory experience. When it first began, the online shopping experience was largely fixed in one location, and the interactions, primarily transactional in nature, were almost exclusively between an individual and what a company chose to present to them. But this process was quickly modified as people began posting product reviews, blogging about their experiences, etc. Even so, the process of investigating a company was largely between an individual and either an institution or an abstract person in an unknown location. And then social media was born, changing the nature of the web and the shopping landscape forever. The highly individual, highly transactional nature of the online shopping experience became subject to the same social and cultural drivers as the brick and mortar experience.

Shopping has become as much about structuring peer groups as about the transaction. The shopping and the purchase itself represent the groups we interact with and our places/roles in them. Just as social media tools help us craft public identity, so do our purchase choices. With the increased use of mobile devices, online shopping, and hence social media interaction at the point of shopping, has moved from the individual sitting at his or her kitchen table to a very public dialog. Peer group members (no, Ginger, we didn’t say “demographic” or “segment”) interact with each other and the retail environment simultaneously, creating a shopping experience that can draw literally thousands of people into the conversation from the point of consideration to the point of purchase.

Blenders

Retailers can blend the physical and social experience of brick and mortar shopping with the participatory (read: social network) experience of digital shopping to achieve a greater percentage of brand loyalists (a figure that currently and historically sits at around 5%) and higher multi-channel revenue streams.

The first step is to examine in a bit more detail why people participate in digital shopping and what it means for the retail experience in its totality.

  • Social network: When shopping is done with others, as a family or with a friend, it is as much about establishing social bonds and the outing itself as it is about fulfilling specific needs. It doesn’t matter if the shopping happens in a physical location, in virtual space or in a blending of the two. Shopping has replaced the park, the lake, etc. Retail spaces and social media spaces that encourage people to interact both with each other and with the brand lead to a greater sense of belonging and reinforce the roles people have adopted for that shopping excursion. For example, placing small sweets throughout a lingerie store (returning to our bra example) increases the sense of romanticism and allows people to “play” to the underlying storyline the shopper and her counterpart are seeking. Add to this the ability to share that experience with others and it becomes more real, more meaningful. That in turn builds both interest and loyalty amongst your shoppers.
  • Entertainment and gaming: The store is akin to a stage, a field on which we play games. The same is true in social media. People assume roles which they use to create a game-like environment, one-upping others and competing for cultural, psychic and monetary capital. Even without direct associations with a specific story line, a retail space and the social media environment should still conform to some very basic principles: namely escape, fantasy, and inclusion. The total experience speaks to cultural and psychological triggers of enjoyment and participation. People create memories within places when storylines develop and form personal connections. The stronger the connection, the more likely they are to frequent the space and to buy. A good brand needs to create a shared identity, connecting the company and the shopper by developing clear imagery and displays that create the sense that there is a narrative behind the façade.
  • Rewards as social influence: Rewards and bonuses are about more than getting goods for cheap. The underlying motivations are largely drawn from the need to attain a sense of mastery that isn’t too far removed from the pleasure our ancestors derived from the hunt. Not only do you get the good deal, but your sense of self-worth and accomplishment is inflated. Going beyond the need for mastery is the pride derived from demonstrating to the world that you are skilled. You gain influence and cultural capital. Add to the mix the element of social media, mobile social media more precisely, and the validation you receive is immediate and more expansive. The entire world shares in your success and you gain a degree of prestige that is tied to the exact moment of shopping, not felt as an afterthought. The result is that the brand, the store and the online presence become an integrated experience that is far more powerful for the shopper.

The trick for retailers is determining the proper mix of each of these elements to create the ideal shopping experience for their brand. In the end, retail shopping is becoming more complex. With the increased use of online shopping and the ease of access to more and more locations, people are making choices based on underlying desires, not just functional needs. Anything a retailer can do to improve the experience is a key differentiator. Differentiate your store and you increase loyalty and sales.

From Personas to Stories: Creating Better Tools for Design and Marketing

Design ethnography takes the position that human behavior and the ways in which people construct meaning in their lives are contextually mediated, highly variable and culturally specific. The central premise of ethnography is that we must first discover what people actually do and why they do it before we can translate their actions and behaviors into design changes or innovation. The ultimate goal is to uncover pertinent insights about a population’s experience and translate their actions, goals, worldview and perspectives as they directly relate to a brand, object or activity, and the role these pieces play in interactions with their environment. Often the information results in a large-scale, broad document, but it also often results in the development of personas.

The idea is that personas bring customer research to life and make it actionable, ensuring that a design or marketing team makes the right decisions based on the right information. The approach to persona development typically draws from both quantitative and qualitative tools and methodologies, but because of the very personal nature of ethnography, that methodology often leads the charge. Ethnographic research helps in the creation of a number of archetypes (fictions, in the most positive sense) that can be used to develop products that deliver positive user experiences. Personas personalize the information and allow designers and marketers to think about creating for specific individuals.

But there are problems with personas. Don’t get me wrong, I believe personas can be useful and help design teams. But I also believe they can reduce the human condition to a series of attributes and lose the spirit of what personas are designed to do. First, in terms of scientific logic, because personas are fictional, they have no clear relationship to real customers and therefore cannot be considered scientific. So much for the science.

In practical implementation, personas often distance a team from engagement with real users and their needs by reducing them to a series of parts. The personas then do the opposite of what they are intended to do, forcing design teams down a path that gives the illusion of user-centricity while actually reflecting the interpretations of the individual designers. Creating hypothetical users with real names, stories and personalities may seem unserious and whimsical to some teams within an organization and be, consequently, dismissed as so much fluff. But by far the biggest problem, at least to my way of seeing things, is that while we want personas to humanize potential customers and users, we in fact reduce them to objects: a laundry list of actions, personality quirks and minimalist descriptions.

I’m not advocating the dismissal of personas, but I am suggesting that perhaps there are alternatives. One place to start is to admit we are writing fiction when we construct these tools and to expand upon that notion. We should be adding humanistic narratives to the mix – customer novellas, so to speak – rather than relying exclusively on a dry report or a poster with a list of attributes. It requires more time and effort, both on the part of the person or people creating them and on the part of those using them, but it also gives greater depth and insight into the needs, beliefs and practices of the people for whom we design and to whom we market. In this model, the idea is to create a short story in which actors (the eventual personas) engage with each other, a wider range of people, and a range of contexts. Doing so allows us to see interactions and situations that lead to greater insights. It allows us to look at symbolic and functional relationships and tease out elements that get at the heart of the fictional characters we create.

Why is that important? Because it does precisely what personas are meant to do but typically fail at – provide depth and characterization, establish a sense of personal connection between designers and users, and provide breakthrough insight and inspiration. Anyone who has read history alongside historical novels is familiar with the idea. It is easy to reduce Julius Caesar to a series of exploits and personality traits, but in doing so we lose the feel for who the man was. A historical novel, in contrast, adds flavor by injecting conversation, feelings, motivations and interactions. We walk away with a feeling for who he was and what effect he had on others, good and bad.

Imagine developing a persona for Frodo from The Lord of the Rings. We could say the following and attribute it to all Hobbits: Frodo is enamored of adventure but frightened by it. He loves mushrooms, has no wife, is extremely loyal to his friends and will work at any task he is given until it is done, regardless of the difficulty or potential for personal harm. He disdains shoes and has a love of waistcoats.

There’s nothing wrong with this description, but for anyone who has read the trilogy or even seen the movies, the shortcomings are obvious. We miss the bulk of Frodo’s personality. In exploring the novel, we come to a rich understanding of Frodo: a deep understanding of his motivations and personality and his relationship with the other members of the party, including the Ring.

For the literalists out there, I am not suggesting we create anything as vast as a novel, particularly one as expansive as The Lord of the Rings, but I am suggesting that we move beyond attributes and create stories that more fully develop the people behind the personas. Several pages of engaging writing are sufficient. Not only does this provide deeper insights, it engages readers more fully, inspiring them to go beyond the “data” and explore a wider array of design, brand and marketing options. Again, it isn’t meant to replace personas (or the research report), but to add to them. It requires more effort and time on the part of the person creating it as well as the person consuming it, something people are often disinclined to give, but the end result is better design, greater innovation and a more complete vision of what could be.