Storytelling, Presenting and Getting Past the Stick in Your Bum

The other day I was thinking about how to present findings to a client about what was, frankly, a seemingly dry subject. Numerous stakeholders would be involved, ranging from the CMO down to brand managers, product engineers, etc. So, knowing I had a dry subject and a conservative audience, I decided to rethink the question a bit. Was the goal to present findings, or was it something more? The goal is ultimately to shake the client’s foundations of belief, to rattle his or her assumptions, to create a new state of awareness. Any good presentation serves to evoke a participatory feeling in the viewers and bring them into the moment of experience, compelling them to consider new ways of classifying and thinking about their world, as well as their processes. The report will come later, but the presentation is about changing minds.

That brings us back to storytelling. When we bring our research and strategic thinking to life, the story we weave is less a list of data points than an interpretation and distillation of a series of experiences. Details are selectively recounted, including all the “odds and ends that are associated with remembered events” (see Van Maanen 1988). The audience is drawn into the story created both by the author/editor and participant(s) – in other words, a good story, and a good presentation, is a shared experience, co-created in the moment. Bore the audience and there is almost no chance of effecting change. Selective packaging to exemplify generalized constructs is a standard practice. What we present needs to illustrate, provoke and elucidate. This is doubly so when addressing the needs of business and design teams with distinct, targeted problems and limited time. Our editorial choices make points clear in what might otherwise be murky waters – we make learning sexy. And that means becoming marvelous storytellers.

So what do we need to do to make a good story? First, start thinking in terms of symbols and metaphor. Stories are conveyed through language, which is by definition a symbolic system. The key to successful engagement is to move from structural aspects of a story to the symbolic, uncovering systems of meaning that resonate with clients and compel them to action. These symbolic dimensions that emerge in the narrative add value to brands by fulfilling culturally constructed concepts (quality, status, age, belonging, etc.). A brand is a signal that triggers a field of meanings in the consumer’s mind. These meanings are conveyed directly and inferentially through stories. By harnessing the symbolic power behind these meanings, strong brands move beyond the codes governing a product category and enter the personal space of the consumer.  The same holds true for the client.  Through storytelling and presentation of symbolic codes, clients move from fixating on the product line and can rethink what the brand means in a wider context.

Second, strip the presentation of text. You’re here to talk, and the image on the wall behind you is there to produce a response. Text, then, becomes a distraction unless you intend to use it as a visual manifestation of an idea (imagine a giant “NO” in lieu of something like a stop sign). The media tool we use, be it PowerPoint or something similar, is the comforting factor for audience and presenter alike, not the content. That means we can use the program for displaying images, visual cues and video, but we cannot let it become the focal point – it is like a set on which an actor performs. Don’t let it overshadow the actor.

Third, just because you’re using PowerPoint, it doesn’t mean that you can’t alter the stage. A presentation is like a play – so why not do it “in the round”? Promote physicality, discussion and direct interaction between you and the audience members. Give people small tasks throughout the presentation so that they are not passive recipients of information but co-creators. The more interaction, the more likely they will be to internalize the story you present.

Finally, have fun. It seems self-evident, but it is perhaps the hardest thing for most people to do – they may talk about it, but they can’t actually do it. Remember, your role is to produce change, not recite facts.

Defining Context

Planners, researchers and marketers increasingly think about consumers in complex ways. We understand that in a changing digital landscape, where people are dialed in 24/7, the context in which they learn and shop is incredibly important and influences what messages we deliver and how we deliver them. So, increasingly, we are thinking about what situations govern behavior and designing to fit that complexity.

We spend a great deal of time talking about context, but rarely use models to define elements of it. This is particularly true when talking about mobile devices, and it accounts for the hit-and-miss quality of most apps available on the market. It is one thing to design a usable app that conforms to human factors and cognitive requirements, but it is quite another to design a stage in an environment, or an environment itself, when there are innumerable semi-autonomous devices mediating a swirl of information. Consequently, it makes sense for us to think about how we structure context so that we can determine what exactly we can affect.

Physical Context

From the computational side of things, physical context refers to the notion of imbuing devices with a sense of “place.” In other words, devices can distinguish the environments in which they “live” at any given moment and react to them. But this is much more difficult than it at first appears. Mapping out longitude and latitude is one thing, but reacting to socio-cultural features (political, natural, social, etc.) is much more problematic. Getting beyond the demarcation of identifiable borders and structures means coming to grips with place (as opposed to space). That in turn means having to be “aware” on some level.

Think of a mall. Within that mall are hundreds of stores, each with hundreds of devices and/or nodes of information. The device now has to decode what information is most relevant to itself, what information is most relevant to the user, and how it will deliver that information. To make any app relevant in that mall, we have to think about a host of things. What competing retailer apps get precedence over others? When you receive an offer from one store, will the device “tell” other retailers in order to generate real-time counter offers? When someone else is holding your device for you (say, while trying on clothing but needing to set the iPad aside, or while your child plays Angry Birds on the couch in the evening), how will the device know what incoming content is private and what is public? How will the device communicate with a location or with other devices as it moves throughout the mall? Is it even necessary? The point is simply this: we increasingly have access to the digital landscape at all points throughout the day, and getting design right means understanding the systems in which people operate.
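The private-versus-public question above can be made concrete with a small sketch. Everything here is hypothetical (the `Message` type, the `should_display` rule, the retailer names are illustrations, not a real API); it simply shows the kind of routing decision a device would have to make when someone other than the owner is holding it.

```python
# Hypothetical sketch of the routing problem described above: deciding
# whether to surface incoming content based on who is holding the device
# and how private the content is. All names are illustrative.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str   # e.g., a retailer in the mall
    privacy: str  # "public" or "private"

def should_display(msg: Message, holder_is_owner: bool) -> bool:
    """Show private content only when the owner is holding the device."""
    if msg.privacy == "private" and not holder_is_owner:
        return False
    return True

# A private coupon is suppressed while a child is holding the iPad...
print(should_display(Message("shoe_store", "private"), holder_is_owner=False))  # False
# ...but a public store announcement still gets through.
print(should_display(Message("food_court", "public"), holder_is_owner=False))   # True
```

Even this toy rule hides a hard problem the text raises: the device first has to *know* who is holding it, which is a sensing and inference question, not a lookup.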

Device Context

Just as various kinds of sensory apparatus (GPS receivers, proximity sensors, etc.) are the means by which mobile devices will become geographically aware, another class of sensors makes it possible for devices to become aware of each other. There is a fundamental difference between the ability to transmit data between devices and the ability (and desire) of devices to discover each other. And this presents a series of problems that are different in nature from those of physical context, because it deals with choices of communication.

We are on the verge of existing in a world with zero-infrastructure networks that can spring up anywhere, anytime. That means that devices are in a potentially constant state of discovery. Returning to the mall for a moment, imagine that you are with a friend whose device is communicating with yours. In the mall are a couple of thousand devices, all of which are discovering each other. What happens now? Assuming we’ve dealt with the problem of my mobile phone communicating with my friend’s phone while blocking out the other 2,000 devices, we still have several thousand potential “identities” that may have useful information for us. How do we manage that without devoting a ridiculous amount of time to setting up the hundreds of variables that shape what we do and don’t want at any given time? Perhaps more importantly, how do we develop a process to manage it that mimics, or at least complements, the human brain and cultural patterns of behavior? All this is couched in a neat little world defined within a single, bounded geographical unit. So understanding device context is as important as understanding physical context.
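One naive way to frame the filtering half of that problem is as a ranking rule: trusted contacts first, then devices broadcasting something the user has declared an interest in, everything else dropped. The sketch below is an assumption, not a real discovery protocol (the dictionary fields and the two-tier heuristic are invented for illustration); the text's deeper point is that hand-maintaining such preference lists is exactly what does not scale.

```python
# A minimal sketch (names and heuristics are assumptions, not a real
# protocol) of the filtering problem above: out of thousands of
# discoverable devices, keep the few worth the user's attention.

def filter_discovered(devices, trusted, interests):
    """Rank nearby devices: trusted contacts first, then devices
    broadcasting a service matching the user's stated interests;
    everything else is ignored."""
    kept = []
    for dev in devices:
        if dev["id"] in trusted:
            kept.append((0, dev))            # highest priority: a friend's phone
        elif dev.get("service") in interests:
            kept.append((1, dev))            # a relevant retailer kiosk, etc.
    return [dev for _, dev in sorted(kept, key=lambda pair: pair[0])]

nearby = [
    {"id": "friend-phone", "service": None},
    {"id": "kiosk-17", "service": "coffee"},
    {"id": "stranger-laptop", "service": "printing"},
]
# Keeps friend-phone and kiosk-17; drops the stranger's laptop.
print(filter_discovered(nearby, trusted={"friend-phone"}, interests={"coffee"}))
```

A rule like this handles the 2,000-device mall only if the `trusted` and `interests` sets maintain themselves, which is where the "mimic the human brain and cultural patterns" question comes in.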

Information Context

This is the realm of information architecture, plain and simple. But with the advent of pervasive mobile, the topic is becoming even more complex. Specifically, data no longer resides, literally or figuratively, “in” our computers. Our devices are extensions of the cloud and exist as something akin to perceptual prostheses. They exist to manipulate data in the same way a joystick allows us to handle the arms of a robot in a factory. This matters because it reflects a shift in how we think about and use information: all information (and the apps that carry that information) is transitory and, by and large, public.

This changes the nature of what the device has to actually be. Storage issues are essentially removed from the equation. Content can leap from place to place and device to device in an instant. All content will be customizable and will reflect the human-application interaction rather than shaping it. This leads to the point that devices, and the people who use them, will find themselves in the fourth kind of context, that of social interaction, with all its peculiarities and contingencies. Just as our behavior and worldview shape and are shaped by the moment in which we find ourselves, so too will our apps and information need to adapt to the moment. In other words, devices will need to be more human.

Socio-Cultural Context

Humankind is riven with contrasting practices, cultures, tongues, traditions and worldviews. A cultural context may exist on levels as diverse as a workplace, a family, a building, a city, a county, a state, a nation, a continent, a hemisphere, etc. A cultural context provides a shared understanding of meaning and a framework for what “works” in the world. It is what helps you recognize “your kind” in all senses of the word.

And it is at the point of socio-cultural understanding where we gain a better perspective on what will and will not be accepted in the mobile universe. We need to understand the essence behind the veil of design and usage to uncover meaning. Take the beer-pouring app as an example. Here we have a simple app that mimics the pouring of a beer when you tilt your device. On the surface it has little relevance to our daily lives. It serves no direct function, and yet it has been tremendously successful because of the cultural needs to which it speaks – workplace breaks from the mundane, the ability to show off the newest thing, male-to-male bonding, etc. Its absurdity is precisely what makes it relevant. But in another context, say Saudi Arabia, the context shifts and meaning must change to fit that particular milieu.

The nature of our successes lies in understanding the reasons behind our beliefs and actions, in the symbolic exchanges we are part of, and in our ability to code and decode those symbolic exchanges. The nature of our mistakes essentially lies in a lack of comprehension. It leads to UI and app development that speak to a minority of the population even as they try to sell to the masses. Without understanding the underlying epistemological constructs of a group (or more accurately, a mix of often associated groups at different points of interaction and interpretation), we miss opportunities.

So What?

So why does any of this matter?  It matters because good design and messaging are increasingly difficult to master.  Our great technological leaps forward have also produced more complexity, which in turn leads to a greater need to make sense of what is “going on” in the broadest sense of the term when it comes to gathering insights and translating them into design and business applications. Without a means by which to categorize context, we can’t isolate those things that matter most and we miss enormous opportunities. So how do we get at underlying contexts? To be perfectly blunt, there is no perfect system because contexts change if we’ve done our jobs well (cause and effect), but there are ways to come close. Depending on the project, questions may be very tactical and specific or very strategic and broad. In either case, the first step is to clearly articulate what the overarching goal is.

First, rethink the problem. Frequently, what we see as the problem is in fact a facet of something else. For example, when researching something like an eBook, the problem to be solved isn’t technology; it may be understanding why people read different material in different contexts. It may be about displaying books for colleagues and friends as a means of gaining status. The point is that the problem we see may not be the problem at all, and we need to think about possibilities before we enter the field.

Second, begin defining the contexts.
Where does an activity or practice take place? Defining the contexts we want to examine helps articulate the range of possibilities for observation. For example, if we’re studying beer drinking, we need to articulate all the possible contexts in which beer is purchased and consumed.

Third, think through the complexity of the sample.
Who are the people we want to talk with? What are the social and cultural circles that will shape the event? It isn’t enough to define a demographic sample; you need to think in terms of cultural, social, professional and environmental systems, determining not only who will be the primary participants, but also the actors that shape the context.

Fourth, make a game plan that involves direct experiential information gathering; don’t just dig into statistics. Put together a guide to help navigate the data collection and a method for managing the data (remember, everything is data and it is easy to become overwhelmed without a plan). Having a series of key questions and observational points to explore is the first component. Don’t just think about the questions you will ask; also include opportunities for observation, mapping, and participation.

Fifth, head into the field.
This is the heart of the process. Meaningful insights and moments of “truth” are slow to get at. Low-hanging fruit will be easy to spot, but the goal should be to find those deeper practices and meanings. Because everything is data, from attitudes to mannerisms to artifacts, it is important to capture as much as possible. Take notes, draw maps and sketches, take photographs, shoot video, and collect audio – the smallest piece of information may have the greatest impact.

Sixth, do the analysis. Hands down, analysis is the most difficult, but also the most rewarding, part of research. A trained ethnographer, for example, will do more than report anecdotes. A trained ethnographer will bring a deep understanding of culture and social theory to the analysis process. This goes beyond casual observation and starts to pull together the web of significances and practices that get to the underlying structures of why people do what they do. Analysis should always work within a framework grounded in the social sciences. Analysis takes time, but the results will include modes of behavior, models of practice, experience frameworks, design principles, and cultural patterns. Once the data has been analyzed and crafted into something meaningful, the research team should be able to provide a rich story with a clear set of “aha” findings.

Finally, it isn’t enough to simply hand off results. As compelling as we may find our insights, that doesn’t always translate into someone seeing immediately how to apply them. Once insights and findings are shared, you need to work with others to craft those findings into action plans, product ideas, etc.

The end result is that you create greater value for the client and for yourself. The process is, admittedly, more time consuming than traditional approaches, but it ultimately yields greater insight and reduces time and costs on the back end. It also yields better work that will impact the customer or end user more significantly. 

Context and the Changing Mobile Landscape

Marketers increasingly think about consumers in complex ways. It is understood that in a changing digital landscape, the context in which they learn and shop influences what messages we deliver and how we deliver them.  But we rarely define “context”. It is one thing to design a usable app that conforms to human factors and cognitive requirements, but it is quite another to design a stage in an environment when there are innumerable semi-autonomous devices mediating in a swirl of information.

Physical Context

Physical context refers to the notion of infusing devices with a sense of “place.” In other words, devices can distinguish the environments in which they “live” and react to them. But this is difficult. Mapping out longitude and latitude is one thing, but reacting to features (political, natural, social, etc.) is much more problematic. Getting beyond the boundaries of identifiable borders and structures means coming to grips with “place”.

Think of a mall.  There are hundreds of stores, each with hundreds of devices. The device now has to decode what information is relevant and how it will deliver information. What competing retailer apps get precedence over others? When you receive an offer, will the device “tell” other retailers in order to generate real-time counter offers? The digital landscape is continuous at all points throughout the day and getting design right means understanding the systems in which people operate.

Device Context

Just as various kinds of sensory apparatus (GPS-receivers, proximity sensors, etc.) are the means by which mobile devices will become geographically aware, another class of sensors makes it possible for devices to become aware of each other. This presents a series of problems that are different than those of physical context.

We are on the verge of a world with zero-infrastructure networks that can spring up anywhere, anytime. Devices will exist in a constant state of discovery. Returning to the mall, imagine that you are with a friend whose device is communicating with yours. In the mall are a couple of thousand devices, all of which are discovering each other. What happens now? Assuming we’ve dealt with the problem of one friend’s device communicating with the other friend’s device while blocking out the other 2,000 devices, you still have several thousand potential “identities” that may have useful information. How is it decided what to manage without devoting significant time to setting up the hundreds of variables?

Information Context

This is the realm of information architecture. Data no longer resides “in” our computers. Devices are extensions of the cloud and exist as something akin to perceptual prostheses. They exist to manipulate data in the same way a joystick allows us to handle the arms of a robot in a factory. This reflects a shift in how we use information because all information is transitory.

Storage issues are essentially removed from the equation. Content can leap from place to place and device to device in an instant. Content will be customizable and will reflect the human-application interaction rather than shaping it. Devices will find themselves in the fourth kind of context, that of social interaction, with all its contingencies. Just as behavior is shaped by the moment, so too will apps and information need to adapt.

Socio-Cultural Context

Each person is shaped by contrasting cultures, tongues, traditions and worldviews. A cultural context may exist on levels as diverse as a workplace, a family, a building, a county, a continent, a hemisphere. Cultural context provides a framework for what “works” for each consumer in the world.

It is at this point where a better perspective is gained on what will and will not be accepted in the mobile universe. Take a beer-pouring app that mimics the pouring of a beer when the device is tilted. It serves no direct function, and yet it has been successful because of the cultural needs to which it speaks – workplace breaks, male-to-male bonding, etc. But in another context, say Saudi Arabia, the context shifts. Success lies in understanding the reasons behind the consumer’s beliefs and actions in the symbolic exchanges, and the ability to code and decode those exchanges. Marketing mishaps come from a lack of comprehension.

So What?

Our great technological leaps forward have also produced more complexity, leading to a greater need to make sense of insights. Without a means to categorize context, marketers will miss identifying trends that matter most. What to do?

  • Rethink the problem. Frequently, “the problem” is a facet of something else. For example, when researching an eBook the problem to be solved isn’t technology, it is understanding why people read different material in different contexts. It may be about displaying books as a means of gaining status. The point is the problem seen may not be the problem at all.
  • Define the contexts. Defining the contexts helps articulate the range of possibilities for observation. For example, if the consumer behavior is drinking beer, all contexts in which beer is purchased and consumed need to be articulated.
  • Think through the sample. Whom is the marketing targeting? What are the social circles that will shape the event? It isn’t enough to define a demographic sample; you need to think in terms of cultural systems.
  • Make a plan that involves experiential information gathering, not just statistics. Develop a guide to navigate the data collection and a method for managing the data (everything is data). Don’t just think about the questions to ask, but also include opportunities for observation and participation.
  • Head into the field. This is the heart of the process. Meaningful insights and moments of “truth” are slow to get at. Low-hanging fruit will be easy to spot, but the goal should be to find those deeper meanings. Because everything is data, from attitudes to artifacts, it is important to capture as much as possible.
  • Do the analysis. Analysis is the most difficult, but also the most rewarding. The goal is to bring a deep understanding of cultural behavior to the analysis process. This goes beyond casual observation and gets to the underlying structures of why people do what they do.

The process is more time consuming than traditional approaches, but it ultimately yields greater insight and reduces time and costs on the back end. The end result is that you create greater value for the client and for the company.

Anthropology and Usability: Getting Dirty

There are significant methodological and philosophical differences between ethnographic processes and laboratory-based processes in the product development cycle. All too frequently, proponents of these data collection methods are set at odds, with members on both sides pointing fingers and declaring the shortcomings of the methods in question. Methodological purity, ownership and expertise are debated, with both ends of the spectrum becoming so engrossed in justifying themselves that the fundamental issue of product development is compromised: namely, will the product work, in the broadest sense of the term? One side throws out accusations of a lack of measures and scientific rigor. The other side levels accusations about the irrelevance of a sterile, contextually detached laboratory environment. At the end of the day, both sides make valid points, and the truth, such as it is, lies somewhere between the two extremes in the debate. As such, we suggest that rather than treating usability and exploratory work as separate projects, a mixed approach be used.

So why bridge methodological boundaries? Too frequently, final interface design and product planning begin after testing in a laboratory setting has yielded reliable, measurable data. The results often prove or disprove the functionality of a product and flag any errors that may take place during task execution. Error and success rates are tabulated and tweaks are made to the system in the hopes of increasing performance and/or rooting out major problems that might delay product or site release and harm user satisfaction. The problem is that while copious amounts of data are produced and legitimate design changes ensue, they do not necessarily yield data that are valid in a real-life context. The data are reliable in a controlled situation, but may not be valid when seen in context. It is perfectly possible to obtain perfect reliability with no validity when testing. But perfect validity would assure perfect reliability, because every test observation would yield the complete and exact truth. Unfortunately, neither perfection nor quantifiable truth exists in the real world, at least as it relates to human performance. Reliable data must be supported with valid data, which can best be found through field research.
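The reliability-without-validity point can be shown with a toy simulation. The numbers below are invented for illustration (they are not from any study in the text): the lab measurements are tightly clustered, so the test is highly repeatable, yet they are biased well away from the real-world task time; the field measurements are noisier but centered on the truth.

```python
# A toy illustration (invented numbers) of the reliability/validity
# distinction: a lab measure can be perfectly repeatable yet wrong
# about real-world performance.

import statistics

true_field_time = 12.0  # seconds the task really takes in context

# Lab measurements: tightly clustered (reliable) but biased low,
# because the lab removes real-world stressors.
lab = [8.1, 8.0, 8.2, 8.1, 8.0]

# Field measurements: noisier, but centered on the truth (valid).
field = [11.2, 13.1, 12.4, 10.9, 12.6]

print(statistics.stdev(lab) < statistics.stdev(field))    # True: lab is more reliable
print(abs(statistics.mean(lab) - true_field_time) >
      abs(statistics.mean(field) - true_field_time))      # True: field is more valid
```

Optimizing against the lab numbers alone would "improve" a measurement that misses the behavior of interest, which is exactly the argument for pairing lab data with field data.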

Increasingly, people have turned to field observations as an effective way of checking validity.  Often, an anthropologist or someone using the moniker of “ethnographer” enters the field and spends enough time with potential users to understand how environment and culture shape what they do.  Ideally, these observations lead to product innovation and improved design.  At this point, unfortunately, the field expert is dropped from the equation and the product or website moves forward with little cross-functional interaction. The experts in UI take over and the “scientists” take charge of ensuring the product meets measures that are, often, somewhat arbitrary.  The “scientists” and the “humanists” do not work hand in hand to ensure the product works as it should in the hands of users going about their daily lives.

Often the divide stems from the argument that the lack of a controlled environment destroys the “scientific value” of research (a similar argument is made over the often small sample sizes), but by its very nature qualitative research always has a degree of subjectivity. To be fair, small performance changes are sometimes given statistical relevance when they should not be. In fact, any and all research involves degrees of subjectivity and personal bias. We’re not usually taught this epistemological reality by our professors when we learn our respective trades, but it is true nonetheless. Indeed, the history of science offers countless examples of hypothesis testing and discovery that would, if we applied the rules of scientific method used by most people, be considered less than scientifically ideal. James Lind’s discovery of the cure for scurvy and Henri Becquerel’s discovery of radioactivity serve as two such examples. Bad science from the standpoint of sample size and environmental control; brilliant science if you’re one of the millions of people to have benefited from these discoveries. The underlying assumption is that testing can exist in a pure state and that testing should be pristine. Unfortunately, if we miss the context, we usually overlook the real problem. A product may conform to every aspect of anthropometrics, ergonomics, and established principles of interface design. It may meet every requirement and have every feature potential consumers asked for or commented on during the various testing phases. You may get an improvement of a second in reaction time in a lab, but what if someone using an interface is chest deep in mud while bullets fly overhead? Suddenly something that was well designed in a lab becomes useless because no one accounted for shaking hands, a decrease in computational skills under physical and psychological stress, or the fact that someone is lying on their belly as they work with the interface.
Context, and how it impacts performance with a web application, software application, or any kind of UI now becomes of supreme importance, and knowing the right question to ask and the right action to measure become central to usability.

So what do we do? We combine elements of ethnography and means-based testing, of course, documenting performance and the independent variables as part of the evaluation process. This means detaching ourselves from a fixation on controlled environments and the subconscious (sometimes conscious) belief that our job is to yield the same sorts of material that would be used in designing, say, the structural integrity of the Space Shuttle. The reality is that most of what we design is more dependent on context and environment than on being able to increase performance speed by 1%. Consequently, for field usability to work, the first step is being honest about what we can do. A willingness to adapt to new or unfamiliar methodologies is one of the principal requirements for testing in the field, and one of the primary considerations in determining whether a team member should be directly involved.

The process begins with identifying the various contexts in which a product or UI will be put to use. This may involve taking the product into participants’ homes and having them use it with all the external stresses going on around them. It may mean performing tasks as bullets fly overhead and sleep deprivation sets in. The point is to define the settings where use will take place, catalog stresses and distractions, then learn how these stresses impact performance, cognition, memory, etc. For example, if you’re testing an electronic reading device, such as the Kindle, it would make sense to test it on the subway or when people are lying in bed (and thus at an odd angle), because those are the situations in which most people read; external variables are then included in the final analysis and recommendations. Does the position in bed influence necessary lumens or button size? Do people physically shrink in on themselves when using public transportation, and how does this impact use? The idea is simply to test the product under the lived conditions in which it will find use. Years ago I did testing on an interface to be used in combat. It worked well in the lab, but under combat conditions the interface was essentially useless. What were seemingly minor issues dramatically changed the look, feel, and logic of the site. Is it possible to document every variable and context in which a product or application will see use? No. However, the bulk of these situations will be uncovered. And those which remain unaddressed frequently produce the same physiological and cognitive responses as the ones that were uncovered. Of course, we do not suggest foregoing measurement of success and failure, time on task, click path or anything else. These are still fundamental to usability. We are simply advocating understanding how the situation shapes usability and designing with those variables in mind.

Once the initial test is done, we usually leave the product with the participant for about two weeks, then come back and run a different series of tests.  This allows the testing team to measure learnability as well as providing test participants time to catalog their experience with the product or application.  During this time, participants are asked to document everything they can about not only their interaction with the product, but also what is going on in the environment.  Once the research team returns, participants walk us through behavioral changes that have been the result of the product or interface.  There are times when a client gets everything right in terms of usability, but the user still rejects the product because it is too disruptive to their normal activities (or simply isn’t relevant to their condition).  In that case, you have to rethink what the product does and why.

Finally, there is the issue of delivering the data.  Nine times out of ten, the reader is looking for information that is quite literal and instructional.  Ambiguity and involved anecdotal descriptions are usually rejected in favor of what is more concrete. The struggle is how to provide this experience-near information.  It means doing more than providing numbers.  Information should be structured so that each "theme" is easily identifiable within the first sentence.  More often than not, specific recommendations are preferred to implications and must be presented to the audience in concrete, usable ways.  Contextual data and its impact on use need the same approach.

A product or UI design’s usability is only relevant when taken outside the lab.  Rather than separating exploratory and testing processes into two activities that have minimal influence on each other, a mixed field method should be used in most testing.  In the final analysis, innovation and great design do not stem from one methodological process, but a combination of the two.

Taking Clients Along for the Ride

In the last few years, ethnography has shifted from a novel and often misunderstood methodology to a do-it-or-die necessity in many marketers' and product designers' tool kits. The idea of ethnography has a logical appeal for business clients: market intelligence born from the homes and hearts of customers. It's an ethnographer's job to talk to and observe people as they go about their daily routines, using methods from sociology and anthropology for data collection and analysis – giving clients true-to-life, informed insights and a firsthand understanding of their customers.

Perhaps naively, many ethnographers assumed when we learned our trade that we would work in a vacuum. We'd go into the field – people's homes, workplaces, and leisure areas – and then report to clients what we learned. However, we soon realized that some clients take us literally when we say ethnography will bring them into their customers' homes. They aren't always satisfied with just overseeing the project or telling us what they want to learn and why. This is a great opportunity for clients to see customers using their products in real situations and a chance to get to know the customers personally. But it presents ethnographers with certain challenges.

Involvement Risks

Ethnographers tread delicately. Every time we perform fieldwork we need to become instant friends with participants. We need them comfortable enough to behave “normally” while we point a camera at them, and to feel that they can tell us anything – even if they’re just talking about peanut butter. The field is spontaneous and sensitive, and anything can happen. That means making sure we and our clients do all we can to ensure that the field remains as natural as possible.

Clients have varying levels of fieldwork experience. Some are qualitative market researchers with a little in-context interviewing under their belts, and others don’t have much first-hand knowledge of qualitative research or the human sciences. Consequently, clients might interfere with the interview process, misinterpret the data, or overlook important but subtle information. However, ethnographers can take steps to mitigate these concerns.

1. Explore Motives

Understand why clients need to go into the field and what their expectations of the project are. Do they want direct exposure to generate ideas, ease issues of trust/competency/legality, train their in-house ethnographer, or simply be more engaged in the process? For the sake of both the research and the client-ethnographer relationship, articulating these issues is essential.

It’s paramount that clients communicate goals for a smooth operation. On one occasion, a busy client of ours wanted to see his products used in context, so he attended two field visits early in the project. Knowing his reason and planned number of outings, we ensured they’d include use of his products. Everything went well, and his observations were eye-opening. Because he didn’t have time to invest in more fieldwork, we sent him a video document every time someone used his products during the project.

2. Establish Boundaries

Before fieldwork, ethnographers must communicate the research boundaries and client role. Clients should recognize that ethnographers’ expertise consists of more than an ability to build rapport with strangers; their skills are rooted in a keen understanding of social theory and methodological rigor, and entail years of training.

Ethnographers have a process and particular mindset that directs the interview, interaction, and interpretation, so guiding client input before starting a project will help prevent everyone from asking leading questions or biasing conversations. Limits ensure quality work and allow clients to make the most of a field visit.

Boundaries also permit clients to function within a frame of hierarchical authority, lessening their need to act as project leader. In other words, clients understand that the field context reduces or removes a layer of authority. It lets them focus on learning and executing predetermined tasks instead of feeling compelled to handle everything. They can filter information through a training perspective while taking a holistic approach.

3. Allocate Responsibilities

Giving clients an indispensable role in the project, such as videotaping an interview, helps them feel more like team members and less like visitors. It also raises the comfort level of everyone involved. Assigning tasks is also a practical necessity: clients can replace research assistants in the field. Two researchers plus a client can crowd and intimidate a participant, who just wants to demonstrate the best way to clean a bathroom countertop.

4. Encourage Reciprocation

It’s important to know clients well and be thoughtful about their flexibility, political realities, and character traits. Unfortunately, there often isn’t enough time to do so in depth. Clients might arrive a half-hour early for an afternoon interview and leave that evening, never to go into the field again. In this case, an ethnographer can only outline some expectations and techniques – through phone and e-mail conversations beforehand, and on the spot (frequently while sitting on cushy hotel-lobby chairs).

When clients have more time to invest in the ethnography, there are two parts to building a solid team and guaranteeing productive fieldwork (despite their lack of experience). Clients must be willing to adapt to new or unfamiliar methodologies – techniques for data gathering and interpretation – regardless of their backgrounds (e.g., design, business strategy, engineering). And ethnographers must appreciate and incorporate clients’ theoretical and practical contributions. Success requires devoting time and energy to discovering the capabilities of all the team members – ethnographer and client alike.

Each team member can learn to apply findings across a range of activities. After all, a key to business achievement is using seemingly disconnected information to build new products, brands, and business models. Learning how best to conduct research and understanding individual roles in the field ultimately helps the client use the gathered information most effectively.

Protection and Collaboration

As ethnography becomes a staple of market research, we just might see marketers and product designers make an exodus to the field – with or without us. Ethnographers need to prepare for the possible outcomes. They should do so by not only preventing research from being disturbed, but also by harnessing clients’ intelligence and know-how – using their involvement as a springboard for more effective and actionable ethnography. In the future, most marketing decisions and product innovations will be based on real-world experiences with ordinary people.

Snack Time

In its simplest definition, a snack is a small portion of food meant to hold one over between meals. A meal, in contrast, typically comprises multiple items, has higher caloric content, and is usually tied to rituals of time and location.

Historically, snacks were prepared from ingredients commonly available in the home. That has changed considerably over time; the norm today is pre-made foods that are conveniently packaged and seemingly last forever.

But snack foods are not just treats anymore. They have become part of the larger ingredient mix along with potatoes, carrots, or butter. Frito Pie is on the menu alongside the $25 dish of shrimp etouffee. This may not seem important to the producer as long as products are selling at the store. But it validates a fundamental element of consumer behavior – the end user decides how to use any product he or she purchases. The challenge for the producer is to recognize the innovative ways consumers use their products and to devise strategies that will keep the trend going.  This means understanding the underlying cultural processes that have allowed this transformation to take place and how to capitalize on them in order to grow sales.

Some credit for the changing role of snack foods must, of course, go to the inventiveness of snack producers. Restaurateurs and chefs have also been, and will continue to be, tremendous influencers.  Consumers, rather than turning to manufacturer websites and cookbooks, are looking to the Food Network and local chefs not just for ideas, but also for validation of their culinary choices. Even subculture icons like Lux Interior of The Cramps (a rockabilly/punk fusion band founded in the 1970s) have helped shape the use of snacks in cooking – Mr. Interior had a deep penchant for Doritos Quiche.

To be sure, the snack is the inspiration. We see evidence to support this notion starting back in the 1950s with the introduction of recipe ideas for everything from corn flakes to Cheetos. But what accounts for the resurgence of using snacks in cooking in an age dominated by “healthy” foods, “quality” ingredients, and haute cuisine in the home? And what does this mean for a marketer or product development team? The simple answer is that by understanding the deeper issues driving the transformation of how snack foods are used, it is possible to innovate better and drive sales over time. We have identified several areas that deserve special attention.

Snacks as Symbols

Meaning is produced and reproduced within a culture through various practices, phenomena and activities that serve as systems. Rituals associated with food represent a deeply ingrained structure by which meaning is propagated within a culture. In other words, a potato chip is more than food; it is representative of childhood memories, concepts of being a good or bad parent, regional affiliation and other symbolically charged concepts.

The brand itself is equally symbolically charged. This explains why a generic brand of corn flakes to top your tuna casserole may not be “good enough”; only Kellogg’s communicates that the cook cares enough about the people eating it. This also explains, in part, the reluctance of many to buy store-branded products (although other factors come into play, as we see, for example, in times of economic crisis).

Flavor is less the issue than the need to create a dish that fits within the symbolic framework in which it is constructed and consumed. The implication is that recipe ideas aren’t enough. These ideas must be tied to richer symbols. Package design, shelf positioning, etc. must all reflect greater symbolic structures and lead to the construction of new and unique traditions that work within the existing framework.

The Invention of Tradition

Traditions exist to preserve a wide range of commonly held ideas, practices and methods used by distinct populations. Food joins other elements like music, folklore and clothing to create culture. Beliefs or customs are taught by one generation to the next and actions are reinforced over time. The preservation of culture, however, becomes much more difficult in a postmodern world.

Through the emergence of tribal subcultures, along with the ease and means to communicate and cross-pollinate, we see many people using brands as badges of affiliation. In practice, people are “inventing” tradition by endowing products with rich symbolic meaning. Product, therefore, becomes a means by which people artificially establish a past and validate identity in the present. Mom may have never actually made Frito Pie, but it helps the consumer maintain a sense of identity to believe that she could have.

Food as Novelty and Play

Finally, using snack foods as ingredients speaks to the very basic need to invent and play. Using snack foods in a way different from their “intended purpose” is novel. At a psychological level, novelty speaks to four principal elements:

  1. Thrill Seeking: the pursuit of activities and objects that are exciting, unusual and potentially dangerous.
  2. Experience Seeking: the pursuit of unfamiliar and complex environmental stimuli, as through cooking.
  3. Disinhibition: Sensation-seeking through engagement with other people; searching for opportunities to lose inhibitions by engaging in variety in food, sex, alcohol, etc.
  4. Boredom Susceptibility: the tendency to be easily bored by familiar or repetitive situations or people, or by routine work.

Beyond the sensory benefits of novelty, there is the need to use experimentation as a means of establishing cultural capital. Snack foods have become a means by which people not only attain psychological stimulation but also display to friends and loved ones that they are inventive and interesting.

Implications

It may be interesting, but what does it all mean? Simply put, it means that whoever can tap into these unconscious motivations, symbols, and practices can increase sales, grow customer loyalty and develop brands that are synonymous with enjoyment. We often interpret our products through a self-limiting, narrow focus. Understanding snack foods from the vantage point of “ingredient” opens a new series of delivery systems, product possibilities and messaging strategies.

After all, the customer will always decide how to use your product.

Innovation Is Creative Thinking With Purpose

Innovation is creativity with a purpose. It is the creation and use of knowledge with intent. It is not only creating new ideas but creating with a specific intention and with plans to take those ideas and make something that will find purpose in the world. Innovation is ideas in action, not the ideas themselves. Innovation is also a word that gets thrown about, often without really considering the reality that it is, in fact, damn hard work. What makes it hard work isn’t the generation of new ideas, but the fact that turning complexities into simple, clear realities can be excruciatingly difficult; yet that is precisely what needs to be done to make innovation useful. Simplicity and clarity are tough to achieve.

Innovation, whether we’re talking about product design or a marketing plan, should be simple, understandable, and open to a wide range of people. Innovation is becoming more of an open process, or it should be. The days of the closed-door R&D session are gone as we incorporate more engagement from users, customers, stakeholders, subject-matter experts, and employees. Most companies are very good at launching, promoting, and selling their products and services, but they often struggle with the front end of the innovation process: the stages that deal with turning research and brainstorming insights into new ideas.  The creating, analyzing, and developing side of things is often murky or done in a haphazard way. Articulating a simple system with clearly defined activities is central to bringing innovation to life and to involving a wide variety of stakeholders and collaborators who can understand and engage in making the beginning stage of the innovation process less confused. It is as much art as it is science.

Easier said than done – you need a starting point. The simplest and most obvious element is to begin with a system of innovation best practices. Typically you would generate multiple ideas and then synthesize the relevant ones into a well-developed concept. This is the no-holds-barred side of the idea generation process and allows people to begin exploring multiple trajectories. The key is to make sure the ideas don’t remain in a vacuum, but are open to everyone. With that in mind, it is extremely important to ensure that ideas are captured and stored in one place, whether electronically or on a wall (literally) dedicated to the task. Truly breakthrough innovations are not solitary work; they are part of a shared experience in which ideas build on each other. They are the result of collaboration. This means involving others to help you generate ideas, develop concepts, and communicate those concepts in meaningful and memorable ways. The more open the process, the more likely it is to get buy-in as people engage directly in the innovation process.

Next, make sure people have access to all the information available to them. Research around a problem or a people is often lost once the report is handed over and the presentation of findings complete. Central to the success of an innovation project is to make sure themes and experiences are captured and easily available to the people tasked with generating ideas. So make it visible, make it simple and make sure people are returning to the research (and researchers) again and again. This is about more than posting personas on boards around a room. It involves thinking about and articulating cultural practices in such a way that they are visible, clear and upfront. As people think and create they should constantly be reminded of the people and contexts for which they are creating.

Once the stage is set, the problem and hoped-for outcomes need to be made clear. This is fairly obvious, but it’s easy to drift away from the goals as ideas emerge, and people have time to simply forget why we’re innovating (or attempting to innovate, at any rate). So make them real: crystallize the problems and challenges. Make them visible at every step of the process.  In addition to posting the goals, be sure to have space to pose questions grounded in the problems or opportunities for innovation. Categorize the types of questions and ask that people revisit them every step of the way to ensure the process stays on track and grounded in the goals of the project. Categories of question types to consider might include:

  • How Will This Impact the Community: How can we help people, build communities and reflect the cultures and practices for which we are designing?
  • What is the Opportunity: How can we create something that provides a better life for the intended users?
  • Is It New or Are We Simply Tweaking Something: Does the thing we’re creating change the current situation, or are we simply creating a variation on an established theme?
  • How Will It Be Interpreted: What challenges do we face in getting people to accept the concepts and what cultural or psychological barriers do we need to overcome?

These are just a few examples, but they represent some of the ideas that might emerge when thinking of new designs, models and messaging strategies. They will, of course, vary depending on the goals of the organization. If your goal is to build a new delivery system for medications or if it is to do something as broad as change the way people eat, then the questions will change. The point is to have a space that opens up the dialog, not just a space to throw out ideas.

The point to all this is that in order to innovate, you need to clarify a simple system that all the various contributors can use. Establish a system and stick to it. Identify and write down the areas you would like to innovate in, get all the parties who will contribute involved and make sure they engage in an open environment. Create questions to ask and areas of exploration. Do that and you will move from a complex mess to something that can be acted upon.

Experimenting With Ethnography

Ethnography means many things to many people these days and heaven knows I’ve spouted off about that topic on more than one occasion, so I won’t go down that path again (at least not for today). But there are underlying currents in how people define ethnography that seem to be representative of a larger degree of consensus. One of the central themes that emerges again and again is the notion of ethnographer as simple observer.  We document, we learn, and we report, but rarely do we experiment. And that is something I think needs to change.

“Experimental ethnography” emerged as a general movement in anthropology that focused on issues of representation in ethnographic writing in the aftermath of the “writing culture” critique of the 1980s. Those critiques were largely informed by poststructuralist, feminist, and Marxist assessments of the historical relativism and construction of Western sciences. Long story short, the nature of how we construct, conduct, and think about ethnographic research and representation was challenged. The primary meaning of experimental ethnography was experimentation with the writing of ethnographies and the representation of cultural worlds, traditions, and things. Interestingly enough, this is also the period when ethnographers began leaving academia for the business and design worlds in noticeable numbers. However, the notion of experimental ethnography remained largely inside academic and/or public sector fields of study.

So, traditionally, what are we talking about when we say “experimental ethnography”? Experimental ethnography is a mode of fieldwork in which given, prior, and assumed areas of knowledge are used and recirculated in fieldwork activities, dynamics, and practices.  The goal is to produce outcomes that hold direct relevance to and for the communities with which research is conducted.  From its inception, experimental ethnography thus had an affinity with applied anthropology and its goals of effecting “social change” in a community, producing knowledge for use in policy generation, or aiding communities to rediscover and revitalize aspects of their cultural traditions. Again, while these are all noble and worthy pursuits, this approach to how we gain and use knowledge remained in areas other than the private sector. And that needs to change.  Why?

Because it produces better results for our clients, plain and simple. We are here to help the people who hire us build better things. That can certainly spring from a purely observational model, indeed it frequently does, but it also limits our trajectory.  In this emergent paradigm of experimental ethnography, “knowledge” is not “tested” for truth to produce facts by a determined structure of fieldwork procedures. Rather, fieldwork practices are recombined to explore their utility through exploratory bricolage. In other words, the experimentation is not about testing but about fluid modes of acquiring knowledge and considering methods of co-constructing outputs. This exploration for utility is where a different notion of experimentality enters into play. Thinking about ethnographic fieldwork in this way allows us to incorporate techniques from various fields when working with participants in a methodologically sound way, rather than simply pulling in a range of techniques with little or no clear system or rigor.

As this model of ethnography plays out, the idea is that by engaging the participants, the designers, and the ethnographer in a dialog in the field, the participant gains both in terms of good product development and in terms of psychological investment. All parties have a direct connection to the process and therefore to the end results. It also means that the parties engaged in the fieldwork and in the creation/translation of the insights that emerge are not tied to a one-for-one trade of information. The roles are stripped bare, and researcher, designer, and participant take on a shared understanding that the intent is to create rather than to engage in a transaction of knowledge.

Of course, this means that the researcher needs to be well versed in a range of methods and nimble enough to change direction quickly. It also means letting go of the notion, a myth in fact, that purely objective observation is possible. A terrifying notion to some, no doubt, but very real nonetheless. Power, politics, environment, etc. all factor into how fieldwork unfolds. Tricking ourselves into a belief that the more removed we are, the more valid the results, is perhaps the first thing that needs to be discarded. After all, the point of ethnography is exploration and learning, not recreating in a live setting what one gets from a survey. Opening up the possibilities of an experimental approach to ethnography means opening the door to a host of outcomes that might otherwise be overlooked.

Experimental designs offer greater internal validity for learning what the effects of a social program are, and ethnographic methods offer greater insight into why the effects were produced. The prospects for such integration depend on the capacity of parties within social science to work together for the common goal of discovering insights and how to implement them.

Getting Over Ourselves: Make research meaningful

The other day I was privy to a discussion by a researcher who was decidedly upset about having to “dumb down” the research report he had completed. The client was impressed by the depth of the work, but equally frustrated with the seemingly academic language of the report and the use of jargon that was, realistically, more appropriate to anthropological circles than to a business environment. The researcher was upset by the client’s request to strip out discussions of agency, systems design theory, identity formation, etc., and stated something along the lines of “I had to learn this sort of thing in grad school, so they should take the time to do the same.” And while I think it would be lovely (and perhaps beneficial) if clients took such an interest in what we as researchers study, I have to say my views on the matter are very different. Making what we learn useful and meaningful to the client isn’t “dumbing it down”; it’s performing the task for which we were hired. We do not receive grants and write peer-reviewed articles when businesses hire us. Indeed, we may not write at all. What we do is produce insights and information that they can use, from their design team to their CEO. If they aren’t asking us to become experts in supply chain models or accounting, then asking them to embrace often daunting concepts in socio-cultural theory is both unrealistic and, frankly, arrogant.

In general, companies hire ethnographers (anthropologists, sociologists, etc.) for a simple reason: to uncover new ways to achieve competitive advantage and make more money. This translates, most often, into research to understand new product opportunities, brand positioning, or salient marketing messages. Unfortunately, our clients often have no idea what to do with the research. But more often than not, the fault lies with ethnographers, not the client, and can be overcome if we apply ourselves just a bit.

Usefulness means being a guide, not a lecturer. So why are we so often disinclined to make what we do useful to business people? Part of it, I believe, stems from an unwillingness to address our own biases openly and honestly. There is a tendency among many of us coming out of what have traditionally been academic disciplines to ridicule or react negatively to people in the business world. To be honest, it’s why we chose, say, an anthropology program over a business program in college. We often, consciously or subconsciously, hold these people in contempt and believe that it is they who should bend, not us, as if we, the providers of secret knowledge, are indeed of a higher order of life than they. We resent the idea that these lesser minds would have the audacity to ask us to curb our genius. And yet, there’s nothing new in making complex ideas useful, simple, or intelligible to people without advanced training in the social sciences. Look at any Anthro 101 course and you realize we’ve been doing this for a very long time already. The fact of the matter is that in order to be relevant, to get the client excited about what we do, and to have them value the thinking behind our work, we have to remember that not everyone wants to be an expert in social science any more than they want to be a physician or a painter – they want us to be the experts and to know what we’re doing, including crafting what we learn into something they can grasp and apply even as they try to balance their own workload. Balancing jargon with meaning is, or should be, the goal.

Another sticking point, I often think, stems from how many of us were trained. Traditionally, the researcher either works alone or as part of a very small team. The findings are analyzed, compiled, and shared with a small group of like-minded individuals. (We would like to believe that the number of people who care about what we write is larger, but the truth is most of us don’t give the work of our colleagues the attention it deserves, or would at least like to believe it deserves.) Our careers are built on proving our intelligence, which means making an intellectual case that addresses every possible theoretical angle in great detail. But in the business context, to whom are we proving our intelligence? And do they care? They hire us precisely because we are the experts, not to prove how smart we are. This isn’t to say that we can or should forgo the rigor good ethnographic research should employ, but it is to say that, whether we like it or not, most of the theoretical models we use should end up in the appendix, not in what the client sees, hears, or reads. Not only does it overcomplicate our findings, it often comes across as either arrogant or needy, neither quality being something the client finds particularly enticing or reassuring.

The fact is that we do ourselves and the discipline a disservice by not learning the language and needs of business people. We complain that untrained people are slowly “taking over” ethnography, but it’s our own doing nine times out of ten. It isn’t enough to have a better grasp of the complexities of the human condition, we have to learn to translate our work and come to terms with the fact that the people hiring us have a very real, practical need for our findings. If it cannot be translated into something that can be grasped in the first two minutes, then in their way of seeing the world, it is money wasted.

Are we there to educate or inform? Our work is frequently deemed too academic. So what does it mean when a client says, “It’s too academic”? It means they didn’t hire you to teach a class on anthropological theory and method. It means they don’t want to sit through a 100-page PowerPoint presentation before getting to the heart of the matter. They are in business and have neither the time nor the interest of a scholar or student.  Again, this doesn’t mean you don’t do the work or fail to set up the points you are trying to make, but it does mean being cognizant of the fact that the audience hired you to improve their business and products, not to teach a course on anthropological methods.  And indeed, some concepts are simply too complex to turn into a couple of bullet points. But that doesn’t mean we cannot try, particularly if we hope to get more work from the client.

The people with the luxury of sitting through a lengthy presentation or who have the time to discuss the intricacies of social theory rarely have a significant amount of authority in the decision-making process, and they rarely hold the purse strings.  This isn’t to say that those two hours of research findings we present aren’t meaningful, but rather that presentations need to be tailored to the needs of the people buying your service (research) and product (recommendations). For the business community, the product is not knowledge, but intelligence.  In other words, the product is knowledge that is actionable and useful. And to be fair, it’s worth noting that the client is the one who pays for our work. If the idea of providing them with the service and product they need is unpalatable, then I would argue that the ethnographer needs to quit complaining and start exploring a different line of work, plain and simple.

The researcher, research team, creative team, client, and everyone invested in the project need to work toward turning information into something they can act upon. When the time comes to sit down with the client and explain what you learned, the ethnographer must be prepared to also explain what to do with it next in a simple, clear way.

Translating culture and opening markets

Success translates well into narrative. Who hasn’t heard those wonderful stories of marketing campaigns gone astray when introduced into a global setting? Remember when Puffs started marketing its tissues in Germany and didn’t do so well because “Puff” is German slang for “brothel”? Or when Bacardi launched a fruit drink named Pavian to evoke French chic, only to find that in German the same word means “baboon”?

We’ve all heard of these mistakes and we all get a chuckle, but the business ramifications of not doing your cultural homework are tremendous. And this goes well beyond something as superficial as a mistranslation. We are prone to imposing our way of seeing the world on others, but what we in the developed world may see as universal may be significantly different in developing countries. Culture shapes how we use, interpret and shop for goods; what US shoppers may see as simply buying chicken for dinner may mean much more in another part of the world. In other words, retailers and manufacturers need to understand what matters and why it matters according to different cultural perceptions.

Returning to our example of purchasing chicken at the grocery store in the US, take concepts of cleanliness and food safety. We are a population that has had easy access to meat for longer than most of us can remember, so our concerns revolve around the promotion of “health” as a means of reducing fat in the diet. Increasingly, we make decisions based on the sanitary conditions of the farms where chickens are raised and the ethical treatment of the animals. We increasingly associate “healthy” with being “green” (another wonderfully loaded and vague word). That has led to a push for reduced packaging as proof of sustainability and healthy living.

Now, take China. In a place where access to meat was – until fairly recently – limited, chicken is associated with status and upward mobility. In the past, the source of the meat itself was often suspect because it might have been purchased in informal markets with uneven standards. Consequently, what we would see as excessive packaging is understood differently – the factory setting implies progress, wealth and modernity, which in turn imply good “health.” Meat is something you want to show off to your friends and family because it is associated with status, which is associated with good health. Add to that the fact that people in much of the world (unlike the US) have traditionally seen the chicken as something other than a pure commodity. Indeed, there are many poems written about chickens (He Crows the Morning by Hsieh Ling-Yun or The Most Noble Fowl by Mohammad Ibn Sina). The result is that if you position chicken in the developing world as you might in the US – as a low-fat, easy-to-prepare alternative source of protein – it won’t correspond to the local worldview and your brand won’t gain traction. You will invest a lot of money and may get very little in return. And China is only one example; extend the same logic to the BRIC nations or the Middle East.

Of course, the idea cuts across all categories. Don’t believe it? Tropicana initially failed when introducing orange juice in South America because it was marketed as a breakfast drink, which in much of South America it is not – our beloved breakfast icon is something for the afternoon, a treat and a snack. Papa John’s, on the other hand, is doing wonderfully in Egypt by maintaining its “American” mystique while incorporating toppings and product names that reflect local tastes.

Understanding what it means to shop on a global, national and local level is central to developing successful new products, sales channels and marketing campaigns. That means going beyond the product or retail environment and asking bigger questions:

Question: How does shopping convey status and wealth?

Answer: Pabst Blue Ribbon is a premium brand in China and signifies wealth because it has been positioned as a classic American Lager rather than a hipster yard beer. In China, it conveys a sense of worldliness, refinement and cultivated taste.

Question: What cultural norms shape how people interact with your brand and your store?

Answer: Victoria’s Secret can’t be promoted in Riyadh or Bangalore the way it is in London. Attitudes outside the West about sexuality, exposure of the human body and gender roles are radically different, shaping everything from marketing content to store displays.

And this could go on and on. So what does it mean for marketing your brand in the developing world (in fact, what does it mean for marketing your brand in Alabama vs. LA)? It means that before you decide to launch or even reposition a brand or product around the world, you need to spend time digging and learning why people live the way they do and how your brand can fit into that complex system of practices and beliefs. It isn’t enough to make sure the language is translated correctly or the color palette makes sense. You have to come to understand the population the way you understand your neighbor. That’s where you find new opportunities and that’s where you find growth, both in terms of brand equity and the bottom line.