Tuesday, December 1, 2009

Extracurriculars: School’s Many Hidden Lessons

I’ve recently started substitute teaching, which is a fascinating vantage point from which to conduct all manner of ethnography. Having previously tutored at an assortment of after-school programs targeted specifically towards assisting students who struggle in school, I have long been aware of how much academic success is connected to the mastery of specific behavioral patterns. For instance, the abilities to sit still and stay quiet for long periods of time, to process information delivered orally and received aurally, and to adhere strictly to deadlines and time schedules are, I would propose, more closely correlated with success in school than is intelligence.

I was recently linked to an article entitled “The Six-Lesson Schoolteacher,” which pretty clearly delineates a number of the implicit cultural lessons that the structure of the American public school system tends to promote. It’s not too long, so I highly recommend you pop over and give it a read, but to summarize, the six lessons he highlights are (1) stay wherever you are told you belong, (2) learn to awaken and dismiss interest in a topic on command, (3) surrender your will to arbitrary authority, (4) ignore your own interests and focus on what you are told to engage with, (5) depend on the assessments of others for your own sense of self-worth, and (6) expect constant surveillance, giving up all claims to privacy.

I have certainly seen each of these lessons in action in schools scattered across the country (and, during the past several days of substituting, have been expected to advance them myself), and would guess they are pretty universal, at least among American public schools. And yet I don’t think most teachers sit down and consciously decide that these lessons are particularly valuable, and then contemplate how best to reinforce them in the classroom. Rather, I suspect that they organize their classrooms in the ways that they do because (1) that is how their classrooms were organized when they were students, (2) their teaching instructors have taught them specific mechanisms of classroom management, and (3) they have found that the type of teaching they are expected to do tends to be easier under the conditions these organizational structures and classroom management mechanisms foster. They most likely don’t stop to wonder what more global behavioral lessons are being taught, and how these might affect the expectations, inclinations, and abilities of their students in other circumstances.

I’m not arguing that these lessons are inherently evil or terrible (as with regard to fashion, that might be an argument I would make if asked, but it isn’t the one I’m making here). I’m merely noting that these are lessons that are being taught, even though the people who are teaching them did not intentionally set out to do so, and might or might not agree that they should be taught if they were asked overtly.

I suspect that the extent to which schools teach all manner of cultural lessons beyond those explicitly listed in the curriculum is markedly underestimated, by teachers and students alike. I will present one example of this. Today, I administered a writing prompt which asked students to take a position on the question of whether or not schools should stop selling soda pop and other unhealthy snacks in vending machines and cafeterias in response to rising obesity levels and other health problems among adolescents. Afterwards, I snuck a peek at the responses. I was impressed by the variety of points which the students (most of whom argued against the change) raised to support their position—it might decrease school concession revenues, students could procure unhealthy foods elsewhere, health-conscious students who wanted an occasional snack should not be penalized for the lack of self-control of unhealthy students, and even (my personal favorite for cultural insight) that it might lead to targeting and bullying of overweight students perceived to be the cause of the change.

Students arguing in favor of the change had little to support their argument beyond repeating the claims made in the prompt about rising levels of adolescent obesity. They pointed out that limiting the availability of snack foods would force students to eat more healthily while in school, though many lamented that this would probably not do much good overall, since unhealthy foods would still be available elsewhere. Interestingly, nobody mentioned the observation I personally considered most compelling—that the structural set-up of schools teaches all manner of implicit lessons, and that the presence or absence of junk food amongst school fare offerings is likely to shape students’ perceptions of typical, appropriate, or desirable meals and snacks.

If students walk into the school cafeteria every day to a salad bar, whole-grain-bread sandwiches, and fruit, as opposed to a smorgasbord of greasy pizza, french fries, and donuts, this will contribute to their image of what a good meal should look like, whether or not there is somebody standing there lecturing about proper dietary habits. (Indeed, as I have mentioned several times already, I personally believe that such implicit messages can be even more influential than ones overt enough to alert us to the fact that somebody is trying to persuade us of something.)

This is especially true at school, which is a pervasive presence in students’ lives (they are there five days a week, nine months a year, for 12+ years) and which on many other levels is presented as a place of preparation for appropriate behavior “out in the real world.” So I propose that even though unhealthy foods would still be available in other places, such as at home or at stores and fast food restaurants, students would be less likely to think of purchasing these things when they were in those places if such foods fell outside of their mental image of a typical thing to consume.

(If you don’t believe this, think about your own shopping habits. Daily, weekly, or monthly, you walk into a grocery store that I am guessing is filled with hundreds if not thousands of products that you have never purchased and would never think of purchasing. The things you do purchase probably fall within a specific and fairly consistent subset shaped by the sorts of things your parents would buy, as well as by various other regular food-providing places in your past.)

I found it interesting that not a single student mentioned the particularly formative role that schools play in providing a model of desirable behavior—even though they are currently personally immersed in precisely that process. I take it as an indication of precisely how unconscious we all may be of the strongest sources of cultural influence in our day-to-day lives.

Wednesday, November 25, 2009

The Lure of the Shortcut

Assuming my claim (that controlling culture just might be the most effective route to world domination) is true, what is it that gives culture this degree of power? Why are we so likely to be led to do things we would not otherwise prefer simply because someone else has told us, overtly or implicitly, that it’s how we should behave? Aren’t we smart enough to notice when we’re being tricked in this way, and to overcome any pernicious influence that cultural beliefs, inclinations, and habits might otherwise have on us?

I have a couple of different explanations for why this is often more difficult than it seems like it should be, but for today’s article I would like to restrict my focus to one of these: the lure of the shortcut. Shortcuts are appealing in most contexts, from cooking to computing to commuting—especially among people who perceive more demands on their time, energy, and attention than they feel capable of satisfying.

And culture nearly always functions as some sort of shortcut. We save time by learning the best way to build a house or solve a calculus problem from other people rather than figuring it out on our own. We don’t have to spend years pondering the most effective way to raise our children if we pretty much echo what our parents did with us. And we can conserve all manner of mental energy by assuming that people will behave predictably on the basis of a few easy-to-identify characteristics such as age, gender, and skin color, rather than attempting to dig down to assess deeper but more situationally salient traits.

It is our capacity for culture that has allowed humans to achieve much of what we have accomplished. I very much doubt any person could have made it to the moon without taking on faith a whole host of previous discoveries and basic beliefs. There’s not much one can do if one has to start totally from scratch at birth. If we didn’t follow certain expected social patterns, we would have to renegotiate each new relationship, and cordial exchanges with strangers (for instance, a visit to a restaurant) would be unthinkable.

There are thus many convincing arguments for the benefits of taking cultural shortcuts, and I am not saying we should never resort to them. My main point here, and a process I hope these articles will facilitate both for myself and for any interested readers, is that it behooves us to consciously recognize the fact that we are taking shortcuts, and to acknowledge that those shortcuts are liable, at least on occasion, to fall, well, short.

If we mistakenly believe that our culturally-shaped perceptions fully reflect the underlying reality of the world, we will fail to second-guess them in those moments when the time or energy saved by the shortcut is not worth the cost of operating under false pretenses. Let me use an example which is perhaps more of a metaphor than anything else, but which will, I think, at least begin to make my point. In day-to-day life it is harmless to operate under the mistaken but visually and culturally reinforced perception that the sun goes around the earth once each day. But when calculating certain weather patterns, it is necessary to recognize the fallacy of that assumption in order to predict and account for the effects of the earth’s rotation.

In the same way, many other cultural shortcuts suffice, under most circumstances, to lead us through life without much derailment, but if we don’t acknowledge that they are, in fact, mere shortcuts, then we may fail to discount them under the few circumstances in which they are apt to lead us far astray. And that’s not to mention the cultural shortcuts that frequently or almost always lead us astray, but have become so deeply entrenched in our society that we continue to follow them at our own unwitting expense.

Want examples? Stay tuned.

Monday, November 23, 2009

Environmentalism and World Domination

I wrote in my last article about one way that culture can have (what I believe are) negative effects on those under its influence, particularly when it is perceived as reflecting the underlying nature of reality rather than being recognized as something both imposed and imitated. But culture can also have (what I at least believe to be) positive effects, and so today I’d like to briefly discuss one of those.

Consider the phenomenon of environmentalism. This is a movement that seems to have gained enormous momentum over the course of the past several decades. (I say “seems to have” rather than “has,” because (1) I haven’t done the research to assess this and (2) informal perception about cultural changes over time is something I am generally inclined to distrust, mainly because a significant proportion of cultural belief systems include a whole segment describing “how things used to be and how the way things are now is different”—so that perceptions of the cultural system of the past are themselves shaped by the cultural system of the present, rather than being pure, unadulterated, comparative memories.)

In any event, I would argue that whatever rise in environmentalist behaviors (recycling, reduced consumption, green building practices, etc.) may have occurred over the past half-century has done so as a direct result of developments in the cultural belief systems of the individuals engaging in these behaviors. Specifically, beliefs such as “the health of the planet, the happiness of future generations, and the survival of the human race depend on exercising greater care in our use of environmental resources,” or “if I don’t toss this can in the recycling bin, everyone who sees me is going to think badly of me,” are the most effective mechanisms to motivate environmentalist activities.

Where do these beliefs come from? Some have been instilled through overt cultural campaigns carefully constructed by people already convinced of their importance. For instance, as a child I remember attending mandatory school-wide assemblies, complete with catchy songs and animated cartoons, encouraging me to “Recycle, Reduce, Re-use, and Close the Loop!” and recommending five fun crafts I could try at home that would make use of my six-pack soda-rings so they didn’t end up around the neck of some poor seagull or cute little fishie.

But there are also subtler mechanisms with similar effects. The proliferation of recycling bins next to trash cans in places like the Seattle airport not only makes recycling easier—it also sends an implicit message that recycling is something everyone should be sure to do, wherever they find themselves. And I don’t know about Macs, but on my PC where there once upon a time was a trash can to which I could drag unwanted files, there is now a “Recycle Bin.” Does dragging my files to a Recycle Bin rather than a Trash Can ensure that the bits and pixels of which they were composed will be processed for re-use rather than sent to languish in a landfill? No, but it does get me in the habit of looking first for a recycle bin when I want to get rid of something cluttering up my desktop, which is likely to influence my behavior in the three-dimensional realm, as well.

Because this second category often has just as much impact on our beliefs and behaviors as the first (indeed, sometimes even more of an impact—since, as I pointed out with respect to the innocent questions we ask our children, the subtlest messages are often the hardest ones to resist), and because many of us live in societies which are responsive, both politically and commercially, to popular expectations, a feedback cycle develops wherein beliefs in the importance of environmentalism foster more beliefs in the importance of environmentalism.

That is, because politicians are out to please voters (in order to get re-elected), and because producers are out to please consumers (in order to sell more of their products), it is in their interests to respond to any preferences that a significant portion of the general population holds. If enough people consider it important to prevent dolphins from being killed when tuna fish is caught for consumption, then companies will respond by making their tuna fish “dolphin-safe” and then advertising it as such, and governments may even make laws prohibiting dolphin-killing by tuna-catchers. But also, the advertisements for dolphin-safe tuna are likely to awaken people who otherwise wouldn’t have thought to care about the impact of their tuna consumption on dolphin populations to the notion that maybe this is something about which they should be concerned, thereby increasing the number of people who make dolphin safety a priority in their tuna-purchasing decisions.

The proliferation of cultural beliefs and expectations can thus significantly impact the state of reality—and sometimes this can be for the better. Indeed, I would argue that if one wants to improve the world, it is far more effective to work to change prevailing cultural value systems in such a way that people are convinced of the importance of a given change (trusting that the people best positioned to make that change will then find it in their best interests to do so) than to attempt to enact restrictions or regulations enforcing a change which most people consider unimportant or even undesirable.

In other words, if you really want to take over the world, you may not need infinite wealth or superior military force. A convincing cultural construction might just do the trick.

Saturday, November 21, 2009

The (Frightening?) Function of Fashion

While I’m on the subject of color, allow me to turn for a moment to the topic of fashion or style—a topic I personally find to be simultaneously fascinating and vaguely disturbing, not to mention yet another excellent illustration of some of the points I’ve been trying to make about the nature and influence of culture on the human experience.

Color plays an often-central role in distinguishing different styles or fashions in modern American society (I have no doubt it does so in other societies as well, but I’m going to limit myself to speaking of areas with which I have some personal familiarity). Certain colors are associated with specific seasons, years, or decades, and clothing, décor, and items whose coloration fails to align with the current chromatic expectations are immediately deemed “out of style.”

Of course, color is not the only element that identifies something as being in or out of style. Size, shape, and outline are some others, as will become apparent if you compare the cut of clothing, the contour of cars, and the construction of couches that are produced from year to year. I find it fascinating that, just by looking at an article of clothing, it is possible to identify the decade, and sometimes even the year and month, in which it was created to be worn. Or to look at toys currently being sold and to see how distinctly different they look from toys with the exact same functions (a dollhouse, a swingset, a baby bouncer) that were being sold back when I was a child.

The thing I find vaguely disturbing about this phenomenon is that it is one of the most blatant ways in which I see culture being mobilized to trick people into doing things they otherwise wouldn’t be inclined to do, in ways that just happen to benefit the people who are perpetuating the trickery. When we become convinced that “style” is a necessary concern when it comes to the various products we employ (rather than focusing purely on functionality), then we can be enticed to purchase a new sweater/backpack/refrigerator even though the one we currently possess is still perfectly capable of performing its designated task. And we can be enticed to do so as often as the people who make these products (and thus benefit from increased demand for their purchase) choose to alter their color or shape and thereby proclaim previous shapes and colors to be hopelessly démodé.

I distinctly remember when I first took note of this phenomenon. Back when I was in high school, stick-straight hair was considered far superior to any other style, and I watched many a girl not naturally endowed with such tresses spend weeks saving up to buy a ceramic straightener, beg her parents for de-frizzing gels and conditioners, and even undergo the painful and expensive process of chemical straightening. A few years later soft, wavy curls became the rage, and not only did the once-fortunate straight-hair girls have to go out to buy diffusers, mousse, and curlers to achieve the coveted look—so did the once-naturally-curly-haired ones who had undergone permanent straightening. Even then I marveled at the brilliance of this marketing ploy.

Now, I am not here making the argument that fashion and style are inherently worthless and evil things with which we shouldn’t be at all concerned (not that I wouldn’t make that argument if asked; simply that this is not the place for me to do so). But what I am concerned about is the way that the assumption that style is a valuable consideration when it comes to the products we purchase and employ has come, through cultural imposition and imitation, to be taken for granted and left utterly unquestioned.

This is one of the (many) reasons I think a more attuned awareness of the anthropology of the everyday is worth pursuing. It helps us to see the way that culture can be used to control us, and allows us to decide for ourselves, more thoughtfully and consciously, what factors and considerations are really important in guiding the way we live our lives.

Thursday, November 19, 2009

“What’s Your Favorite Color?”

…and other leading questions we ask our children.

After having spent my past several articles laying out some basic conceptual groundwork on the topics of anthropology and culture, I would now like to plunge into the realm I plan to spend most of the rest of my articles in this blog exploring—namely, the way that the influence of culture crops up in the most unexpected aspects of our everyday lives, and how the concept of culture can help us to gain a better understanding of what exactly is going on underneath these seemingly straightforward situations.

The first example I’m going to take a look at is a question most of my readers have probably heard asked at some point in their lives—in fact, they have almost certainly answered it at least once, and may even have asked it themselves. It is the (seemingly) age-old question, “What is your favorite color?” This apparently harmless question provides me with the opportunity to revisit both of my previously discussed definitions of culture (as imitation and as imposition), hopefully clarifying anything that was left ambiguous by my more theoretical treatments.

First of all, the very idea of “color” illustrates what I mean by cultural imposition on the world around us. Yes, color names refer to an undeniable physical property of the world around us—the wavelength of light that an object reflects to our eyes. However, the way we break up what is, in the physical world, a continuous spectrum into distinct color categories (red, orange, yellow, etc.) is just as undeniably something we have invented and imposed. Different societies have broken up the color spectrum in different ways,* and I’m sure enough of us have gotten into disputes about whether a certain shirt is actually a bluish green or a greenish blue to recognize that even different individuals within the same society might have different places where they draw those distinctions.

But the “what’s your favorite color?” question also demonstrates my points about cultural imitation. In addition to the fact that color distinction is one of the things we spend a great deal of conscious effort teaching our children from a very young age** (“What color is your shoe, Ctaci? That’s right, it’s brown.” “Abel, can you bring me the pink elephant?”), the very form of this seemingly simple inquiry implicitly reinforces a whole host of cultural lessons you probably never expected.

First of all, asking someone (and I would wager most people who are asked this question are first asked it at a relatively young age) what their favorite color is takes for granted the notion that that person has a favorite color. It thus implies that a favorite color is something every self-respecting person should have, and encourages the individual, if s/he does not yet happen to have a “favorite color” picked out, to select one on the double.

This, in turn, reinforces a variety of even deeper notions. Some of these are connected to favoritism, competition, and superiority. The very form of the question sets up the idea that in any field of multiple options, it is possible (and advisable) to judge one to be universally “best” (rather than simply acknowledging that different options might be preferable for different purposes or under different circumstances). Other implications relate to the concept of identity—posing such a query serves to instill the belief that people can and should be individualized by certain types of traits, that one has preferences which both persist over time and connect intimately to one’s sense of self, and that others can and should keep track of these traits in order to better understand or distinguish between their fellows.

These might seem like subtle or stretched messages for a question frequently posed to five-year-olds and requiring nothing more than a one-word response, but that is precisely my point. The external simplicity of this inquiry masks the vast array of cultural lessons it conveys to its often highly impressionable recipients. Overt discussions of cultural values (“You see, Daphne, this is why we think it’s important to share”) are actually much easier to question and resist than are these sneaky verbal Trojan Horses, especially because the person presenting them may not even realize the underlying messages they have thereby introduced. This goes to demonstrate just how widespread and invisible the little elements of culture we have spent our lives thus far learning to imitate can be.

And don’t even get me started on the whole color/gender connection.


* Interestingly, some research has indicated that there are universal cross-cultural similarities in the places where people break up the color spectrum (though distinct differences in the extent to which they do so). This makes some sense, as there are physiological similarities across cultures in terms of the ways that humans perceive color. Even if color distinctions are entirely physiologically based, they are still impositions, since these distinctions derive from the structure of eyes and brains rather than the outside world. The really interesting question is whether having a physiological basis disqualifies something as "cultural." I'm sure some would support this criterion, but to do so demands that one provide non-physiological explanations for any tendencies one wants to deem cultural, which I am not convinced is possible. Though I certainly welcome attempts!

** This raises the question of why color identification and other such often-emphasized subjects of baby books and enthusiastic parental coaching (shape recognition, size and orientation terms, barnyard animal noise pairing) are of sufficient importance to merit the energy we devote to imparting them to our children—a topic for a future article, I suppose.


Monday, November 16, 2009

Culture as Imposition

In my previous article, I proposed one possible way of defining culture—as that which we learn from others rather than figure out on our own (which I called “culture as imitation”). In this article, I will propose another possible definition—namely, as that which is enacted upon the outside world rather than being inherently present within it (I will call this “culture as imposition”). These two definitions overlap in a number of ways, and I will explore the interactions between them, and their respective benefits as ways of thinking about culture, in future articles. But first I would like to delve a little more deeply into this idea of “imposition”—to explain what exactly I mean when I use the term, and to give some examples of how it operates in our everyday lives.

I have stolen (or imitated) the term “imposition” from Ralph Holloway, who (as I mentioned in this blog’s first article) defines culture as “the imposition of arbitrary form on the environment.” I find this to be a particularly insightful turn of phrase, because it covers a number of different phenomena. Most obviously, it refers to the tangible, material activities of building, shaping, marking, and the like, whereby we transform our surroundings into easier and more comfortable places to live. From the first time someone picked up a rock and hit it with another rock in order to better shape it for some purpose, humanity has been imposing arbitrary form upon the environment in this way.

But the imposition of arbitrary form on the environment also happens at a much more abstract, conceptual level—and I, personally, find this level to be the far more interesting one. Because this process of imposition occurs, not just in our physical interactions with the world, but in our psychological interactions with it as well.

What do I mean by that? I mean that whenever we think about the world, we impose all sorts of psychological forms and concepts on it that don’t actually exist out there in the world itself. For example, if you stop reading for a moment and take a look around you, you will probably see a number of different objects—chairs, tables, windows, people, clouds, trees, etc.

But all of those “things,” which you perceive as distinct, bounded entities, are actually just amorphous sections of one giant, interconnected blob that is the universe. They aren’t surrounded by neat outlines and they don’t bounce around in bubbles (as would be especially clear if you took increasingly powerful microscopes and tried to find the exact place where any one supposed “object” stopped and another one started).

Even more arbitrary than the idea of distinct, bounded objects is the idea of categories into which we place them. So when you look at a chair, you not only think that it is a thing which is both unified into a whole and distinct from all the other “things” around it, but you also believe it bears some inherent property of chair-ness, which it shares with everything else you would designate as a chair, but not with anything you would designate as a table, a turtle, or a tree. But I would argue that the category of “chair-ness” resides, not in the chair itself, but in your brain—from whence you impose it upon whatever environment you happen to encounter.

You can get some idea of what Holloway and I mean when we say that these imposed forms are “arbitrary” when you think about all the items you would call a “chair” if you happened upon them, and the wide variety of shapes, sizes, materials, and uses these items might have. You can get an even better idea if you imagine trying to teach an alien from outer space how to identify something as a chair if she happens upon it. You can also imagine that alien just as avidly attempting to convince you that a certain lamp, your pet gerbil, and your pinky fingernail are all “quigleys.”

Another way to think about it is to look up at the stars and to realize how many different ways people have clustered those stars into constellations over the years. Is Orion a hunter? No, it’s a scattering of light points with a variety of brightnesses and a specific distribution. Maybe to you it looks more like an hourglass. In fact, “it” is not even an “it” at all, because you could just as easily imagine connecting some of the stars in Orion to some of the stars in Canis Major, and some others to some of the stars in Taurus, to create two different constellations. What I’m trying to say is that calling something “a chair” is like naming a constellation in the night sky. You’re arbitrarily imposing unity on a collection of particles that could just as easily be conceptually grouped in any number of alternative ways.

This arbitrariness is even more apparent when it comes, not to tangible objects, but to more theoretical categories—for instance, when we identify a certain occurrence as a “tragedy,” or a certain person as a “troublemaker,” or a certain group of words as a “joke.” These designations do not identify some sort of innate property of the circumstance, individual, or statement. Rather, they dwell in our minds, and are part of the form that we are arbitrarily imposing upon that segment of our environment.

Do you see now why I think conceptual imposition is so much more interesting than mere tangible imposition? It’s really hard to fully wrap your brain around, because, well, these concepts are inside our brains to begin with. Which means they typically seem to be totally natural and self-evident. When you call something a chair you never stop to think that that title is actually something you are imposing upon that bunch of particles. You think you are describing the state of reality. But if you take a step back and think harder, you might realize that somebody else (aliens are always handy for this type of thought experiment) might look at the same thing and perceive it in a completely different way.

I have to warn you, before you finish reading this article and go back to your everyday life, that this way of thinking, once you let it into your brain, has a tendency to crawl around in there and explode at the most unexpected moments. You might be doing a perfectly ordinary thing—taking a walk, going shopping, mowing the lawn—when all of a sudden you’ll realize, “Hey, wait a second, what am I actually doing here, and why am I calling it what I’m calling it, and just how arbitrary are all the structures I’m using to understand it all, and how might somebody else with a different set of conceptual forms bouncing around in their head see all this in a completely different way?”

And that’s when you’ll know—your inner anthropologist has been awakened.

Tuesday, November 10, 2009

Culture as Imitation

Think back for a moment on all the things you have done in the course of your lifetime. How many of them did you come up with on your own, and how many of them did you learn to do through imitating others? Initially, I would guess that you're inclined to put the majority of your past activities in the former category of the self-invented, with only a few simple behaviors defined as imitations. But I'd like to encourage you to rethink that assumption, and to recognize the overwhelming role that imitation actually plays in shaping the way you—and all other humans—behave.

First of all, consider language. Admittedly it's true that as we emerge from infancy our specific utterances cease to be mere copycat repetitions of whatever we're hearing (say "ma-ma") and become (for the most part) novel constructions uniquely appropriate to our immediate circumstances (whether those involve writing a blog post, proposing marriage, or begging our mom for another cookie). But the words from which we compose these constructions (in addition to the grammatical rules which structure their composition, and indeed the very notion of using speech—rather than gesture, drawing, or other means—to communicate) were learned through imitation of others, who in turn learned these things through imitating others who came before them. In addition, our speech is littered with pre-fabricated phrases we have collected through observation and repetition, which range from common figures of speech ("it's just a hop, skip, and a jump away") to the formulaic terms of transition with which even this paragraph itself teems (like "first of all" and "in addition," neither of which, I hate to break it to you (and there's another!) I came up with on the spot).

But the list of imitated activities extends far beyond our use of language. Take a moment to ask yourself why you wear the clothes you do and not garments in any number of alternate forms of equal (or even greater) comfort and functionality—or, for that matter, why you wear clothes at all. Did you wake up one morning, naked and slightly chilly, and decide, "Hm, I think I'll go weave myself some fabric and sew it into a sweatshirt"? My guess is that the answer is no. Rather, people began dressing you before you had any say in the matter, and then as you grew up you noted that people around you generally wore clothes of a particular sort, and you chose particular models of clothing-wearing to imitate.

Using these two examples, reconsider your previous assessment of the proportion of self-invented to imitated behaviors within your own personal repertoire. In fact, try to think of even one thing you have done which was entirely initiated by you, and in no way based on anything you have observed or heard of someone else doing. If you can come up with something, please leave it as a comment—it will be my anthropological challenge to find an element of imitation hidden within whatever you propose.

Note that when I talk about imitation, I am including not only things you learned to do by watching others unselfconsciously perform these activities in your presence, but also things that you were actively taught—teaching being, in my opinion, merely a mechanism humans have developed in order to facilitate improved imitation. Any scenario in which the idea to do something comes, not from your own independent and spontaneous invention, but from some recreation or recombination of behavioral elements you have become acquainted with through the activities of others, falls under "the imitated" in my current definition.

I am not arguing that humans are not creative, nor that there is not something impressive, exciting, and meaningful in the ways that humans recombine what they have observed in others into constructions novel in content if not in concept. All I am pointing out is that the majority of what we do, as humans, comes from copying (and then building upon) what others before us have already done. This is not a weakness or failure of humanity. Rather, I consider it one of our species' greatest strengths. (And in saying that I am not denying that other species possess such imitative abilities, though some would like to call that claim into question. I am merely noting that human behavior happens to be strongly characterized by the impulse to imitate.)

This ubiquitous inclination to imitate serves as the first of the two definitions of "culture" which I will employ in my discussions of the anthropology of the everyday. Anything humans do because some other human showed them how is a "cultural" activity. Given your newfound awareness of just how many of the things we do involve some element of imitation, you may at this point be objecting, "But that means that almost everything anybody does is cultural!" That is precisely my point, and the reason that I find the anthropology of the everyday to be such a fascinating subject of study.

For those wondering about the purpose of creating a definition broad enough to encompass practically every aspect of human behavior, my justification involves two claims, which I will merely state at the moment, but hope to more concretely illustrate as I make use of the definition in subsequent articles. Claim 1: Even if this definition applies to almost everything we do, it highlights an aspect of those activities which is often hidden. And claim 2: The fact that much of what we do is learned or imitated from others has important implications and ramifications, which it will benefit us to recognize more overtly.

Wednesday, October 28, 2009

Who Is This Everyday Anthropologist?

(And what, exactly, IS anthropology, anyway?)

Hello, my name is Elizabeth Edwards, and I will be your anthropologist for the day. Perhaps even for the everyday (that's my personal goal, at least). But before I can convince you to give me a perch somewhere inside your brain, from whence I hope to whisper anthropological nothings into your ear as you wander through your daily life, I should probably tell you a little bit about myself.

I have recently completed my master's degree, with an emphasis in anthropology, through the University of Chicago's intensive one-year Master of Arts Program in the Social Sciences (MAPSS). This was the first time the official title of "Anthropologist" was ever bestowed upon me, and in fact it was only a little under three years ago that I first considered seeking such a title for myself. But, unbeknownst even to me, I was an anthropologist long before I had any kind of clear idea what the term meant.

In fact, let's pause for a moment to define the term "anthropology," because I've noticed of late that its meaning is not exactly straightforward or widely understood.

Anthropology Defined

First of all, an anthropologist is NOT the same thing as an archaeologist. (I've realized that the two words are very closely linked in many people's minds, a discovery achieved through extensive ethnographic observation—by which, in this case, I mean telling people that I'm studying anthropology, and noticing that the most frequent response is "Oh, so do you go on a lot of digs?")

"Anthropology" comes from the Greek word "anthropos," meaning "human," and thus an anthropologist is someone who studies humans or humanity. An archaeologist is a specific kind of anthropologist—one who studies ancient societies. ("Archaeology" comes from the Greek word "arkhaios," meaning "ancient.") This often involves excavation, but extends beyond that to assorted other techniques and processes as well.

Anthropology, in turn, extends far beyond the archaeological. Other subfields of the discipline include physical anthropology (the study of the human body, ranging from evolutionary studies of humanity's ancestors through modern medical study), linguistic anthropology (the study of language, including its components and development), and cultural anthropology. Cultural anthropology is my personal area of specialization, and involves (as the name suggests) the study of culture, though the question of what precisely "culture" entails turns out to be particularly tricky to pin down.

Culture (Provisionally) Defined

The word "culture" comes from the same root as the word "cultivate," and was initially used by self-designated "civilized" people (who believed they had raised themselves from a "state of nature") to distinguish themselves from the so-called "savages" whom they perceived to be totally devoid of any semblance of self-cultivation. Eventually people (and to their credit, anthropologists were some of the main champions of this revelation) realized that that was a bit of a biased perspective, and started using the term "culture" to refer to the habits and worldviews that seemed to characterize and distinguish ALL people groups.

"Culture" has since been subject to an enormous proliferation of meanings, which are often inconsistent if not wholly contradictory. Some use the term to refer to specific genres of activity such as formalized art, music, and literature, while others understand it to incorporate the whole range of human activity. Some claim that culture consists solely of immaterial components such as symbols, beliefs, and values; others believe that material artifacts are also manifestations of culture. Some speak of Culture with a capital C as something universal to all humanity, while others speak of differentiated small-c cultures, each with its own language, traditions, and other distinguishing characteristics.

While the debate over which is the "true" or "proper" meaning of culture is not one I will make any effort to settle at the moment, there are two definitions of culture I find particularly interesting, which I will be using to frame the various topics I explore in future articles. The first identifies that which is learned or acquired through imitation of others (rather than being independently devised or discovered), and the second involves what Ralph Holloway has called "the imposition of arbitrary form on the environment." (Of course, Holloway constructs this definition in order to assert that humans are the only animals exhibiting culture, a claim by which I am not personally completely convinced, but that is beside the point—for the moment, at least.)

I will delve more deeply into these two definitions, and their implications, in a subsequent article, as I would like for the moment to turn from this tangent back to the subject at hand—namely, my own introduction. But I hope this has provided some helpful background regarding the admittedly slippery term "anthropology," and the even more slippery concept of "culture" (at least enough to tide us over for the moment).

Back to the Anthropologist

As I said, I became an anthropologist long before I had any inkling about the definition of the term. What do I mean by this? For as long as I can remember—and for the archaeologists among my readership, the written record (as preserved in my diaries and other assorted compositions) confirms this claim beginning in January of 1992—I have been fascinated with the patterns of human behavior, the meanings that people attribute to those behaviors, and the ways that beliefs and understandings impact how people interact with one another. I have a habit of carefully observing the world around me, taking note of interesting social phenomena, and formulating theories to explain both the regularities and irregularities I come across.

It was not until after I graduated from college (without taking a single course in anthropology—in fact, it was almost as though fate had intentionally intervened to prevent me from doing so, since the only courses that scheduling reasons ever forced me to drop during my undergraduate career were the two Comparative Sociology classes in which I had enrolled) that I learned that the study of humanity was not merely a fascinating hobby, but a legitimately recognized academic discipline. When I began looking into the matter further and encountered a description of ethnography (the primary research method of the cultural anthropologist, which involves immersing oneself in a particular field site through engaged participant-observation, recording detailed written fieldnotes of one's experiences, and subjecting these notes to subsequent formal analysis), I flipped back through the fifteen volumes of journal entries I had been keeping since second grade and realized I had undergone a full apprenticeship in the process without even realizing it.

Since I was, at the time of this epiphanic revelation, in the process of deciding what sort of graduate program to enroll in, I decided to formalize my obvious anthropological inclinations through academic training. This led me to the University of Chicago's one-year master's program, which provided me with a thorough and rigorous grounding in social science theory in general, and academic anthropology in particular.

Although I relished the challenges the program presented and appreciated the opportunity to see the mode of inquiry I had long been pursuing on my own terms applied in a more official, widely recognized format, upon completion of the program I ultimately realized that my true passion did not lie in conducting theoretical studies and translating them into "academese" for the perusal of a few other experts in some obscure field. Rather, I wanted to continue to examine the anthropology of the everyday—using the methodological tools and theoretical frameworks I had learned through my academic training to gain insights which I could then express in a manner accessible to the general public. It is for this reason, and to this purpose, that I have started this blog.

I believe firmly (for reasons to be discussed at greater length in subsequent articles) that an anthropological perspective can be of both interest and benefit to all people, whether or not they themselves are as anthropologically inclined as I am. Anthropology encourages an open-mindedness increasingly necessary to navigate the modern world of ever-expanding horizons. It probes the underpinnings of the basic patterns and frameworks that shape our lives, revealing the ways that our seemingly freely chosen actions are often shaped and constrained by our own cultural context. Awareness of this process can, I believe, pave the way for increased independence from these controlling influences, allowing us greater control not only over our personal behaviors and individual destinies, but over the destiny of humanity as a whole.

I hope you will read on, and give me the opportunity to convince you that becoming an everyday anthropologist can be as rewarding, and ultimately indispensable, as I have found it to be.


*Note: I have intentionally attempted to keep the definitional portions of this article simple and non-technical. If you are interested in more specific or in-depth background on the topics of anthropology and culture, I am happy to point you in the right direction. Feel free to post questions or areas of interest in the comments.