Monday, May 31, 2004

Beads, beads, everywhere and not a thought to think ...

Beads. Thousands upon thousands of beads, everywhere you look. Where this traditional symbol of exuberance came from or how it was introduced to New Orleans is often debated. There are both cultural explanations and religious explanations.

Why someone would bare parts of their anatomy for cheap plastic beads is beyond me. I mean, in most stores in the area, $3.95 buys you 10 strings of such beads, and no clothing removal is required.

I’ve been informed that buying one’s beads is not as “authentic” as “earning” them. Women are encouraged to flash their chests, and men are encouraged to … well … display their nether regions. What a bizarre city …

Speaking of which, I don’t see how people get anything done in this city. The only “native” Louisianans I meet are selling tourist items and capitalizing on the spending potential of visitors. I do understand that technically, this IS “work” for them, but I just don’t see how people keep a clear head about their industry or their educational careers with a 24/7 party atmosphere brewing throughout the city.

Of course, Austin does have 6th Street. And flashers of our own during Mardi Gras (not to mention some funny ideas about the legality of public exposure).

Saturday, May 29, 2004

A Cajun Perspective ...

So, I’m writing this in gator country. No, not Florida, the OTHER gator country, home of Creole cooking and the Cajun culture: New Orleans, Louisiana.

As one of my research papers was submitted to this summer’s International Communication Association conference, I found myself headed for the Bayou.

As I prepared my travel plans, I weighed my options. It was a 45-minute flight from Austin to Dallas, and about an hour’s flight from Dallas to New Orleans. Or, I could embark on an 8-hour drive through southeast Texas and southern Louisiana. At first the cost of each route seemed rather comparable. Then, my girlfriend managed to free up her schedule enough to come with me, which tipped the financial equation in favor of the road trip.

So we embarked on a long road trip.


I do believe that driving to a location, particularly over long distances, gives you a better sense of a place. As the terrain slowly shifts into an alien climate (the air grows moister, and the division between solid ground and freestanding water seems less distinct than I had thought possible), I find I gain a deeper appreciation for the cultural differences of those who live there.

When we first entered Louisiana, we stopped at a nearby park. We walked around and found a small lake, with a rather interesting warning sign that I had never seen in my home state.



But what we saw in the tourist locations was nothing compared to what we experienced on the road. Apparently, Louisianans do not require as much cushion in their road design as Texans do. Not only were the driving lanes noticeably narrower in places, but the lack of a shoulder along much of the highway was downright disturbing. I was amazed at how much … nothing … there is between major cities. Texas is littered with small towns and convenience stores. During several stretches, we drove for dozens of miles without seeing any sign of civilization off the roadway.

The stretch between Baton Rouge and New Orleans was the most disturbing, for we drove 50 miles without seeing so much as a convenience store. I was shocked. Surely they should post warnings for us out-of-staters not used to budgeting our gas usage over such long distances.

I believe that it’s because of this distance between major cultural centers that Louisianans drive the way they do.

Monday, May 24, 2004

Goodbye History ...

I just returned from having seen a charming little German film titled Good Bye, Lenin! The story is drawn from the lives of an East Berlin family struggling to cope with the changing world as their way of life is challenged. The father, having reportedly left the family for the West years before, is absent, and the mother fills the void with love of her country and its way of life.

The premise of the film centers on the frail mother, who falls into a coma mere weeks before the fall of the Berlin Wall. Eight months later, she regains consciousness, and her children are told not to excite her, lest she suffer another episode.

Bound by their love of their mother, the son and daughter seek to shield her from the changes in her culture. In their apartment, they recreate the conditions of the world she remembers, right down to the labels on the food they serve her. As the mother comes into contact with the inevitable disparities between her new world and the one she remembers, the son compounds the deception, eventually creating false newscasts to explain the phenomena she witnesses in a manner more consistent with her core assumptions of life.

The film is touching, tender, funny and dramatic. However, two elements really drew me in: the historical construction and the plot device of deception. The first will be the focus of this post, the second will be for a later time.

The historical construction was the way in which the son, Alex, through his efforts to explain the increasingly Westernized elements of German society his mother observes, recreates East Germany as a country he could have faith in. As he rewrites history to incorporate current events, he softens the harshness of the party rhetoric, reforming the socialist ideal into something closer to compassion for the masses and acceptance of the “enemy” capitalists. The film makes ample use of actual news footage in his narrative, footage that stands in sharp contrast to Alex’s version.

This contrast is a striking reminder of how much of our social conscience is constructed through the lenses we choose to observe reality and recall history. Alex had long since given up his socialist devotion (the film makes it clear from the beginning that the adult Alex was already disenchanted with it). But as Alex fabricates news reports and artifacts for the illusion he’s providing his mother, he actually appears to be inventing a system of socialism that he can feel proud of. It’s almost as if in trying to console his mother, he connects to her by reinterpreting her world into something he can interface with, building common ground.

How much of our own social history is constructed in this manner? We champion our own system of free market democracy as the “city on the hill” for other nations. We raise up the virtues of our freedom and individuality (and there are indisputably many virtues), while ignoring some of the more sordid historical results it has yielded. We choose which portions of our history we celebrate, and which portions we condemn to academic obscurity.

Americans use history to construct our national mythology. Like Homer and Virgil before us, we compose idealized stories of virtue and create narratives that resound with the language of legendary epics. And because of this mythology-building exercise, we often fail to see our own cultural reality for the flawed, imperfect collection of group effort that it is. That’s why we feel so betrayed when our leaders make simple human mistakes or we see representatives of our culture behaving in ways that run counter to our values.

Nowhere is this phenomenon so pronounced as when it comes to our national leaders. We look back on our founding fathers and through our myth building, elevate them to superhuman stature. Our high school students may not remember what wars Washington fought in or what political initiatives he took, but they remember that he cut down a (fictional) cherry tree and refused to lie about it.

We remember the elegant words that our predecessors crafted without remembering the pain and suffering their efforts exacted from other people. We remember that Thomas Jefferson advocated “Equal and exact justice to all men, of whatever state or persuasion, religious or political …” while conveniently forgetting that he was ambivalent at best about the degree to which that freedom extended to those in a state of slavery. We forget that the founding fathers quarreled, that at times they misrepresented each other’s interests to foreign leaders and that on occasion they may even have tried to kill one another.

The founding fathers we remember were well educated, civil and wise. Against this tapestry of myth we watch contemporary politics play out, trying desperately to spin events into frameworks that reinforce our desires for justice and virtue.

We are all Alex, trying to reconstruct a new view of history that makes us more proud of where we come from. We invent and reinvent history to suit our needs and like Alex, do so in the name of providing a safe environment (or better way of life) for others.

Thursday, May 13, 2004

Irony vs. Coincidence

I would first like to take this moment to exegete the terms "ironic" and "coincidence" as they apply to my discussions, simply because people often use the one I wouldn't have, and I'm often making use of the one that drives others crazy. I'm not trying to convince anyone to adopt my definitions, only to understand them, so they know what I mean when I use the "wrong word" (according to their vocabulary) in future conversations.

The real problem is that I don't believe in "coincidences." I think when we say "what a coincidence" we are recognizing a pattern that we haven't identified yet, and are dismissing it as a happenstance gathering of variables that slightly impede our sense of acceptable probability. And we're removing any sense of purpose from the events.

"Irony" in its original form is the will of the fates or gods played out through the lives of mortals. And most often seen through an incongruity between intention and purpose. Now it's true that when you start placing an author (say Shakespeare) in the place of the fates and gods and slip his or her characters (say Hamlet) in to the roles of mortals, the definition tightens up, because the author makes sure the audience is present for all the necessary components that make irony (and tragedy, but that's another discussion) detectable.

Life is not so neat. I believe that God has a wonderfully sophisticated sense of irony in my life and in the lives of those around me. Often when I experience what some people call "coincidences" (clusters of related events whose proximity stretches my expectations of probability), I slow down and look for patterns. And sometimes I find them.

Example. I never watch Oprah. Can't stand her, her guests or her content. But recently, I became involved in the third discrete discussion I've had about a single particular show. Which I didn't even see. Also, the previous Monday, in the midst of migraine-induced intensity, I became involved in the longest and deepest discussion about psychological connections between couples and the social implications of getting married that I've had in several years. Finally, my former roommate IMed me the previous day to talk about rooming together, which forced me to articulate precisely why I think that's a bad idea, forcing me to consider my own boundaries in terms of space and relationship.

Coincidence? Perhaps. But I prefer to think of clusters like this as "ironic," as though God were giving me the materials I was going to need to be able to really discuss this in a pinch.

I think this when a professor thrusts a book in my hands and tells me I need to read it and two weeks later some controversy in church is caused by a breakdown in communication that was illustrated in the book. I think that's not coincidence.

So when I say "irony" in this context, I probably mean it more as "providence." But "providence" has virtually dropped out of our vocabulary.

I NEVER mean it in the way Alanis Morissette uses it, because she never makes connections that mean anything. And the meaning is what makes something ironic. I think she's really after "tragedy," but even that's a stretch in her music.

Wednesday, May 12, 2004

Did ABC go too far? (Abu Ghraib)

I recently received a forwarded email from someone I care about that contained an editorial from the National Review that decried ABC’s decision to air the photos from Abu Ghraib.

I normally do not comment on emails like these that I receive, but in this case, I felt compelled. I had a few issues with the claims made in this article. I'm not a National Review reader, mainly because I find Rich Lowry and staffers like him to be prone to over-generalization and partisanship in their reportage.

But my main disagreement with this particular article resides with the base premise. Point of fact, I cannot bring myself to criticize ABC for drawing attention to this story.

The article argued that investigations were already underway, and had been since January. In fact, the very evidence presented in the article about those investigations is what makes me the most uncomfortable.

The fact that an investigation was launched in January of this year, but had seemingly stalled at Donald Rumsfeld's desk until now, is not a positive element to this story. On Friday, the secretary of defense testified before Congress that in all these months he himself had never laid eyes on the photographs, and as a result, did not fully comprehend the scope of the infractions.

I understand the political ramifications of his testimony (that he may have been trying to save his job), but his explanation to Congress was that the reports he received did not lead him to believe the events were as terrible as the photos revealed.

If ABC had not aired the photographs, we (the citizenry and the members of Congress, and presumably Mr. Rumsfeld himself, if his testimony is to be taken at face value) would still know little to nothing of these events, nor would any further action likely have been taken. The plight of these people would still be contained in a file on Mr. Rumsfeld's desk (or in a drawer somewhere), and justice for these people would be no closer to reality.

I think the photos needed to be aired. We are a democracy built on the freedom of expression, and we have a long history of commissioning the news media to serve as an unofficial fourth branch in the checks and balances process to ensure our motives are pure and our actions honorable. In fact, American historians have argued that this tradition goes back to the colonial press reporting of British infractions against Americans, which allowed public outrage to build up to the point that the American Revolution occurred.

As a Christian, I believe we should be the first to admit our shortcomings and ask for forgiveness. And particularly from our enemies.

I was also a little concerned about the sweeping scope of statements like “Lost is the fact that in America torturers get punished, while in the Arab world they get promotions.”

First of all, the first clause is disturbing. Who remembers the scandal of the reported abuses at Guantanamo Bay? Here was an example of a story about the American mistreatment of Arabs reported without pictures. All of the guards involved are reportedly still serving in our military and were merely reprimanded and transferred. And no one above those who actually committed the acts was punished, even though it seems the intelligence forces implicated in the current scandal were the same forces involved in the Guantanamo Bay infractions.

In an article in this morning's New York Times, Stephen Cambone, the undersecretary of defense for intelligence, when asked whether he had inquired about the applicability of the Geneva Convention to the directions given to the guardsmen, said:

"I didn't have to," Mr. Cambone replied. "We had been through a process in which we understood what those limits were with respect to Iraq, and what those were with respect to Guantanamo."

Another story in this morning's Times featured an Afghan who gave his testimony of abuse at the hands of Americans. The NYT also ran an editorial citing evidence that these abuses may be much more systematic and widespread than is generally known.

Do we punish those who torture?

Second, the Arab world is a big place. And of all the Arabs and Arab Americans in my social circle (and there are perhaps a dozen), not one has ever said to me that torture should go unpunished or that there is no justice in “their world.” Extremists are marginalized in “their world” just as they are in “our world.”

Finally, even if one segments the world into an “Arab world” and an “American world” (a dangerous view of our globe, in my opinion), it follows that the very distinctions that define each culture should be defended. If torture is rewarded in “their world” and punished in “our world,” what does it say to “them” when we conceal the evidence of torture? Doesn’t that make “us” seem more like “them”?

A major portion of the rhetoric we've used in the post-WMD justification for our war with Iraq has been that the regime we were deposing was guilty of atrocities without accountability from the citizenry. I believe that we have an obligation to show these people that our way of life is different, that there can be justice when mistakes are made.

Hiding the evidence of these events in government reports in the hope that no one would notice does not seem to be a good strategy for convincing the Iraqi people (and others in that region) that we are a just and noble people. If anything, such tactics would seem to suggest to them that our way of life differs from the last regime's only in who controls the information flow.

If we ever want Iraq to develop into a nation of honesty and integrity, we must go out of our way to demonstrate these characteristics of our society to them. Even when it makes us look bad (perhaps PARTICULARLY then).

How better for them to learn what our values are than by seeing us live out those values in honest and open ways?

Monday, May 10, 2004

The Doctor is in ...

Having recently completed my Ph.D., I've been pondering this whole "Doctor" thing.

For the last few years, students that I have taught have struggled with what to call me. I have always encouraged my students to address me in the familiar ("Rick"), but many of them are uncomfortable with this. For some reason, university students always want to address their instructors as “Doctor,” no matter what degree he or she holds.

I am generally not a status-seeker, nor am I particular about what I’m called (in my experience, “HAY-U” works about as well as anything). Students have called me “Professor Stevens,” “Mr. Stevens,” “Professor Rick,” or “Assistant Instructor John Richard Stevens” (that one got VERY OLD). However, one thing I have not allowed my students to call me is “Dr. Stevens.”

I’ve just always felt that titles achieved through merit (such as degrees, ranks or awards) should never be assumed by those who have not earned them. And particularly not the “Doctor” title, because it requires so much personal investment and approval by one’s peers. I would never presume to claim such approval and acceptance before it was granted me by those in a position to judge me competent. To do so would simply be arrogant, and demeaning to those who have earned the title.

The other day, I was at a reception honoring a faculty member who was retiring after many years of dedicated service. I was supposed to deliver some remarks about this person, and when the emcee introduced me, it was as “Dr. Stevens.” And I almost corrected her, for old habits die hard.

This “Doctor” business is going to take some getting used to.

I have friends in practitioner fields who have earned doctorates in medicine or dentistry, etc. And I get a lot of good-natured ribbing about not having a “real” doctorate and am at times warned not to announce myself as “doctor” in such a person’s presence.

Apparently, medical practitioners feel they have a corner on the “doctor” market. And in a general sense, I can see why they come to believe this. What child when dreaming about one day becoming a doctor thinks about an academic who studies mass media forms? Or a historian? Or a linguist?

Unfortunately, this seeming cultural controlling interest in the doctorate by the medical community is actually a rather recent development. “Doctor” is actually a Latin word for “teacher.” In classical times, the original degrees of “doctorate” were issued for the study of philosophy and mathematics. The title of “Doctor” was usually awarded to a middle-aged man who had proven to his peers his dedication to learning, to teaching and to the spreading of knowledge.

It was not until the later medieval period that doctorates became associated with the disciplines of law, theology and medicine. In the modern Western (and particularly the American) mind, knowledge, science and medicine have become inextricably linked (I believe largely as a result of the effect of the Industrial Revolution on our culture).

Americans are a practical people, and as inheritors of the Protestant Reformation legacy, we are often quick to define boundaries between “real” groupings of people and “false” groupings.

There are currently close to 100 degrees classified as “doctorates.” I am proud to be the bearer of the oldest form, even though I am using it to study new technology and future trends.

Now, if I could just get used to being called “Doctor.”

JRS Personal Blog - launch!

This is the personal blog of J. Richard Stevens, Ph.D.

Who am I? I'm an assistant professor at Southern Methodist University studying the application of mass communication ethics to digital media communication. I have a personal Web site at http://jrichardstevens.com.

Here I plan to keep up with old friends, discuss current events, present and deconstruct ideas and generally just chat. I also have another blog, focused on the relationship between technology, media and society.

If you have any questions or comments, add a comment or email me.