Thursday, September 10, 2009

Getting physical

It turns out that the man who ran onto the court and kissed Rafael Nadal after a match at last year's US Open, creating a minor scandal, did it from the bottom of his heart. His name is Noam U. Aorta! It was a curious incident that highlights some of the ambiguity our culture demonstrates with regard to:
a) celebrities and the notion of celebrity
b) spontaneous demonstrations of affection
c) the "business" of security.

Nadal, the "victim", who was "hugged and kissed", took no offense and even seemed flattered by the gesture. In the video you can hear him say, "It's OK" and both during and after the incident he was smiling broadly. On that basis alone, I expect that, if asked (which is unlikely to happen), he would be opposed to prosecution. The law sees it differently:

Aorta will be charged with trespassing and faces possible jail time if convicted, prosecutors said.

“There was a breakdown,” U.S. Open spokesman Chris Widmaier said.

District Attorney Richard Brown, however, called it “particularly disturbing” because Aorta made physical contact with Nadal.

Of course, the law is the law, especially in the US. But I find it interesting that officials identified two distinct problematic actions: the physical contact of the anonymous fan with a celebrity and the "breakdown". Given Nadal's perception of the incident, the only thing that seems to me reasonable to worry about is the breakdown of the security system. And in that sense Aorta was rendering a vital service by demonstrating a hole in the system that could be corrected and possibly should be. If anything, he should be rewarded by the tournament management for it, just as software companies reward the "good" hackers who reveal holes in their security while going after the malicious ones who infect their software.

Unlike the man who tried, earlier this year, to place a hat on Roger Federer's head in the middle of the final of the French Open, Aorta approached Nadal after the end of his match. He therefore shouldn't be accused - as the French authorities accused the Paris intruder - of interrupting a sporting event. The intruder at Roland Garros also displayed a non-aggressive, possibly affectionate approach, but one which was disturbing for everyone (including Federer) as it took place near the beginning of the second set. In such circumstances, the timing was obviously calculated for maximum "attention getting", which is difficult to correlate with spontaneous affection. The quest for celebrity by intruding into the spotlight beamed on other celebrities is now a well-known social phenomenon. But Aorta appeared to be motivated by spontaneous enthusiasm rather than the desire to be noticed. He wasn't stealing the spotlight since the drama of the match was over and television was, by that time, probably airing commercials.

I think there are three interesting cultural issues here:

1. the fact that the prosecution implies an equivalence between "physical" and "disturbing", which may reflect attitudes in the US about intimacy and distance but may be perceived differently in other cultures (e.g. Spain),

2. the recently acquired reflex in US society to see everything in terms of security, as the core US value of "control" seems to have morphed into an obsession with security (notice that in the video the commentators refer to two historical references: the attack on Monica Seles and 9/11!... the focus is on danger and risk),

3. the growing and somewhat paranoid trend of separating celebrities from real people, which also has negative psychological effects on celebrities, who often (Michael Jackson, Lindsay Lohan, etc.) lose their sense of their basic human identity.

Of course this last point is aggravated by the growing cult of celebrity associated very directly with the cult of success in capitalist cultures. It is linked to phenomena such as the emergence of "Reality TV", which should probably be called Irreality TV. Celebrity breeds success and success breeds celebrity and both produce wealth and/or what's perceived as easy money and lots of it. Everyone seeks success because they seek wealth. Instant success seems to have become a universal dream, which takes us one serious step further away from social reality.

In the original lyrics of "As Time Goes By" the first line contained the phrase "a kiss is just a kiss, a sigh is just a sigh", but when Warner Bros. integrated it so emblematically (and brilliantly) into the film Casablanca, Dooley Wilson sang, "a kiss is still a kiss, a sigh is just a sigh", highlighting, in Hollywood fashion, the positive side of kisses and the negative value of sighs (you always have to distinguish clearly the good from the bad).

I guess today it would be "a kiss is still physical contact".

Monday, August 17, 2009

Taking stock of Woodstock

The big talking point of August is, for many, Woodstock, classified as a major historical event. As a jazz musician at the time, I was only peripherally interested in rock and paid little attention to the event itself. I did however identify with the political causes that were massively voiced at Woodstock (notice that "peace" is billed ahead of "music"). But apart from its being big, messy and reasonably pacific, I don't remember its making a major impact on the news cycle or a culture that was already imbued with love-ins, protests and other spontaneous manifestations of a highly visible counter-culture. It certainly didn't beat the Democratic convention in Chicago that took place a year earlier and that, in a certain sense, was still going on with the trials of the Chicago Seven (plus one). Hoffman, Rubin, Seale and Hayden had achieved media celebrity status for purely politico-cultural reasons. And they were taken very seriously by foes and sympathizers alike.

So why do we remember Woodstock rather than Chicago? It strikes me as particularly odd that we have succeeded in sentimentalizing, as if it were a moment of triumph, an event that was clearly a swansong for the new culture it is reputed to represent. Some analysts have maintained, and I would agree, that Woodstock provided the rationale for the political culture embodied a decade later by Ronald Reagan (who at the time was already the governor of California, the state of the hippies)! The key to this sentimentalization and to the integration of Woodstock into our current cultural mythology was finding a way to eliminate its political and historical component. The singers and groups at Woodstock generated emotion by calling into question all traditional institutions, protesting against the war, demanding civil rights. But all that has been forgotten or simply vaguely recalled as part of the indistinct and very muddy décor, in spite of Bush policies that have provided a real political parallel. What has been retained today - thanks in part to the existence of the feature film of the event - is the celebration of individual talent and its exploitation through the music industry (1969 can be seen as the year when all music started to become intensely industrial and commercial, a phenomenon I hope some specialized historians will someday try to examine).

Woodstock is remembered for its stars and its great musical moments more than for the mythical communal experience, which was certainly less idyllic to experience than to read about 40 years later. What is significant is that the idea of focusing on talent and stars (including star wars!) has dominated US culture ever since (think "American Idol" and "Dancing with the Stars", both utterly unimaginable in the 60s; even Donald Trump's "The Apprentice" partakes of this celebrity ethos).

The strategy of encouraging celebrity ambition has proved powerfully effective in the short-term management of social conflict. For example, the key to reducing the racial tensions that had produced major riots regularly throughout the 60s was to encourage talented African American individuals to become full-fledged, fully integrated, and highly idealized celebrities, strongly admired by white celebrities. Examples of popular African American celebrities "proved" that all are equal because even blacks can achieve the American dream, provided they make the requisite personal investment. The Will Smiths, Oprahs, Michael Jordans (even OJ in an ultimately less predictable way) have given the white community a better conscience ("we love and admire blacks and pay high prices to see them perform"), which has channeled a lot of the nervous energy of black youngsters away from protest against a system that remains structurally racist and towards goals of personal success (entertainment, sport, music) or, failing that, of collective aggression amongst themselves (gangs). Rap culture combines both by generating a series of media stars whose unique selling point is apparently their "gangsta" values. And while the white community occasionally disapproves of the rhetoric, it consistently celebrates the business acumen and the pure accomplishment and popularity of those who succeed! The rapper rhetoric is provocative in the extreme, but unlike that of the 60s it isn't intentionally subversive. (The ancestor of rap was Gil Scott-Heron's 1970 song "The Revolution Will Not Be Televised", which was totally political.)

Of course it was always true that even African American individuals could succeed, but previously they had to make a point of espousing white values and accepting white rhetoric. Louis Armstrong - the smiling and utterly unprovocative entertainer (unlike, say, Fats Waller, who never hid his irony) - was the epitome of the successful black at a time when African Americans weren't allowed to compete in major league sports, including basketball. (Yes, before MLK apartheid actually did exist in the US, and not just in the South.) Paul Robeson was the opposite of Armstrong: initially celebrated for his talent, which he deployed in opera and musical comedy - far more respectable and closer to standard white values than jazz - he was ultimately and brutally excluded from the system for his politics.

There's one other phenomenon related to Woodstock that intrigues me. In an era of burning flags and draft cards, Jimi Hendrix "desecrated" the national anthem, highlighting the mindless and uncontrollable violence couched behind traditional patriotic sentiment. It was incredibly provocative. At the time I assumed that, after the continuous assault on the symbols conducted by my own generation, the status of both the flag and the national anthem would be readjusted to a more normal emotional level (Nixon handled the problem of draft cards by abolishing the draft). September 2001 showed us that the flag survived the red glare of the cultural rockets and plastic cigarette lighters of the 60s utterly intact, but the fate of the Star-Spangled Banner has been a bit different. Living in England at the time, I was a witness of sorts to the disappearance of God Save the Queen from cinemas. I assumed something similar might happen with the Star-Spangled Banner at sporting events. I was wrong of course. But something did happen as an indirect result of Jimi Hendrix's performance at Woodstock. The song was increasingly given to black singers (celebrities, of course) to introduce sporting events. They interpreted it with inflections and vocal fioritura derived from Motown, soul music, R&B, etc., creating a kind of subdued irony that persists to this day, as if to say, "OK, this is the obligatory tribute to honky culture, but we're going to add our own cultural contribution to it, even if that means taking it in a different direction, and we know you dumb ofays are going to applaud." When white singers (celebrities) are asked to do it, they can no longer "sing it straight". They generally follow the lead of the blacks, with their own original touches (which may, for example, be derived from country music), but one senses that the irony is lacking: it's an exercise of pure imitation or conformity to a media-imposed norm.

And now we arrive at the era of Obama, the first black celebrity to be elected president, the man who was caught on camera not holding his hand over his heart during the national anthem! There's a lot of historical irony at play here. But it's clearly far too early to tell where this will take us.

Saturday, August 15, 2009

Reading facial expression

A colleague in the Intercultural Insights group has just drawn our collective attention to an article on the BBC website entitled, Facial expressions 'not global'.

For this kind of scientific reporting, I always feel the need to look for missing significant variables, whose absence could have an impact on the general conclusions put forward by the researchers. It's important to remember as well that these behavioural/psychological studies themselves conform to an array of cultural patterns and rules related to how research is funded and conducted in the West and how its results are communicated, including the role of the media in selecting and publicising the "conclusions".

It isn't just to be captious that we need to look for missing parameters. If our aim is truly scientific, we must assume that the failure to take any vital parameter into account can seriously influence the interpretation of the results.

So concerning this study, as it is reported in the BBC article, I propose two major observations:
  1. Real human emotions are never expressed as static poses... except in certain conventional iconographic and theatrical traditions, which vary from culture to culture! Emotion always derives from a context implying the presence of a number of dynamic elements in the expression of emotion, as well as a certain synaesthesia, or association of simultaneously processed sense perceptions (sound, movement, even smell, as well as awareness of physical tension and what I would call "dramatic structure" or transitional logic in moving from one affective state to another, to say nothing of the phenomena related to unconscious synchronisation*).
  2. Every culture has developed, through its artistic and representational traditions (including advertising), an iconography of the static expression of human emotions. These traditions - which appeal to formal narrative, including poetry, drama and religious and moral allegory - have a powerful influence on our perception of new images but - I would maintain - far less on our reactions to real communication situations.
In short I think it is an error to draw any cultural conclusions from this type of experiment other than to observe that, in this type of artificial interpretative exercise, East Asians are more likely to look for clues in the eyes and Westerners in the mouth. There may even be a link to the phonetic characteristics of the languages as well as to the strategies related to saving or losing face and respecting a principle of harmony by refraining from openly expressing one's emotion (meaning that the eyes may be the only reliable, though still ambiguous, guideline to interpreting emotion, as experts in lie detecting tell us!).

For all these reasons I think it would be an error to use this experimentation to draw conclusions about how people of different cultures read emotions in real situations. A simpler, more coherent and probably more honest conclusion would be to point out how this experiment seems to validate, at a more trivial level, Richard Nisbett's research that led him to conclude that Asians are more dependent on context than Westerners to identify "meaning". Take away context - as photographs do - and the results are bound to be different.
[An example of the type of photographs used in the study]

The exercise used in the experiments is closer to the act - privately individual - of reading a book than engaging interactively in dialogue with another person, and yet the researchers are suggesting it tells us something about how people react to human dialogue. The proposed conclusions about eyes and mouth may be no more meaningful than to say that Arabs have a tendency to focus on the right side of the page when reading a book, whereas Europeans tend to focus on the left side... and then to conclude by creative extrapolation that one or the other culture privileges the right or left hemisphere of the brain!

As for the emoticons, which began as a form of wit practised by geeks on the Internet in the previous millennium, before the advent of the multimedia Web, the principal variable I would expect to matter is the contrast between populations that use alphabets and those that use characters, which are already drawings. I find it curious that the researchers didn't seem to consider that influence. After all, if you turn a word composed of letters of the alphabet on its side, people still recognize it, but if you turn an ideogram on its side (given the indeterminate number of ideograms as compared to a strict limit of 24 to 26 letters), people are likely to seek a different meaning or simply fail to recognize the ideogram (or logogram). Another factor worth considering might be that such script can be presented in vertical columns (the traditional way) or horizontally.

I would therefore propose changing the tagline of the article from "A new study suggests that people from different cultures read facial expressions differently" to "A new study suggests that people from different cultures use different strategies to classify emotions purportedly represented in photographs of facial expressions". Not very exciting, but certainly true.

I should add that I've personally done a lot of work on capturing and representing emotions through still images taken from video in the context of my work on multimedia resources for language learning. Because my concern was to use such resources to sensitise learners to the semantic component of affect and to help them discover and appreciate phonetic variations (rhythm, intonation, intensity, tension, etc.) in their relation to the expression of affect, I can attest to a simple fact: the exercise of capturing and representing unambiguously any emotion in a photograph is a perilous enterprise! Facial expression alone is always ambiguous, even in so-called "direct" cultures where the norm is to signify verbally and non-verbally what you think, "harmony be damned". In the course of my multimedia work I have organised and exploited photography shoots with professional actors to get them to express specific emotions and attitudes. The result on the page is never wholly satisfying in terms of representation... unless it is specifically iconographic (e.g. imitating the poses of sorrow derived ultimately from Renaissance depictions of the Crucifixion). But so long as one accepts ambiguity as a structural principle (and that is a key but much neglected point in traditional language teaching), the pedagogical result can be satisfying... precisely because the objective is to augment the sensitivity of the learner to the effect of context and synaesthesia.

After reading articles like this one, I'm invariably left with the impression that a lot of popular science just doesn't do nuance... or if it does, the media won't bother with it!

* In “The Dance of Life: The Other Dimension of Time” Edward T. Hall maintains that “people are tied together and yet isolated by hidden threads of rhythm and walls of time.” This is not only true of dialogue, which seems fairly straightforward, but also of much more complex interactions, as the following passage illustrates:

Rhythm is basic to synchrony. This principle is illustrated by a film of children on a playground. Who would think that widely scattered groups of children in a school playground could be in sync? Yet this is precisely the case. One of my students selected as a project an exercise in what can be learned from film. Hiding in an abandoned automobile, which he used as a blind, he filmed children in an adjacent school yard during recess. As he viewed the film, his first impression was the obvious one: a film of children playing in different parts of the school playground. Then — watching the film several times at different speeds, he began to notice one very active little girl who seemed to stand out from the rest. She was all over the place. Concentrating on the girl, my student noticed that whenever she was near a cluster of children the members of that group were in sync not only with each other but with her. Many viewings later, he realized that this girl, with her skipping and dancing and twirling, was actually orchestrating movements of the entire playground! There was something about the pattern of movement which translated into a beat — like a silent movie of people dancing. Furthermore, the beat of this playground was familiar! There was a rhythm he had encountered before. He went to a friend who was a rock music aficionado, and the two of them began to search for the beat. It wasn’t long until the friend reached out to a nearby shelf, took down a cassette and slipped it into a tape deck. That was it! It took a while to synchronize the beginning of the film with the recording — a piece of contemporary rock music — but once started, the entire three and a half minutes of the film clip stayed in sync with the taped music! Not a beat or a frame of the film was out of sync!

...When he showed his film to our seminar, however, even though his explanation of what he had done was perfectly lucid, the members of the seminar had difficulty understanding what had actually happened. One school superintendent spoke of the children as “dancing to the music”; another wanted to know if the children were “humming the tune.” They were voicing the commonly held belief that music is something that is “made up” by a composer, who then passes on “his creation” to others, who, in turn, diffuse it to the larger society. The children were moving, but as with the symphony orchestra, some participants’ parts were at times silent. Eventually all participated and all stayed in sync, but the music was in them. They brought it with them to the playground as a part of shared culture. They had been doing that sort of thing all their lives, beginning with the time they synchronized their movements to their mother’s voice even before they were born.

Sunday, July 12, 2009

The culture of business and the business of culture

Confusion is rife about what culture is and what it means in the business world. Talking recently to the training managers of several major multinational companies, I discovered that in spite of thunderously significant statements about corporate values pointing to respect, innovation, creativity, diversity and ethics in the construction of a compelling corporate culture, the only thing that is actually done about culture is to prepare future expats for the practical concerns of living in a different country. Diversity training, on the other hand - where it exists - tends to be more about compliance than reaching out to understand and embrace other cultures. The organizational impact of the multitude of complex cross-cultural interactions that take place every day appears only randomly in the strategy and hardly at all in the area of training and knowledge management.

One of the areas of confusion that help to explain this situation may be the sheer diversity of meanings attached to the notion of culture. From the very start, we have to distinguish culture from Culture (the arts). Then we have to deal with the multiple and mysterious origins of any particular person’s cultural profile. The cultures that guide our perception and interpretation of the world and people's behavior are not only national or regional cultures. We commonly list alongside these the linguistic, ethnic and religious foundations of culture. But we also include corporate culture (specific to particular enterprises), occupational culture (practised by people in the same job area) and generational culture. Interestingly the culture of business (that tells us how to think about and orientate business decisions), as developed through the dominant management models, has been largely - or should I say royally? - indifferent to the diversity of cultures in the workplace, even though everything people do is first of all filtered through their specific cultural lenses. In the dominant "business culture", only economic acts are significant and reasoning always and uniquely tends towards the "bottom line", generally meaning things that can be measured in terms of short term results. Alas, culture is by definition long term!

The dominant model I’ve just referred to is of course the Harvard Business School way of thinking, which is the object of very recent commentary by Shoshana Zuboff, a former HBS professor, in an article in Business Week that starts like this:
"I have come to believe that much of what my colleagues and I taught has caused real suffering, suppressed wealth creation, destabilized the world economy, and accelerated the demise of the 20th century capitalism in which the U.S. played the leading role."
Towards the end of the article, the author talks about an emerging "economy of trust" and says this, "These economies of trust are becoming even more important than economies of scale."

Creating a basis for trust is a function of all culture. Creating trust among people of different cultures is the biggest challenge businesses (and governments!) are facing in the 21st century. The age of competitive nationalism and a purely competitive economy appears to be waning. Still there's a lot of work to do on the cultural side to make it work. Perhaps this Community of Practice can make a significant contribution.

Saturday, March 21, 2009

PC has struck again, and this time it's made headlines, revealing some interesting aspects of US culture.

I'm of course referring to Obama's self-deprecating joke in which he mentioned "Special Olympics". Whatever his humorous intention, which was clearly pointing at his ineptness at bowling, the nation as a whole (nearly) and the media in particular have decided that this enters into the realm of moral failure or "serious sin", with some debate (among Catholics only) about whether it is mortal or venial. (One female black TV journalist asked whether his apology to the Chairman of the Special Olympics was "enough" or whether he needed to do more... a pilgrimage to Athens in a wheelchair?) Some of the commentators extend the moral fault to the audience who actually laughed. The LA Times quoted Maria Shriver, Schwarzenegger's wife and sister of the Chairman of the Special Olympics, Timothy Shriver:

"While I am confident that President Obama never intended to offend anyone, the response that his comments have caused, coupled with the reaction of a prime-time audience, demonstrate the need to continue to educate the non-disabled community on the issues that confront those with a developmental disability."

This all seems to boil down to what I call the actively repressive impulse at the heart of US culture, the same that sees exclusion, shaming - followed by rehabilitation - and/or killing as the appropriate response to an undefined black list of things one shouldn't do or say, sometimes thought of simply as "un-American". The fact that such a state of affairs is a recipe for hypocrisy in a culture where hypocrisy is considered the worst of all sins is in itself both painful and amusing. Having two potentially contradictory ideas or changing one's mind is typically seen and highlighted by one’s critics as proof of hypocrisy. Politicians call it the sin of "flip-flopping", which sank Kerry in the 2004 elections. The funniest and most extreme example I know of is when, as a teenager, three friends and I were interrogated by two policemen in a dark parking lot in downtown Los Angeles. They separated us and placed us in the four corners of the parking lot to interrogate each of us individually. Not finding any contradiction in our stories (we had gone to see a film and were getting back in the car), one cop came over to me and asked for my driver's license, which indicated that I was born in Chicago and my address was in West Los Angeles. He repeatedly asked me - with a change of tone each time - "how do you explain that it says here you were born in Chicago and live in Los Angeles?", a question that showed an amazing ignorance of sociology, since 2/3 of the population of southern California at the time had migrated from elsewhere. But he obviously thought he was on to some deep contradiction, a flaw in consistency, a proof of guilt without there even being a crime. (Maybe he also thought anyone born in Chicago was a member of the mafia.)

This may seem to have little to do with the original point, but there is a connection, one that Ralph Waldo Emerson had already noticed in his famous observation (in "Self-Reliance"):
"A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. With consistency a great soul has simply nothing to do."

Had Emerson lived another century and a bit he could have replaced “statesmen, philosophers and divines” by "media". He would undoubtedly be appalled by the way all this has played out, even as his championing of self-reliance has been largely accepted. The concept itself has become a shibboleth but its meaning has come to represent the opposite of what he intended: an incitement to conformity and artificial consistency. Self-reliance now carries with it the obligation to be in phase with the crowd, which is precisely what PC tries to achieve: artificial consistency. Whatever people think or feel, they must conform to a formal code that will help them think and feel according to the norms. Self-reliance has become self-control. We see it in the concern with "continuing to educate" mentioned by Maria Shriver. Education in this usage has nothing to do with learning and everything to do with behaving predictably. It consists not only in knowing what not to say but also when not to laugh, even if you think it's funny. Some may remember Freud’s contention that humor and the ensuing laughter are the result of a spontaneous and pleasure-inducing shock between the unconscious and the conscious, which he considered a healthy way of letting off potentially unhealthy steam but which requires us to accept and appreciate ambiguity. This new form of PC education consists of not allowing any steam to escape and be perceived by others for fear of its corrupting effect.

And that leads me to an even deeper dimension of this question: the fundamentally and in many ways increasingly repressive impulse at the heart of a culture founded on the ideals of "freedom" and self-determination. Freedom has traditionally been seen as a good in itself and indeed more than a good, a moral ideal to be enjoyed at home and exported abroad, extending increasingly over recent decades to what Freud might call the right not just of the person but of the id (das Es) to achieve fulfillment so long as no visible damage is done to others (who can always sue if there is damage!). This freedom of the id - or the person as id - can be seen as the opposite of civilization, whose role is "sublimation", avoiding both the direct expression and the systematic repression of the drives of the id. This new version of freedom (I have a constitutional right to do what I want irrespective of my social environment) inevitably requires some kind of mechanism to keep the lid on the id and its potential for chaos and destruction. Enter three actors (the ghostbusters!): repressive laws (including a deep commitment to capital punishment), PC and... silence.

If PC is a list of words people shouldn't say, silence is a quality forcibly attached to ideas one shouldn't have. There are many examples of this but I'll offer one that is very obvious today. As current events demonstrate, the US economy and political system, in the way they actually work (as opposed to the way they were designed and are believed to work), could objectively be described as the opposite of democratic. Complicity between bankers, industrialists, politicians, the media and practically anyone who is assertively greedy and surrounded by good lawyers has made "the voice of the people" something of a sad joke, the realization of which has in recent weeks sent a shockwave through the population as it learns that the "average Joe" is nothing but a convenient sacrificial victim of those who run the show. This system built on cupidity and silence remained stable so long as trickle-down economics seemed to work. All those greedy bastards tied to power mongers were actually keeping the machine going and therefore doing their job for the benefit of all. After all, if the crumbs that fall off the table are tasty and plentiful, who needs bread? But when you are required to collect the crumbs and give them back to the seated diners who have suddenly discovered that they've devoured all the bread (to say nothing of the meat and potatoes), you start to feel the pain of hunger accompanied by pangs of resentment. And we begin to see who will end up in tomorrow’s stew!

All this was made possible by... silence, in other words the repression of both dialogue and debate. The particular form of individualism developed in the US has made it possible for those in control to program ideas in a way similar to the way PC programs words. There have been and still are subjects that simply cannot be talked about, taboo, repressed from public awareness. (At the same time, intellectuals are free to examine these things in their ivory towers so long as none of it spills over into the public arena, which it won't because the concepts they use are on a virtual black list.) The concept of socialism and its multiple avatars in the real world - which used to be more conveniently lumped under the rubric of Marxist communism, aka totalitarianism - has been a constant in the list of "things to be rejected before being discussed". It's now making a virulent comeback - in its repressed form, i.e. as a hobgoblin, a factor of dread - in reaction to the rising danger of taking seriously the concepts and practices associated with it (e.g. managing the collective wealth) as a response to growing criticism from within of the capitalist system. How that will play out is anyone's guess, since it has less to do with political decision-making than with the viability of the current financial system and ultimately with the grasping, pinching, casting off, squeezing, smothering or caressing of capitalism's "invisible hand" (hands actually do other things than just point fingers). What's interesting - and infuriating - is that the PC effect is producing its usual Manichaean division between good and evil: capitalism (us) and socialism (them... i.e. the unenlightened). No need for nuance, which we all know is a time-waster that makes decision-making difficult, complex (beyond the average Joe's understanding) and impossible to rally around, the way one rallies around a flag.

Depressing? Not entirely. It’s just the media’s insistence on following the white lists and banishing everything on the black list that hurts. The LA Times article starts out with this sentence:
“Despite the president's apology, athletes and others say they are disappointed with his remark on Jay Leno's show”
It maintains the idea of a uniform consensus of umbrage and indignation until the final paragraphs, where the article concludes in this manner:

Brothers Rich and Ted Olson have participated in the Games for more than three decades and don't have enough space in their suburban Glen Ellyn, Ill., home for all their medals and ribbons. The Olsons, whose scores typically run in the 140s and 150s, didn't find the joke offensive, but Rich laughed when he heard the president's score.

"That's not very good," he said. "It wouldn't beat us. He needs to practice."

They actually didn’t take umbrage and pushed the humor even further.

Civilization is not dead or totally repressed. It has just been pushed to the end of articles, where people won’t read it (or, worse, won’t understand it because it doesn’t jibe with the rest)!