Sunday, December 14, 2008

The culture of greed

Today's LA Times report on Schwarzenegger's fund-raising proclivities ("Where Schwarzenegger goes, money follows") includes this observation:

"That's the way the system works, and it troubles me," said Derek Cressman, Western regional director for Common Cause, who worked with Schwarzenegger on the initiative and has written a book critical of his fundraising. "The governor, like every other elected official in our state, pays more attention to those people who support him than those who don't. And those people who support him with big checks get noticed."

Schwarzenegger obviously has a better sense of the law than Rod Blagojevich, whose main excuse is that what he did is perfectly consistent with political culture in Chicago, where everything has always been up for sale. But how significant is the difference? And however significant, what do the two cases - among thousands of others - tell us about "democracy in America"? To me the answer seems simple: politics is a marketplace, and as with all marketplaces there is a white market and a black market. The white market is dominated by people who are usually (but not always) careful to remain within the constraints of the law, while manipulating those constraints is an integral part of their well-honed craft of business and art of politics. In both cases this translates as a way of getting an unfair advantage over one's competitors and culling profit from unwitting consumers and voters. And because it's white it appears invisible. The black market is dominated by close and surprisingly secret networks of accomplices whose philosophy is less to "share the spoils" (sharing isn't a core value of their system) than to protect each other from discovery (the extreme case being omertà).

But when you look at the sociology of white and black markets you discover another significant factor. White markets are dominated by those who have accumulated wealth (e.g. legitimately invested capital that then needs to be protected through persistent commercial advantage) whereas black markets thrive through the energy and initiative of those who have no capital but are intent on adopting the white lifestyle. Schwarzenegger is a white market politician because his wealth came from another source. Blagojevich is a black market politician because 1) he's relatively "poor" and 2) he was nurtured by the secretive black market system and remains loyal to its practices. Schwarzenegger doesn't need to milk politics for his personal survival; Blagojevich does, at least insofar as "survival" is defined as living according to the minimal standards of the elite. He was driven by personal debt.

In other words, white markets and black markets are the US version of yin and yang, where the white, sitting comfortably on top, still depends on the dynamics and energy of the black, most of which is attracted to the white (wishing to stay legitimate) but a significant part of which remains attached to the values and habits generated by its roots in poverty, deploying its intelligence primarily to acquire enough to be seen as white and eventually to move over to the white side.


But the comparison with yin and yang should stop there. However similar the dynamics, all cultures are different. Chinese yin and yang is based on the value of harmony: natural opposites flow towards each other in order to establish a dynamic balance. US American yin and yang is based on greed, the belief that everyone pursuing his or her self-interest will automatically produce a better equilibrium as they move over to the other side or move up from where they were. And combined with the Puritanical ethos inherited from our historical predecessors, that equilibrium depends on the domination by the wealthy of the poor, since the wealthy are protected by their wealth from having to violate the laws in order to survive (and can, in any case, manipulate them, thereby attaining recognition, in the Puritanical scheme of things, as the "virtuous"), whereas the poor (the dark "yang") are forced to organize themselves outside the law and risk being caught before they are recognized as "yin" and so be considered automatically as virtuous. Blagojevich is a typical "yang" trying to become "yin" and almost making it, but still living within the yang system that is founded on black market principles.

Greed is indeed the motor of everything and officially recognized as such in the capitalist/puritan ethos. And so, in the midst of the biggest crisis of capitalism in 200 years (yes, bigger than 1929 because striking deeper into the system in a world that is now clearly post-industrial), the news is dominated by stories of greed, the latest and perhaps most spectacular being the case of Bernard Madeoff (sorry for the spelling but it is true to say that he "made off" with $50 billion of other people's money). Was he yin or yang? Or does he just represent the principle at the middle of the whole dynamic... greed? Is he the black dot in the white or the white dot in the black? He went from investing "$5,000 that he said was earned from working as a lifeguard and installing sprinklers" (Wikipedia) to chairing the NASDAQ. And he used his "reputation" to cheat both hedge funds and charities out of their assets. And therein we discover the real secret of all societies and why greed is the least stable principle of social organization: reputation. Trust depends on reputation, and no society or economy can survive without trust. It doesn't matter how much law you have (and how many lawyers): trust is the foundation of everything. But in a society of greed trust is considered naive. Contracts replace promises and relationships. Yet human nature craves trust and human beings continue to build relationships around it... or at least around the illusion of trust. And so it has come to pass. Hasn't our whole commercial culture evolved away from the foundational principle of trust towards the subtler and more "businesslike" notion of creating the illusion of trust while at the same time seeking an infinite number of ways to betray it (legitimately or illegitimately, according to circumstance)? I don't think the people who invested in Madoff's Ponzi scheme were inordinately trustful. They bought into his illusion because he was seen to be a solid symbol of the white market, legitimate - legally sanctioned and officially organized - greed as opposed to lawless greed. His reputation was such that they couldn't imagine he was operating according to the alternative laws of the black market.

We are living through a historic moment when the culture of greed is undergoing a serious shock. Will the current crisis be solved by the self-policing of the greedy or by the radical calling into question of greed as a core value, the lowest common denominator of social motivation? In any case, the confusion is deep. The neat black and white of yin and yang has gone surprisingly grey. The economic and managerial culture, exported from the US to the rest of the world for the past 60 years, is unlikely to be the same in the decade to come and beyond.

Thursday, September 04, 2008

Facebook and the culture of learning

Jay Cross, following the lead of JP Rangaswami, has provoked a discussion of the possible pertinence of Facebook as a tool for learning. As someone who attributes to Facebook a good part of the successful emergence of the Social Web as a general cultural phenomenon, which I see as the key to the future of learning in a radical break with the past, I have no axe to grind with Facebook itself. Yet I allowed myself to take a contrarian position on this question by proposing first to situate the cultural foundations of the Facebook phenomenon in very general terms (an appeal to US individualism with a strong dose of ambient narcissism) and then to examine the possible factors of motivation behind some of the observable emerging trends in knowledge management through social tools. I could have gone much further by examining the nature of what I would call the collectivist impulse that is a very strong but perpetually marginalised component of US culture, but I was only making a brief commentary on the issue raised by Jay. A further comment by Clark Quinn, in which he takes the position that Facebook is about doing rather than being, provoked a response from me, which I published on Jay's blog. Since it raises a possibly controversial issue, I'm reproducing it here to give it a specifically cultural (rather than learning-related) context. The starting point is the very civilized dialogue that is now taking place across blogs and on into conferences and unconferences, in which people who generally think and work in similar ways have come together to predict and sometimes prepare the future of both learning and corporate communication. What follows is my second commentary on Jay's Informal Learning blog.
__________

It’s interesting how we can all be part of a culture that agrees on principles that include transparency, generosity and trust. I don’t think any of us would have a problem reaching a consensus on those ideals, their pragmatic interest and the necessity to promote them so that they become a part of the entire professional culture that surrounds us. But sometimes I have the feeling that our conversations resemble a political convention pushing candidates and platforms and engaging in massive self-justification. The risk is a lack of critical perspective.

The danger I see in this – as well as the explanation of the temptation itself – is that a different kind of command-and-control model is looming in the background. Could it be that having failed to establish control over subordinates and colleagues because of new lifestyles inaugurated and reinforced partly by technology within a culture that is, at bottom, both individualist and consumerist, we are seeking to create new norms of monitoring and surveillance built on the now trendy principle of gaining knowledge of everything everyone is “doing” in order to micromanage them? Knowledge is power. And the road to consolidating that power is the appeal to the narcissism of the few who set the standard for a newly idealized exhibitionism. The best way to do that is to create behavioral norms of self-revelation. Confession has always been the most efficient way of solving crimes!

When we began transforming corporate culture in the 1980s by getting people to learn how to use the PC, we had to struggle with the resistance of managers (IT managers being clearly the worst of the lot) to the idea of transferring power to lowly employees. This was a typical cultural problem… i.e. typical because it played out below the threshold of conscious awareness. But it was real. It helps explain the victory of MS-DOS over Mac in the corporate world: the austerity of DOS was a gauge of seriousness, ensuring a better focus on highly controlled work processes.

With the advent of the World Wide Web, things got seriously out of hand after the merely limited damage of Windows (tardily imitating the Mac). With the Web, staff could do all sorts of things that had nothing to do with their programmed tasks: personal e-mail, games, pornography, blogging, etc., all of them considered to be absolute distractions from serious work.

So what’s the best cynical strategy for re-establishing order? Create a culture that makes spying the norm, not through clandestine operations and strict policing but through the provoked complicity of the spied-upon. Promoting the idea of self-promotion, encouraging exhibitionism as a basic value, one which will be perceived as a key to advancement, is by far the most efficient way of ultimately gaining control over behavior. Looked at from this perspective, Clark’s remark that it isn’t “look at me” but “look at what I’m doing” says it all. Command-and-control style management isn’t interested in the “me’s” that populate the workplace; it’s interested in controlling what those self-interested me’s are doing. And why not, if everyone agrees? It could be the solution to the problem. But I see it less as a question of learning than one of spying, controlling and “normalizing” behavior.

And are we sure everyone actually does agree with the permanent need for self-exposure? I’m not. Can we be sure that it truly will solve the problem of distraction? I’m not convinced of that either.

Is this a conscious strategy of subversion? Certainly not, but most of management and power culture isn’t conscious. My analysis may seem slightly paranoid, but I firmly believe society needs the milder forms of paranoia (conspiracy theories) as a ferment to help refine our analysis of innovation and the motives behind it. After all, it may be the only known antidote to the syndrome of the political convention!

__________

Going beyond the commentary published on Informal Learning, I should append my own belief that, as in all phases of cultural evolution (or revolution), there will inevitably be a pronounced struggle between centrifugal and centripetal forces. In such situations it's always interesting to see by what means and how thoroughly these conflicts are resolved. The centrifugal trend can lead to the emergence of new and multiple centers of gravity, radically revising the old system, or simply transfer mass within the existing system to other areas, superficially reconfiguring the dynamic gravitational relationships. The centripetal forces can, when they are effective, pull in the energy and reverse the inertia of the forces that were initially directed outwards, creating new internal dynamics. It all depends on how flexible the systems are as well as how consolidated the force is in either direction. That is why I think that even when we are impressed by a new disruptive force, we need to look carefully at how the non-disruptive forces will react.


Monday, July 14, 2008

Phoning it in

In response to some of the contributions to this month's Big Question (Lead the Charge?) on the Learning Circuits blog, I don't see this as a question of technology itself or even technology literacy. It's more a question of cultural shift.

When, more than a century ago, there were only a few telephones around, most people wondered how those damn things worked and some even wondered out loud whether they could serve any useful purpose. When I settled in France straight out of university in the 70s, telephones were few and far between. My wife had never had a telephone in her home! While she had no problem with the technology itself - thanks to pay phones! - she and her circle of friends definitely didn't have a telephone culture (in contrast to my own, acquired instinctively throughout my childhood in California). On the other hand, as soon as France modernized (very quickly), the whole population adopted a strong telephone culture. Nobody analyzed; nobody "organized" the cultural shift; nobody pro-actively developed telephone literacy. It just happened, though it took a few years. (I did, however, a decade later, work on an interactive video training program called “Make the Telephone Work for You” in the UK and “Le Téléphone à Votre Service” in France, which focused on telephone etiquette with clients).

Does the example of the telephone sound trivial? Perhaps it does to baby boomer nerds who have invested so heavily in building their own cutting-edge knowledge of all things digital that they are convinced that those who don't spend their days and nights meditating on technological innovation are condemned to living in analogue, un-networked hell. For them (i.e. us, or at least some of us), yes, it's complex, otherwise it wouldn't be worthwhile. But as Professor Mitra's "hole in the wall" experiments have shown, you don't need to be initiated into an exclusive club to use it... and to use it creatively and collaboratively!

The social Web has started off in a predictable way within the consumer society, with an emphasis on narcissism and self-indulgence. This puts it clearly at odds with corporate culture. That could be considered a more serious problem than complexity. But that reminds me of the work I did in the 80s when I was saddled with the task of trying to kickstart a PC culture in companies. My analysis of the Mac vs. MS-DOS war, ultimately won by Microsoft, was that enterprises chose IBM/DOS over the Mac because it was LESS attractive than the Mac. You weren't likely to have fun with it, so it was less of a threat to the command-and-control culture of the corporate world. Senior management and IT departments were worried sick about the dispersal of authority that might occur if everyone was managing their own data and free to use such flexible tools. So what happened? Two things:
  1. Client-server applications took over, creating a whole new culture for almost the entire workforce, a culture which is with us to this day.
  2. DOS was replaced by Windows and PCs evolved, culturally speaking, into carbon copies of the Mac, with more and more multimedia frills (for a while Apple was even left behind as the much more democratic Windows concept produced more significant innovation).
Then of course came the Web, peer-to-peer technology and an emerging netcentric culture with the Web 2.0. The model hasn't changed yet, and there are numerous concerns and worries on the part of those who feel their authority may be threatened, but there's little doubt that it will happen. Pushing it through official channels may be the worst thing to do, because it will provoke resistance. I would put my effort into making it work from the bottom up and demonstrating how it can achieve things other than self-promotion.

Sunday, July 13, 2008

The definition of informal learning

My friend and colleague, Jay Cross, on his Learning Blog has challenged the community with the question “What is informal learning?”. Here's my definition in a nutshell:

Informal learning is perception mediated by social interaction and converted into behavior, which in turn converts back into perception.

How do children learn their language? Answer: by actively constructing it informally. Second answer: by staying free of any formal learning until the age of 5 or 6! A child's language learning goes through stages but the process is cyclical. It always involves:

1. listening to and discriminating among those elements (sounds, phrases, sentences) that appear to be meaningful, whose meaning is indicated by both emotion (affect) and action (association producing both causal and descriptive links between things, events and language),

2. participating in language production experimentally (speaking) before mastering the rules, starting with repeated syllables and growing incrementally as proprioceptivity develops,

3. interacting in varied situations of real and simulated need (play), judging the value of the reactions provoked and adapting.

How the brain builds language competency (sentence forming capacity) only God and Noam Chomsky can tell us. But the easily observable fact is that the only way it can be achieved is informally. When children get to the phase of formal language learning (i.e. school, with a little bit of useless parental coaching before that: e.g. “not 'he goed' but 'he went'”), it is style that is taught formally, not language. Why the educational systems of the world fail to recognize this is beyond me, although I can think of some good political motives for perpetuating this error. All of which may explain why it's so hard to learn style and so few achieve success with style. There are too many people teaching it and not enough learning it. This is also why no amount of formal teaching can result in the learning of a foreign language. At best teaching provides a map that gives enough spatial orientation for the learner to begin interacting with a real environment in a state of minimized confusion. And spending too much time on the map before confronting the real informally will distort perception of the real. But how many teachers think of their work as first of all provoking the growth of spatial orientation?

Informal learning can be thought of as the kind of mental map making we all end up doing on our own in any area where we feel minimally competent. Our map evolves significantly as we continue to explore our world and refine our skills through our interactions with people and things (think of Columbus's mental picture of the globe over the span of his four voyages). If we lack confidence in our own map and cling to the belief that the official map (e.g. curriculum) proposed in formal learning is the only valid thing, we end up learning... nothing! And although it's a notorious myth that Europeans believed the world was flat before Columbus proved otherwise – a falsehood that I was taught formally at school! – the formal learning of the time taught that there was nothing but a vast ocean between the west of Europe and the east of Asia. Thanks to Columbus's experience and evolving mental map we now know otherwise.



Franz Ackermann, Mental Map: Evasion V

Monday, June 09, 2008

Second Life... compared to what?

The Big Question this month on the Learning Circuits Blog is:

Second Life Training?

I take a very simplistic view of this. I see SL as just another place to go, with its own set of rules and, inevitably, with its own culture. You can learn things at the street corner, as you can in Second Life. It depends on who and what is there and the culture shared by those present. This is already the case for Second Life, of course, since cultures are created by users sharing the same space and the same tools. SL could therefore become - or perhaps already is - another informal space in which human activity can be organized. From that activity learning is of course possible. But turning it into a formal space for learning is fraught with risks, as many of the contributors have pointed out. The problem with formal training is that there's always something planned, programmed and enforced about it. SL is designed to be both informal (unpredictable) and artificial (programmed and controlled). There will always be a risk of contradiction and cultural confusion if learners are expected to use it as anything other than an indicated resource. Bandwidth isn't the only problem; implicit cultural values and questions of learner identity are as well. But if SL is simply an alternative resource, it doesn't seem to me very different from other resources, from books to sims. It's something that requires a larger framework, one that clearly belongs to the real world, to achieve its meaning.

More fundamentally I see the SL phenomenon as similar to Esperanto, though it certainly is considerably more seductive. Like Esperanto, SL proposes an artificial and simplified version of natural human activity. For it to be truly useful as a standard device for learning, its use would have to be very widespread and its acceptance (independent of use) universal. The barriers to that seem to me such that, apart from local initiatives characterized by strong direction and a clear notion of structured goals, this is unlikely to happen on a major scale.

On the other hand, I expect that in the near future other VR environments will emerge, environments whose base culture (the way people interact) will be radically different and much better adapted to learning. And if they are truly adapted to learning one could assume that they just might be adaptable to teaching as well! This would constitute the revolution many of us have been waiting for or even helping to provoke: turning the current educational paradigm - designed for teaching only - on its head.

Learner identity has always been the key issue for me in any learning process. Second Life does two things that I consider suspect: it promotes fantasized identity, possibly inhibiting the natural evolution of real identity, and it reinforces traditionally overblown instructor authority by compounding the manipulative powers and artificial power of "knowledge authorities" through the addition of technological prowess. The culture of instructional intimidation which has been with us for centuries is manifestly still with us today, even on the putatively democratic Social Web!

Although I see this side of things as a step backwards, the contribution of SL to the historical process of putting learning before teaching may be pertinent. It consists of provoking experiments and eventually identifying best practice. It also consists of demonstrating the limits of this type of environment. A little more practice, a little more cultural analysis and a lot more innovation might bring us to a truly useful version of the power of virtual worlds. There are already other initiatives taking place. There's no reason why the virtual cannot pay its respects to the real. For the moment, Second Life is drawing the buzz and the curiosity on the basis of the attraction of fantasy and escape. It may be nothing more than a necessary prelude to something deeper and richer.

Wednesday, May 14, 2008

Why Obama won't make it to the White House

It's pretty clear now that Obama is the only possible Democratic candidate for the Presidency of the US. Any other scenario would be suicidal for the Democratic party, which would permanently alienate African Americans and youth, even more radically than it did in 1968, when it ushered in more than four decades of conservative Republican domination of national politics. The Carter interlude was a fluke due to Watergate and Bill Clinton was elected only because of the presence of Ross Perot on the ballot in 1992. Clinton managed to be re-elected only because Congress was dominated by the Republicans led by Newt Gingrich, whose policies Clinton deftly endorsed. And in spite of that "collaboration", Clinton and the Democrats were humiliated by the farcical impeachment drama.

The political history of the US over the past 60 years can be divided into two parts:

1. the 50s and 60s, when the young generation not only took an interest in politics but sought to revise the values behind political decision-making. The beatniks and a generation of young culturally sensitive intellectuals put their mark on the culture of the Eisenhower years without directly influencing the politics. But their contribution to US culture (poetry, jazz, neo-folk and to a much lesser extent - in political terms - rock'n'roll, which initially played a more conservative role) helped to create the atmosphere in which it was possible to elect the young John Kennedy;

2. the three and a half decades from 1972 onwards, when the wild generation that had been given free rein of expression in the 60s settled down to business as usual and began managing the nation’s and the world’s resources, as the empire invited them to do.

Within two years of the Kennedy assassination, when I was still a teenager at UCLA, it occurred to me that the rapidly emerging hippie movement was the direct result of that event. There were, of course, a number of other contributing factors, not the least of which was British rock’n’roll, which redefined the internal logic of that eminently commercial musical medium. But the elimination of the long-haired pioneer of a New Frontier (Kennedy's hair prefigured the Beatles) was an immense catalyst of unpredictable change. The replacement of the young image-conscious Bostonian Kennedy by Johnson, the power-conscious political insider from Texas, thanks to the hopelessly undemocratic means of an assassination, ultimately achieved its intended effect by pushing the socially structured, collectivist energy and creativity of the young generation back into a more traditional model of rugged individualism, which, after a phase of communal experimentation outside the official polis, could easily evolve into the standard “every man for himself” ideology that is sometimes called “libertarian” (as if it were an actual “political philosophy”) but is essentially reactionary individualism. Tuning in, turning on and dropping out only lasted as a philosophy for a few years, but it produced a major shock during the transition. In the end, when the baby boomers dedicated themselves to securing their individual futures, it was Ayn Rand who had won the culture war thanks to an assassination… followed of course by two others in 1968: those of Martin Luther King and Robert Kennedy.

But let's jump forward in history. In another of those historical paradoxes that are magnified when they happen on US soil, George W Bush has single-handedly created a new political youth movement that has rallied around Barack Obama. The key to this identification is that he incarnates a true anti-war stance and represents an oppressed minority. An unpopular war and the challenge of electing a black to the presidency - recalling both the breaking of the taboo against Catholic presidents and the flowering of the civil rights movement - have combined to recreate an ambience similar to that of the sixties for young people. Hillary Clinton, in contrast to Obama, represents a “repressed majority” that is all the more irrational and unforgiving for not having to endure physical and economic hardship in its daily life, making it harder to “prove” the repression which it wants others to perceive as oppression. This has been the ongoing drama of the radical feminist movement in the US, which has consistently been drawn towards a lobbying mentality and an ideological orientation, preferring various forms of intimidation and moral bullying to focusing on raising civic awareness.

The atmosphere around the Obama campaign is similar to the years between 1960 and 1963 when Kennedy was elected to the White House. The only difference – apart from what may (if permitted) be called the fratricidal feminist rivalry that has not hesitated to tarnish it – is that Obama has not yet been elected and there’s good reason to think that he may not be elected, in spite of the inevitability of his nomination and polls showing that he should easily beat a John McCain who has foolishly (but logically) aligned himself with the most unpopular and risky policies of the Bush administration.

Chief among the foreseeable obstacles to Obama’s effective entry into the White House are 1) character assassination, including racist and anti-Muslim swift-boat-style propaganda and lies, and 2) physical assassination, although this doesn’t seem to be as standard a feature of the power toolkit as it was in the sixties (partly because of the increased difficulty of mounting a hermetically secure conspiracy in our electronically febrile and highly porous Internetworked world). There is of course one other possibility: a form of subtle insider political blackmail that would ensure Obama’s transformation into a safe and interested spokesman for the military-industrial establishment that has been running the show since the Eisenhower years, as Eisenhower himself anxiously pointed out just before leaving office.

All that is idle speculation, of course. What is more easily predictable and far more interesting is the effect that any of the scenarios that imply the brutal scotching of Obama’s hope and change campaign will have on the younger generation. Idealism and optimism have been key triggers of emotion throughout US history and have always been associated with what I would be tempted to call the “good” or feminine patriotism, clearly distinct from the more modern aggressive defense posture, masculine patriotism, based on protecting what has already been acquired, either by the country in its imperial realization of manifest destiny or by individuals as property accumulators, certainly a more accurate term than landowners. The deepest irony of this campaign is that Mr Obama represents feminine patriotism (maternal, supportive, seeking harmony) and Ms Rodham-Clinton represents the masculine patriotism that was easily drawn into the “logic of war”* in Iraq.

It should be remembered that in the US, intelligence – and in particular subtle manifestations of it – is seen as a feminine trait akin to sentimentalism, contrasted with reflex action, preferably ruthless and potentially murderous, as masculine. Hamlet is, of course, an anagram of Thelma! The producers and writers of Last Action Hero with Schwarzenegger deftly reminded us of this cultural décalage (with added irony by presenting the story of Hamlet with images of the androgynous blond Hamlet created by Laurence Olivier and projected in a classroom by a teacher played by Joan Plowright, Olivier’s wife!).

In the West Virginia primary, exit polls showed that, according to the Associated Press, “three-fourths of whites without college degrees were backing Clinton.” Racism was also noticed as part of the pattern, since this profile of poor, uneducated whites (traditionally the butt of the insult, poor white trash), “clinging” (as Obama had already pointed out) to the typically masculine values associated with insecurity and confidence in firearms rather than dogs as man’s best friend, is now considered to be the loyal base of Ms. Clinton, who knows that her best hope of gaining some sort of advantage – beyond seeming to be the masculine candidate – is playing to the fear many people still have of blacks and their need to believe that their own pale skin somehow makes them superior.

In other words, a major drama is now playing out between the masculine and feminine elements in the symbolism of US identity. You could even say that in the country’s very name, the adjective “United” represents the feminine and the plural noun “States” (a collection of autonomous individuals who aggregate only for questions of convenience… and defense) represents the masculine. Furthermore, the two have ultimately proved to be incompatible, a fact that may also be reflected in the evolution of the divorce rate over recent decades. In the red/blue split that has come to epitomize the Bush era, the reds are male and the blues female. Hillary’s 3 a.m. message to the electorate seems to be “better red than dead” as she appears to be willing to “embody” the Bush image of a strong, decisive leader, who may be wrong but will always be strong, rather than the clever juggler of ideas who may be right but will inevitably be the victim of those who are better armed (which is why all good law-abiding citizens should carry concealed weapons – especially in the intellectual space known as a campus – in order to be ready to take down the villainous enemies among us).

In conclusion: if the feminine takes a hit this year, as it did with the assassination of John Kennedy (a man who was totally dependent on women, unlike John Wayne!), something is likely to go awry: the usual premises of social behavior will be wrenched in a different direction. No one can predict how that may play out and who exactly will be involved. But it may once again be a curious cultural alliance of young whites and the black community who will find a way of expressing a spirit of revolt, incarnated in the sixties by Jimi Hendrix at Woodstock, who parodied the extreme masculinity of the Star Spangled Banner. But there may be a great deal less joyful and carefree exuberance than the last time, when a belief in the right to continual prosperity was still in the background. And of course it was that core belief in the right to prosperity that brought things "back to normal" as the hippies left the communes to pursue their careers.

This time around it isn't certain that the belief in conquest and inevitable prosperity is sufficiently present in the background to ensure a safe outcome to the cultural revolution born of sudden disappointment. So the real question for those who will do everything they can to protect the status quo from the risks of a black man who attended too many sermons by Rev. Wright is whether they can somehow avoid having the disappointment seem too brutal. Somehow, I don't think they even care. It is in the spirit of masculine US culture to look for the result and not worry about the consequences. The question is: will the patterns of the past hold true in the future? The answer is likely to be, as it always is: some will and some won't. The question is, which ones will, and with what degree of force?

* “Logic of war” is, I believe, a euphemism invented by François Mitterrand to justify in advance the French participation in the first Gulf war. Although socialists clearly represent the feminine side more than the macho right, in a similar inversion to the Clinton/Obama one, Chirac was clearly a more feminine figure than Mitterrand, who was caricatured as “God the father”.

Saturday, February 09, 2008

Instructional what?



The Big Question on the Learning Circuits blog this month is "Instructional Design - If, When and How Much?"

Instructional Design shares its acronym (ID) with Intelligent Design and deserves to be equally controversial. Anything that works needs to have some principle of design -- whatever the source or agent -- and most would spontaneously agree that design which is intelligent is better than, say, random design.

But is it? Evolutionary theory is about creative adaptation rather than wilful creation for a single or simple-minded purpose, and evolution works because by definition it permanently and constantly takes into account everything in the environment, not just some key factor someone thinks may be the most important*. It would be nice if in the Intelligent Design debate people recognized that "creative adaptation", as the easily recognizable active principle, could, philosophically speaking, admit a number of forces, known and unknown. From an empirical point of view that's actually what we appear to observe. But civilized humans seem to have acquired the habit of referring exclusively or preferentially to what they themselves know, or rather what they have previously theorized, and generally reduce the principle of creative adaptation to a cause or set of causes which they believe explain everything. Whether it’s God (implicitly meaning an anthropomorphic agent with a cosmic drawing-board) or the “selfish gene”, we have a curious taste for seeing the universe as a purpose-driven vehicle and putting a single driver in the car… possibly because we’re more influenced by and admiring of our own mechanical inventions than of the world around us, and assume they are an appropriate model for understanding the natural world. How many of us refuse any form of belief in the common idea that the brain is a super-computer, a belief that persists even when we admit it’s only a metaphor?

All this is to say that we need to recognize that the most effective way to provoke learning is not so much to impose our “instructional intelligence” on the process as to ensure that there is enough spontaneous interaction in the complex learning environment – some of which we can create, but most of which is already there – for the evolutionary process to develop with regard to EVERYTHING in the environment. The Instructional Design debate always seems to boil down to selecting the best formula for getting people to understand something we already know. Doesn’t that in itself indicate a certain form of perversity? It assumes that our knowledge is sufficient and suggests the belief that it is also complete or reasonably complete. And it assumes that there are replicable and equally effective methods for influencing that form of evolution we call learning.

My own instinct is to drop the word “instructional” and replace it by, say, “emerging awareness”, because learning is always a complex set of continuous interactions with the environment. So that would leave us with Emerging Awareness Design, which sounds rather New Age, even L. Ron Hubbardish, and still contains the general idea of Mechanical Control conveyed by Design. So let’s get rid of Design and maybe call it “adaptive strategy”. So that would give us Emerging Awareness Adaptive Strategy, EAAS, which seems a bit cumbersome. To simplify I would suggest moving to a different metaphor and calling it a Learning Game Plan (LGP) since games are by definition a complex set of unpredictable interactions (in that sense Simulations are not really games, but representations of games precisely because they have been “designed” to look like games). ID, including Sims, could still exist alongside LGP, in a totally subordinate role, as a set of preventive and curative tactics within the overall game strategy. But those tactics should not be abstracted from any real environment (the actual game) and imposed as a set model to be applied whenever a specific type of problem or gap occurs. It’s the LGP that will judge and adapt an ID template to the reality of the game.

How can this be achieved? Two answers are possible:

· Through increasingly “intelligent agents” (however, those agents do not come into existence through interaction with the environment, but rather through our own “intelligent” reading of the environment, not quite the same thing).
· Through social interaction, which includes both conscious and unconscious, seen and unseen factors of the environment.

So even if we could build reliable intelligent agents (quite a step forward from e-learning courses), they must be subordinated to the social reality in which people actually learn and adapt. Call it the principle of subsidiarity: all learning artefacts are only potential tools in an adaptive Learning Game Plan.

Coaches in professional sports always start with a game plan but use the actual circumstances of play to evolve it rather than stupidly hoping that the outcome they had foreseen would automatically emerge. Any particular technique used in the course of the game can be seen simply as a tactic producing an event that will inevitably provoke a reaction in the environment (e.g. by the opposing team). Both the coach and the players are learning as they try to execute the game plan, which is itself evolving. They use environmentally adapted versions of techniques they have worked on in practice. And in some cases they discover a “mutation” that allows them to solve a problem posed by the environment in a new way.

Now that is a Design strategy and I would say an Intelligent Design Strategy. That is my “Instruction” for the day.

_____________

* The method of so much “evolutionary psychology” follows this pattern of imposing a brilliant and perfectly “logical” idea of causality on past stages of evolution. The exercise is fun but without knowing and calculating EVERY variable, our theories are about as reliable as a forecast of the weather for the third Tuesday of next month in Biloxi, Mississippi.

Thursday, January 31, 2008

Technology Enhanced Social Learning and post-industrial knowledge development (Part 1)

The industrial model of learning was based on the notion of competitive individual achievement. It implicitly and often explicitly encouraged the hoarding of knowledge, a certain form of passivity (or refusal of interaction) and a suspicion of colleagues and fellow practitioners who may be seen as potential rivals. On the positive side it also encouraged the ambition of leadership, but the practical problems associated with leading a group of individuals tend to diminish the effect of leadership, which is always easier to develop within collaborative teams. If I had the time, I would explore how the ideological individualism of the industrial age – in which individuals existed as resources to be allocated to profitable ventures (this capitalist logic is still the driver of an economy that is now global) – tends to create leaders whose most valuable skills are manipulation and various forms of collective brainwashing (the “science” of Public Relations). The twentieth century provided some stunning examples of masters of manipulation and the new century seems to have numerous examples of “leaders” ready to perpetuate the tradition, though there are a number of reasons for thinking that some kind of change is in the offing. One of the reasons for hope is that there are signs of a significant shift in the way culture (i.e. people’s behaviour) interacts with the economy.


Although the terms have been bandied about for some time, we have only recently entered a phase of transition from the industrial to the post-industrial age in learning, which will be characterized less by the instilling of pre-defined authoritative knowledge than by managing the evolution of the capacity for performance in real contexts of production and social relationships. More than the mere transfer of knowledge, it involves the dynamic creation and restructuring of the myriad things we know (more than what we traditionally think of as “knowledge”), accompanied by the generation of skills rooted in a rapidly changing context of performance. Although many experts and the media have focused on the extraordinary impact of information technology, which increases our ability to store and retrieve knowledge, the technology that has had the most radical impact on learning and indeed on productive behaviour of all types is communication technology, both synchronous and asynchronous. It can of course be directly linked to knowledge creation and retrieval.

Recognizing the true source of knowledge development

The basic reality of human societies is that people learn from each other as complex social beings, not as repositories of bodies of knowledge, but rather as practitioners of professional and social skills that mobilise in non-linear fashion a wide range of forms of knowledge: gesture, attitude, perception, the ability to localize information, complex mental networks of association, visual, acoustic and kinaesthetic memory, etc. The post-industrial age has opened the channels of communication that until very recently were clogged by penury, handicapped by time delays and inhibited by unit costs that no longer exist. Bandwidth has ceased to be a problem; both spoken and written communication can be instantaneous; and the factor of cost has melted into the constantly diminishing price of access to the technology. This opening of the floodgates of communication is a far more radical change than all the advances in programming and storage, however impressed we may still be by tools such as Google Earth. One of the reasons this change is so radical is that, unlike programming and software manipulation, communication is a natural human skill that can be perfected, certainly, but doesn’t need to be formally acquired.

The transition from an individualistic model of knowledge acquisition to the much richer and more dynamic notion of “learning organizations” that create and transform as well as simply consult knowledge has only just begun. Our contexts of work and habitat, our routines and hierarchies, our behavioural expectations in our interactions with others are still modelled on or heavily influenced by the old paradigms, though the pressure is increasing on a daily basis to move towards something far more fluid. The advent of the Web 2.0, the Social Web, has opened up for the first time the possibility of shifting the model for learning away from the traditional institutional framework and the individualistic paradigm towards a model that embraces collectively constructed and shared knowledge.

Peering into the future

The challenge of the new paradigm is organisational, methodological and to a much lesser extent, technical. Software, networks and media must be conducive to easy appropriation and use, but that is already the trend that will undoubtedly continue as technology providers are obliged to make their products more attractive and usable at an increasingly rapid pace. New research will be concerned with design and production of optimally adapted technical environments, but the most urgent requirement – before optimisation can be achieved – is the social and methodological design that can incite organisations to renovate their practices of knowledge development.


The reality of the Social Web can be summarized by the notion of “gregarious exchange”, what Jay Cross usually refers to as “the art of conversation”. Interestingly, if the initial impulse behind Web 2.0 was typically individualistic consumer-orientated behaviour (expressing personal taste in music and entertainment, as well as “sharing” cultural content, legally or illegally!), the Social Web has already evolved into an informal publishing platform of ideas, opinions and personal cultural production. This means that users are spontaneously adhering to a culture in which an optimal balance between input and output will be progressively defined. For the training world, this is a major step forward in the culture of learning. Output becomes visible for the group to profit from, judge and criticize and (according to the wiki model) to evolve and perfect. Where for at least two centuries teachers and trainers have struggled to incite the production of output of even minimal quality, intended for the eyes of the teacher only, spontaneously formed social groups are now producing and learning to refine the quality of their production thanks to self-imposed and group-defined expectations. This constitutes the basis of a revolution in learning methodology whose long-term consequences do not yet seem to have been taken into account by most active decision-makers in the field of education and training.

Observation of current practice on the Social Web shows that while it is usually individuals who act (often driven by the kind of pride and ambition associated with past individualistic educational practices), the key to performance is the creation of groups, usually defined by some common interest or other principle of cultural proximity. At the same time the global nature of Internet-based communication has changed the notion of “proximity” to one that it would be more appropriate to define culturally than geographically. The new geographical spread created by an increasingly “virtual” and therefore global environment that redefines cultural relationships is both a major opportunity to develop stronger relationships (commercial, cultural and even political) across borders and a potential source of discomfort if not disarray. A serious effort is required to define and secure the operating principles and social bearings of these newly acquired input/output reflexes.

Saturday, January 26, 2008

What Saddam was too culturally blind to see

From an Associated Press article that appeared today:
______

"Saddam Hussein allowed the world to believe he had weapons of mass destruction to deter rival Iran and did not think the United States would stage a major invasion, according to an FBI interrogator who questioned the Iraqi leader after his capture."

“He told me he initially miscalculated ... President Bush’s intentions,” said Piro. “He thought the United States would retaliate with the same type of attack as we did in 1998 ... a four-day aerial attack.”
_______

I find this very intriguing and in need of some sort of cultural explanation. How can it be that the head of a government, who had been a close ally of the US for more than 10 years, couldn't see what was obvious to everyone else, to wit, that with or without justification, Bush was the kind of "cultural being" -- a certain style of AmerIcan, imbued with a certain form of AmerIcan values -- who was going to "just do it", Nike style?

Was there a single person in the US -- whether for or against Bush -- who doubted his intentions at any time in the year preceding the invasion? I don't think I knew any. So what led Saddam so far astray that he couldn't see, or even learn from others, what was so patently obvious?

The article doesn't answer this question, but I think the interculturalist community can help to do so. The article does provide a few clues, history a few more and psychoanalysis yet another!

Time and patterns of behaviour would be the first. Almost all political entities, especially democratic ones, remain relatively stable and predictable in spite of changing parties or clans in power. Call it the illusion of democracy (the idea that the people can change things through elections, whereas elections merely serve to ensure continuity) or, rather, the rock-solid logic of representative democracy. The more politicians replace each other, the more their policies remain consistent if not identical, which is the very spirit of "representation", since it's the mob or, to be polite, the "wisdom of crowds" that founds and defends not only general policies but also permanent styles of relationship with other cultures and peoples. There is always leeway for debate, which can on occasion become acrimonious, but the attitudes, within the accepted range, tend to remain stable. (A good example of the dynamics between stability and instability is the current status -- nearing a peak of instability -- of the US attitude concerning Mexicans both as a people and a cultural force. Since no society is capable of defining cultural issues with any sense of rationality - i.e. distance - the range of emotions is wide during periods of instability, but the policy, in spite of numerous "democratic" initiatives, will inevitably tend towards some middle ground.)

The difference between Reagan's, Bush the Elder's and Clinton's foreign policy with regard to Iraq, and indeed everything else, was minimal. From Saddam's point of view, US policy was solidly based on Iran being seen as what might be called the "hub of evil" - that is, before Bush the Younger stretched it into an "axis" that included Iraq and North Korea (quite a geographical spread and, consequently, a clear tip-off to the culturally savvy that something was up). It was impossible for Saddam to think that his well-established strategic role in containing Iran would ever make him vulnerable to anything other than annoying skirmishes. The three previous presidents had used variants of a consistent strategy: Reagan by encouraging Iraq to attack Iran and directly supporting a brutal and aggressive war, Bush I by establishing a "New World Order" that humiliated Iraq for a moment but was careful to show deep respect for Saddam's regime by allowing it to stay in place and refusing to create the inevitable chaos that would come of giving power to a Shiite majority. Clinton predictably "managed" the ensuing state of relative equilibrium, engaging in the occasional skirmish and showing a certain level of satisfaction in the virtual control of Iraq ensured by the policy of US-dominated no-fly zones. Although the ongoing role of Iraq as the antidote to Iran made no sense in terms of political or moral principles, it was a totally rational system created by Reagan, given formal definition by a man named Bush and ultimately inherited by the same Bush's son. Since the "clan" was back in power 20 years on, for Saddam nothing was likely to change radically. Such, in any case, was the likely reasoning of a man hailing from a clan culture. Saddam clearly didn't understand the individualism at the heart of US culture... nor the implications of the Oedipal tradition (which, by the way, everyone in the West still seems to consider universal, following Freud, but whose universalism Lacan -- Freud's most adamant orthodox defender -- called into question after spending some time in Japan, where he claimed the Oedipus complex simply didn't exist. For all his "parisianisme", Lacan was a true interculturalist). Is Oedipus -- the man who killed his father and married his mother -- a purely Western icon? And is Oedipus's self-inflicted blindness the emblem of our own cultural blindness?

On Saddam's perception of the political situation, here is what the article says: "Piro said Saddam also said that he wanted to keep up the illusion that he had the [WMD] program in part because he thought it would deter a likely Iranian invasion."

Clearly part of Saddam's problem was that he didn't have access to the writings of the neo-cons who had invaded the White House. Or perhaps, like the rest of us, he considered those "thinkers" to be an academic lunatic fringe, a kind of sect that everyone tolerates but no one takes seriously. In all cases, he underestimated the possible effects of US individualism, the kind that allows certain personalities to rise to positions of unassailable power, control and a sense of manipulative mission, without relying on well-established social structures to get to their summit. Examples abound in business (Bill Gates, Donald Trump), religion (Jerry Falwell, L. Ron Hubbard, Jim Jones, etc., ad infinitum) and political bureaucracy (J. Edgar Hoover), but public policy had traditionally benefited from the Constitutional checks and balances that prevented similar "achievers" from attaining discretionary power capable of overturning the sense of existing institutions.

Over a 20-year period, however, something actually had changed in US politics, its clearest starting point being the election of Reagan in 1980, accompanied by the slogan "America's back". Pure patriotic emotion and media amplification, rather than morally based reasoning and diplomatic tact, were becoming the new "norm" for political action. But Reagan's own exploitation of it was more electoral than anything else. Hardly a brilliant man or an original thinker, Reagan actually made a point -- perhaps for reasons of personal pride -- of maintaining a tradition of "responsible political reflection", accompanied by a sense of Realpolitik that kept itself at a certain distance from the otherwise useful electoral illusions. Looking back from the new millennium, it may have required a Banana Republic-style presidential election (2000) to provide the catalyst for the definitive adoption of emotion and media amplification as the central source of policy. But other phenomena indicate that the time had perhaps come for pure emotion and the media to play their role in a world where the patterns of the recent past had produced another "grande illusion": the idea, confirmed in the Clinton years, of a permanently expanding economy fueled by stock markets and purely financial management (this in spite of the 2000 dotcom crash, which curiously -- because of the technology theme -- may have further confirmed the dominance of finance over any concrete feature of the "real economy").

Saddam had no perception of any of these changes in the political culture, or of the potential of US individualism to reverse significant trends. In particular, it now appears that he had no idea what Bush the Younger might do, even if after the shock of 9/11 -- and the political manipulation that followed in its wake -- we AmerIcans could see it coming as a virtual certainty and therefore were not surprised by Bush the Younger's New World Disorder, inaugurated in March 2003 with the complicity of Blair and Aznar. And even that monumental political decision played out pretty much according to the pattern of the new stock market: not as a political event to be prepared, negotiated and managed, but as a bet on futures*. A majority of AmerIcans thought at the time that the aggressive "takeover bid" for Saddam's Iraq was a sound operation of strategic market positioning (i.e. the invasion of Iraq as a prelude to throttling Iran and securing a stable supply of oil from the Middle East). The rest of us were treated as irrelevant because the only argument we had was "moral" (and we all know the equation "performance efficiency trumps ethics... except when the law imposes compliance issues", a question recently discussed among our Yahoo interculturalists in relation to diversity training; see as well the Steven Pinker piece in the New York Times Magazine, where, using this sort of implicit logic, he appears to place Bill Gates above Mother Teresa in terms of ethical worth).

Finally, the article offers us the following culturally significant anecdote concerning the first Gulf war:
______

Piro also mentioned Saddam’s revelation during questioning that what pushed him to invade Kuwait in 1990 was a dishonorable swipe at Iraqi women made by the Kuwaiti leader, Sheik Jaber Al Ahmed Al Sabah.

During the buildup to the invasion, Iraq had accused Kuwait of flooding the world market with oil and demanded compensation for oil produced from a disputed area on the border of the two countries.

Piro said that Al Sabah told the foreign minister of Iraq during a discussion aimed at resolving some of those conflicts that “he would not stop doing what he was doing until he turned every Iraqi woman into a $10 prostitute. And that really sealed it for him, to invade Kuwait,” said Piro.

______

That isn't the full story, of course, but it's interesting to note the impact of Al Sabah's threat within Middle Eastern culture. What was Al Sabah thinking? Did he fail to understand Middle Eastern Arab culture? As an oil billionaire functioning within an economy dominated by the US, had he been infected by Texas culture, where such insults are mere expressions of macho bravado, objects of admiration for their rhetorical skill? Or was he so sure of his position with the US that he felt he could afford to proffer such a hurtful insult?

One of the missing parts of the story is that the AmerIcan ambassador at the time had indicated informally that the US wouldn't prevent Iraq from invading Kuwait -- which seemed logical enough to Saddam, given his solid position as the buttress against Iran. For reasons of "global coherence", particularly in the wake of the fall of the Soviet Union, the US changed its attitude and quickly humbled Iraq, but just as quickly re-established a political equilibrium that was, for Saddam, at the limit of the acceptable and seemed, in its way, to guarantee long-term stability. Until, that is, Bin Laden (another US protégé, from the anti-Soviet campaign in Afghanistan) provided an irrational (i.e. emotional) pretext for seeing all Arabs and Muslims as the enemy and set in motion the surreal neo-con game plan of taking over the Middle East to manage it on their enlightened terms (the triumph of the culture of rational economic individualism, for which the entire world has been waiting... manna in the desert).


* Has anyone noticed a major cultural shift in the economic press? The stock market used to be about "investment" and economic value, but the press now routinely talks about "betting" on markets, trends or specific stocks. Recent economic news has been largely financial: the subprime fiasco and this week's Société Générale scandal. Wall Street doesn't yet have a hotel bearing its name in Las Vegas (now that would be a clever new theme to exploit, wouldn't it?), but it does seem to have borrowed its culture from Nevada (or Atlantic City, home of Monopoly as well as of East Coast casinos). Actually, I think a squeamish sense of political correctness would prevent even the cleverest Las Vegas real estate investors (gamblers?) from pointing too clearly to the speculative and greed-oriented nature of the stock market.

Tuesday, January 08, 2008

The "logic" of networking

This could be considered a more focused comment on the main point of my previous post addressing the Big Question for January on Learning Circuits.

Stephen Downes has focused on the vital debate about what he quite rightly calls "the network way of thinking" and provides links to a debate that others have developed about the purpose and mechanics of our emerging networking culture. From an intercultural perspective I find this debate extremely interesting, to be classified in the category of "how individualist cultures grapple with the utterly alien notion of group relationships". My conclusion is that they fail, much as the fictional inhabitants of a two-dimensional world cannot imagine what the world would be like with a third dimension (Edwin Abbott's classic "Flatland", 1884). In failing, they reveal the limitations imposed by their obligatory frames of reference: it's money (markets and production efficiency) or love (family and sex) and nothing else. (Actually there is a third factor: the sharing of obsessions among like-minded consumers and the pursuit of admiration through the mutual perception of the quality of one's tastes.)

For the first time in centuries, the Web has, it seems, raised the question of how our white European civilization, whose recent evolution has been intimately linked to the development of capitalism (the organization and ownership of resources, but also the creation of a value system derived from economics for defining the individual's status and merit), can "use" tools that respond to the fundamental human instinct of relationship building. Given that relationship building has been strenuously repressed for the past three centuries or so as a source of inefficiency (it's even associated with cheating, in the form of nepotism or cronyism) as well as a violation of the principle of the equality of individuals, everyone seems to be in the dark about what relationships are good for and whether there is a legitimate justification for them. The fact that some people are actually making fortunes out of providing software that encourages relationship building has given the concept a new-found prestige. After all, the governing principles of all decision-making nowadays are "if there's money to be made, go for it" and "if it's profitable, it must be good."

I would suggest looking at Asian cultures -- in particular China and India, because of their current political and economic significance -- to discover, first of all, that relationships/networks can be a natural part of both social and economic activity and a fundamental component of the value system; secondly, that they don't require specific communication tools (hardware or software) to exist; and thirdly, that the "economy" of such networks is a subtle mix of efficiency and affect... which means that we in the West inevitably perceive it as messy, unfair and arbitrary. And yet, like Galileo, we have to add... "and yet it moves" (for them, of course, not for us), so maybe it's worth reviewing the model.

My prediction for the next ten years is that we in the West will undergo a major learning experience focused on new models of social relationships/networking. We will discover, and begin to adapt to, what the Chinese call guanxi, a concept somewhere between network and relationship, with multiple behavioral ramifications. With major geo-political shifts taking place against a background of continuing technological change, marked by the growing strength of Asia as a counter to the US economic hegemony of the past 60 years (the US representing the ultimate template of extreme individualism), we may discover -- or even be forced to discover -- the value of paying attention to the way guanxi works. It represents a much more complex model of "engagement" than anything that has come out of our "local" debates (we like to think they're global, but -- and this is a measure of our naïveté -- we are all prisoners of our village culture).

How that "enlightenment" encompassing a new vision of networking will happen nobody can predict. But it's worth knowing that there are other models than, on the one hand, our hopelessly "logical" but poorer than destitute "free markets governed by the actions of rational agents" or, on the other hand, the newly constructed stages for narcissistic activism (Second Life, Twitter).

Friday, January 04, 2008

The paradox of change and learning technology


If Iowa is any indication, 2008 looks to be a year of change, or rather of a growing desire for change. Of course, even when called for by the vox populi, change may not occur, since we should never discount the resources of the powers of resistance -- always equal to the task -- who will find new ways of making sure the status quo maintains its fundamental rights. Pakistan gives us an idea of one tactic for protecting vested interests; a bit brutal, but there are other, more subtle ways that can easily be mobilized in our Western democracies.

So what does politics have to do with changes in the learning scene in 2008? The parallels are worth considering: to wit, the fact that in both domains the methods of the past have quite obviously failed to deliver anything but disappointing results. In politics, the received wisdom was that when there was a problem you carried a big stick, replacing it with a bigger one when necessary, and if you thought it was big enough for the annoyance you faced at any given moment you used it (whether preceded by talking softly or aggressively) to hammer home your favorite truth or doctrine and/or snuff out the enemy. In the field of learning, the stick bore on its rough surface grades, degrees, diplomas and certification, while its hard core consisted of controlling and amassing information (the equivalent of military might) and deploying it in places called, variously, "the classroom", "the training room" or "the learning management system".

But the big stick as the ultimate and unique solution seems to have failed once again, and the choice between increasing its size and calling the premise into question has come to the fore. When that happens, change becomes possible.

On both fronts, some things have visibly changed over the past 12 months, and more is likely to change over the next 12. In 2007, "Web 2.0", the training technology sector's buzzword, was transformed into a slogan and rallying cry. It is now perceived as corresponding to something that may just have a real and tangible impact on our lives. I call this the belief in the "tangible virtual", and it represents the major cultural innovation we're likely to see develop in 2008, although Second Life already pretends to be exactly that (whereas it is merely the graphic illusion of it).

I think two contrasting things will happen:

1) The Social Web as a cultural meme will gain credibility and draw towards it a sufficient number of users -- aware and unaware of the new culture they are associating with -- to validate at least the idea that it is a desirable general feature of the global environment (and this will be true even in the developing world, where it is less present but ultimately more promising in terms of its transformative power and of the human services that may for the first time be providable, if not yet provided).

2) The myth that consists of thinking that the Social Web is authentically social will begin to be deflated, creating a desire for a truly social web -- which we won't be tempted to call 3.0, because a truly social web doesn't need to be "semantic" (everyone seems convinced that semantics will be the key feature of the "appellation contrôlée" of Web 3.0; a minimal sketch of what that machinery amounts to follows this list). The true Social Web (don't count on seeing it before 2012) won't be defined by software but rather by human behavior. I prefer to think of it in terms of the Chinese concept of relationship and would call it the Guanxi Web. But we'll have to learn a lot more about the way the Chinese do things before we get there. Or, alternatively, wait for them to create it and follow in their footsteps (would our pride of technology leaders stand for that?).
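For readers wondering what the "semantic" in Web 3.0 talk actually refers to, here is a minimal, purely illustrative sketch -- my own, not anyone's specification: machine-readable statements about resources, expressed as subject-predicate-object triples. The URIs and people below are invented (foaf:knows is a real predicate from the FOAF vocabulary, but everything else is hypothetical).

```python
# A purely illustrative sketch of "semantic" data: RDF-style triples.
# The URIs and people are made up; foaf:knows is a real FOAF predicate,
# but this is a toy model, not an implementation of any standard.
triples = [
    ("http://example.org/alice", "foaf:name", "Alice"),
    ("http://example.org/alice", "foaf:knows", "http://example.org/bob"),
    ("http://example.org/bob", "foaf:name", "Bob"),
]

def who_knows(person_uri, triples):
    """List everyone the given person 'knows', according to the triples."""
    return [obj for subj, pred, obj in triples
            if subj == person_uri and pred == "foaf:knows"]

print(who_knows("http://example.org/alice", triples))
# -> ['http://example.org/bob']: a machine-readable link, not a relationship.
```

Note what such a triple can and cannot capture: it records that a link exists, and nothing of the reciprocity, obligation or affect that guanxi implies -- which is precisely why "semantic" and "social" should not be confused.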

In the more immediate future, starting this year, we will begin to understand that the relationship between what we now call the "social web" and human social interaction is as tenuous as the relationship we once imagined between the artificial concept of "e-Learning" and actual human knowledge development -- a disjunct that took us nearly ten years to comprehend.

As soon as we realize, some time later this year, that everything we marvel at for being "the tangible virtual" is little more than an intriguing oxymoron (i.e. a poetic illusion), we will discover a need for something more appropriately called the "tangible real", accompanied by a tangible virtual subtext, opening the gates to a new type of creativity rooted in the desire for the real rather than the desire to escape it. The historically minded may already have noticed that the tangible real coupled with the tangible virtual has been missing from our civilization since at least the 18th century, a period in which taverns, coffee houses, theatres and salons still actually encouraged people both to define and to adapt to an intellectual environment created in common but spread and shared far beyond the local. The social intellect was subsequently corralled into universities as formal education which, after laying the bricks of its buildings, put the brakes on social learning and any kind of authentically intelligent (rather than merely intellectual) culture. This model needs to be dismantled and replaced, but don't count on Starbucks or Second Life to take us there! In 2008 we will begin to see that Second Life is more like Second Wife: an object of fantasy (power and libido) that momentarily fulfills the individual while stifling communities, abolishing what is genuinely common or relegating it to the background. Second Life could be compared to an inflatable doll we fill with our own hot vapors (isn't that literally what we do with avatars?). Of course it serves a purpose in our global economy and culture (just as spam does) and so will continue to survive, but I don't believe it defines our future in any serious way.

In conclusion, if this is truly a period of change, as I think it is, that means we will be changing not only our ways of doing things but also our ways of thinking about change itself -- and that applies to everyone, including experts, thought leaders and fortune-tellers. We'd all like to be right in our predictions, but if we are tending towards something truly social, the result simply won't resemble anything we individuals can imagine, however good we may be at analysing trends, mainly because we all tend to reason like bankers, in terms of linear curves, or like market analysts, in terms of product life cycles. After all, some of us still remember John Chambers' "rounding error", which with hindsight should be a sobering reminder of the value of "informed forecasting".
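To make that last contrast concrete, here is a toy sketch -- entirely my own illustration, with invented numbers: the "banker" extrapolates a straight line from recent increments, while the "market analyst" assumes an S-shaped product life cycle that saturates at a ceiling. Both are mechanical, and the further out you look, the more they disagree, which is rather the point about the limits of "informed forecasting".

```python
# Toy illustration with invented numbers: two mechanical ways of forecasting.
import math

observed = [10, 20, 40]  # hypothetical adopters (millions) in years 1-3

def linear_forecast(data, year):
    """The 'banker' curve: extend the average observed increment forever."""
    step = (data[-1] - data[0]) / (len(data) - 1)
    return data[-1] + step * (year - len(data))

def logistic_forecast(year, ceiling=100.0, midpoint=4.0, rate=0.9):
    """The 'market analyst' curve: an S-shaped life cycle that saturates."""
    return ceiling / (1 + math.exp(-rate * (year - midpoint)))

for year in (4, 6, 10):
    print(year, round(linear_forecast(observed, year)),
          round(logistic_forecast(year)))
# By year 10 the straight line says 145 while the life cycle flattens near 100.
# Neither model knows which world it is in -- hence the "rounding errors".
```

Neither curve, of course, can anticipate anything genuinely social; that is exactly the blind spot the paragraph above describes.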