Welcome to the 2011 Through the Looking Glass awards, the Anger Is An Energy edition. In the year 2011 the whole Earth shook, sending shock waves in all directions. We don’t mean to imply that the seismic shifts were of equal magnitude: not every violent disturbance registered the same on our Richter scale.
Indeed, people could be seen making mountains out of molehills while other upheavals were genuinely groundbreaking.
Time magazine named ‘the protester’ 2011’s Person of the Year for a reason. Anger helped to shatter public images and shared illusions (amongst other things).
We will therefore need to go through the looking glass to pick up the pieces. The goal is to examine the shards of our own reflection. We shall also be holding up placards for our own reasons. If you feel angered by what you see – by the fragmented approach and ragged edges – then good. We could use that energy! So please feel free to get into the spirit of things and rage against your machines.
Anger Management Award
Charlie Sheen’s mad energy has always been part of his mass appeal. The former movie star’s reputation preceded him, and he arguably became most famous for acting out behind the scenes. Television tried to domesticate and contain the ‘bad boy’ by building a family sitcom around him. The reason for Two and a Half Men’s incredible success was there for all the world to see: audiences could live vicariously through its ticking time bomb lead.
It was seemingly a win-win situation – television could capitalize on his notorious actions off screen and the actor would reap the rewards in his private life: more money, sex and drugs than any “Vatican warlock assassin” with “tiger blood and Adonis DNA” could possibly ask for. Corporate media was able to manage our expectations through positive reinforcement and enabling.
It was only a matter of time, however, before the bomb would go off in everyone’s faces. It’s no wonder that Sheen was dazed and confused when CBS ceased production so he could get his act together. After all, behaving like a complete dick was tacitly written into our social contract with him. An angry Sheen began airing his grievances in public, and asked viewers in an open letter to TMZ “to walk with me side-by-side as we march up the steps of justice to right this unconscionable wrong”. When the day of rage failed to materialize, audiences found something else to entertain themselves with during the show’s forced hiatus: a troubled man breaking down across media platforms.
Sheen put on quite the show, and appeared to be losing the plot before our very eyes. Media outlets were able to capitalize on the situation by reverting to their preferred social script: reducing a troubling public display to mere entertainment value. We all pretended to be shocked – shocked we say! – that an increasingly manic Sheen was able to entertain us with even more self-destructive behavior. Indeed, we welcomed our ringside seats to the man wrestling with his own demons. Some of us even bought tickets to stand in Sheen’s corner so as to be seen rubbing him the wrong way.
It’s difficult to know which was more entertaining – a famous actor appearing to sabotage a successful television career or a mere mortal talking about himself as if he were superhuman. It was equally difficult trying to determine who was the real substance abuser in a culture of addiction – an audience that couldn’t get enough of the drug called Charlie Sheen or the actor with a series of addictive behaviors getting off on all the media attention, too.
Sheen’s meltdown across multimedia platforms (radio, television, stage, print, internet, etc) certainly mirrored viewing habits in consumer societies. The relentless media attention served no purpose other than to feed our habit of turning troubling displays into cheap laughs, ensuring our dependency on other valued entertainments (talkback radio, television shows, etc). As he observed on Piers Morgan Tonight, “It’s been a tsunami of media, and I’ve been riding it on a mercury surfboard”.
Speculating whether Sheen was mentally ill or a clever performance artist was beside the point: the line between person and performer had long been blurred by the role he was already playing in our own psyches. As if to prove the cycle of co-dependency, Sheen’s rage has since been co-opted by the entertainment industry. The fired actor might as well have been auditioning for his next project: a television remake of the comedy Anger Management.
Friday On Our Mind Award
Charlie Sheen is living proof that fame can be a ‘monster’ (to quote renowned philosopher Lady Gaga). The ‘fame monster’ can swallow people whole and produce its own demon spawn. It’s worth noting that Gaga hasn’t exactly got out the torches and pitchforks – that’s one ensemble that conveniently eludes her. Instead of forming an angry mob to track ‘the fame’ down, she wants impressionable young girls to “walk around delusional about how great they can be and to fight so hard for it that the lie becomes the truth”. There was no fighting off, however, the anger directed at the ‘little monster’ Gaga branded a genius. Indeed, this ‘monster’s’ claim to fame primarily consisted in angering millions of people.
We are referring, of course, to the 13-year-old girl who came out of nowhere to topple Charlie Sheen as a trending topic in early 2011. Indeed, someone purporting to be Charlie Sheen tweeted “Dear Rebecca Black. We don’t hate you because you’re famous; you’re famous because we hate you. Sincerely, everyone”. Perhaps what is most telling is the way the angry mob was revealed to be the true monster – it was our own hatred that fed ‘the fame’ and let “Friday” run amok online.
One of the ironies is that the aspiring singer’s mother funded the upload to teach her daughter the downside of a potential music career. She wanted to show Rebecca that being famous was harder than it looked – it was more blood, sweat and tears than living a glamorous life. The other irony is that the music video was written and produced by a vanity label calling itself the Ark Music Factory, a company “based on the idea of Noah’s Ark” in that it was “a place to gather people, where they could be safe” (“Friday On His Mind”, 29 March, 2011). Since the videos were made for a small circle of people, no one expected the gathering shit storm.
“Friday” inadvertently became a meme for ineptitude. Although the vanity project was not made for mass consumption, “Friday” unintentionally parodied the songs choking our airwaves. “Friday” was built around a shopping list of musical clichés: autotuned vocals, banal lyrics, anonymous beats and a gratuitous rap to feign street cred. The fact that the (admittedly) catchy song resembled a sing-along from Sesame Street merely heightened audience incredulity. The literal visuals underscored the song’s shortcomings, throwing the entire project into (comic) relief. The funniest thing about the music video, though, was the hysterical reaction to it. The glorified home video was somehow mistaken for the real thing.
While there’s no denying that “Friday” was a terrible pastiche of professional recordings, the record simultaneously brought out the worst in other people, too. Apart from the relentless ridicule, a 13-year-old girl had to contend with hate mail and death threats. As the impressionable young singer noted, one hateful message struck a particular chord with her: an anonymous music critic wished self-mutilation, an eating disorder and death upon her. Such a hateful response sums up what happens when an aspiring performer fails to entertain us in the ‘right’ way: we’ll share the pain and entertain each other with our own vitriol.
We All Carry the Virus
Instead of putting the song in quarantine, we somehow tried to inoculate ourselves by putting it on high rotation. It was our supposed resistance that helped turn “Friday” into a viral phenomenon. It didn’t seem to occur to millions of internet users that they were the carriers of this ‘virus’ and were choosing to expose each other to it. Like all virulent strains, the outrage that was “Friday” required a susceptible host and recipient in order for it to spread online.
Perhaps we should really have been hating on ourselves for allowing “Friday” to catch on and occupy our minds. As millions of people bemoaned the lack of character and originality (amongst other things) in pop music, they similarly followed the established trends of hopping onto hate bandwagons and/or cyber bullying in order to feel more in tune with one another. Our own movements invariably raise the question: why did we think it was so important to band together and call for a public beat down on a deluded young girl?
There But For the Grace of God (Go I) Award
The oceanic feeling is said to be the source of all religious belief. It’s the feeling of being connected to something deeper and limitless. To some extent, the story of Noah’s Ark plumbs the depths of the sensation of an indissoluble bond. The Judeo-Christian tradition purportedly feels a connection to the ‘limitless’ through the wrath of God. The moral of the story itself isn’t too deep: while much of the Earth was supposedly swept away in a deluge, a select few were saved and gathered together to start the rinse cycle all over again.
The 2011 Tohoku tsunami therefore bore witness to a holier-than-thou attitude. Oceanic feelings also rose to the surface, and an earthquake off the coast of Japan was seen as a latter-day moral cleansing. Seeking the high moral ground, many people uncharitably viewed the tragedy through the lens of self-aggrandizing beliefs. The groundswell of bad feeling invariably revealed the fault-line in their own sensibilities. The 2011 tsunami was supposedly (say) karmic payback for the attack on Pearl Harbor in 1941, or God’s way of punishing Japan for its contemporary whaling industry. The captain of Sea Shepherd even wrote a poem to commemorate the “fearful wrath” of a sea god: Neptune “angrily smote the deep seabed floor” with his spear and the “shore echoed mankind’s cry”. It wasn’t just individuals trolling the Net that shared these views – some prominent people gave expression to such deeply felt sentiments too.
According to conservative radio host Glenn Beck, the tsunami was more than a random natural disaster. The Japanese tragedy was really a “message being sent” from an angry God to the rest of the world. Everyone should therefore stop “doing the things” that make God angry and “buckle up” for what’s going to be a “bumpy ride” for the rest of humanity. Tokyo’s Governor also saw the writing on the wall and insensitively observed “the Japanese people must take advantage of this tsunami as a means of washing away their selfish greed. I really do think this is divine punishment” and a wake-up call to a grieving and shell-shocked nation. Through the Looking Glass doesn’t mean to imply that sanctimony and fear mongering were the default responses – an outpouring of deeper emotions occurred as people around the world reached out to a country in dire straits.
There was one clear message being sent on that fateful day: modern technology is now able to render apocalyptic images in real time. We might not have been witnessing judgement day, but the live footage was a complete revelation. Anyone watching the events unfold was ideally left humbled and speechless. The immediacy of the footage itself speaks volumes, and manages to convey the scale of an escalating natural disaster. Humankind was reduced to a powerless spectator, and civilization revealed as the most fragile of artifices. There were at least two things to observe about the voluminous raw footage. It appears to be second nature for embattled humans to reach for readily available video devices as the world literally comes rushing towards them. Equally natural is humankind’s willingness to rush to shared footage as the world falls apart before their very eyes. People who found themselves in the middle of an unfolding disaster took it upon themselves to become amateur reporters, and piecemeal reports managed to capture a much bigger picture.
Theatre of War Award
The indefeasible power of the ocean – in the symbolic form of Operation Neptune’s Spear – evoked other feelings in 2011. We are talking about, of course, the Navy SEALs’ hunting down and killing of the notoriously elusive Osama Bin Laden. The ocean even staked a claim on his bullet-riddled corpse: the mass murderer was unceremoniously dumped into the sea to ensure that he would be sleeping with (and feeding) the fishes. Bin Laden’s summary execution proved to be a cleansing ritual that helped bring closure on the tenth anniversary of September 11. His timely death, fueled by our unrelenting outrage, managed to satisfy our own bloodlust.
As the stage-managed news of Bin Laden’s death confirmed, truth invariably becomes the first spoil of war. Indeed, the White House sought to justify the execution of Bin Laden by lending a visual power to its own storytelling. Following the administration’s lead, news outlets were similarly thinking about the raid from a “visual perspective” in order to show “how false his narrative has been over the years”. Conventional narrative tropes — good versus evil, heroes and villains, happy endings, etc — and poetic license were amongst the White House’s weapons of choice. The media’s eagerness to reproduce the official story confirms that there’s a thin line between news and entertainment. News outlets filtered the story through the lens of a Hollywood movie, ensuring that Bin Laden’s death could entertain people in the home theater.
We were originally led to believe that Bin Laden was captured and killed during a fierce firefight. A supposedly austere and fearless warrior was reported to be living in the lap of luxury and hiding behind women’s skirts when he was not masturbating to porn. Indeed, the Western media helped narrate a series of events that would bring audiences in the home theater to a dramatic climax. The official story exhibited a classic three-act structure and was characterized by exciting shifts in tone and location. News reports immediately established the main character (USA! USA!), genre (action-adventure) and complicating incident that would culminate in catharsis: bringing a fugitive to justice during a final showdown.
Competing media outlets understandably repeated the story verbatim — any attempt at independent verification would disrupt the breaking of news and give their competitors an unfair advantage. It’s certainly true that some outlets eventually cast doubt on the official narrative. The mass media, however, gave much less attention to the change in details: there were no comparable newsflashes when it became apparent that the Navy SEALs met with next to no resistance and murdered a modest household of (mostly) defenseless men. Few wanted the truth to get in the way of a good story.
The ending to Bin Laden’s story reportedly brought into play two major themes — the moral fortitude of the West (an ability to endure adversity with persistence and courage) and the moral vindication of its war on terror (good triumphing over evil in a final showdown). The official story thereby highlighted the West’s supremacy on the world stage while throwing its avowed enemy into the dustbin of history. Indeed, America proved to be so high and mighty that even participating in a show trial in an international court of law was beneath it. The targeted demographic was also given a role to play on the world’s stage and became part of the theater of war’s mise en scène. There remains a problem, however, when viewing justice in such dramatic terms – especially when many of us ended up publicly celebrating the murder of other people.
Some Are Holier than Thou
Specifically, the dynamics of an action-adventure — and the corresponding logic of an ‘eye for an eye’ — blinded us to our own moral distinctions. It encouraged us to see justice in terms of vengeance, and the resulting death/s succeeded in eliciting a questionable pleasure response from consumers. As importantly, such a characterization of events discouraged many Westerners from asking some very troubling questions — like why Bin Laden had grown so angry with America in the first place – making it difficult to break the cycle when implicated in such a vicious circle. Since we viewed his death through the lens of an action movie, the concern is that there might be inevitable (or unintended) sequels. To cut a long story short, the narrative we chose to tell ourselves ended up telling an entirely different story, too: it made us look like an enemy we claim to be morally distinct from.
R.I.P. Award
Two larger than life men died in 2011 – figures that loomed so large on the cultural landscape that not even the grim reaper could cut through the crap. While Steve Jobs and Christopher Hitchens lived very different lives, they did share some things in common. They were both angry men who tried to create the world in their own image, and both succumbed to cancer at a relatively early age. There were also widespread attempts to canonize them when they passed away – which only goes to show that their reality distortion fields had also spread far and wide.
According to Steve Jobs’ 2005 Stanford commencement address, “death is the destination we all share” and “is the single best invention of life. It cleans out the old to make way for the new”. Indeed, death appears to have been invented just so people can say extraordinary things about you. It certainly came as a surprise to learn that death wasn’t invented by Steve Jobs, or even an Apple trademark. Nonetheless, it obviously helps if you were the head of the largest publicly traded corporation in the world by market capitalization – corporate media will form long queues just so they can also profit off you. Indeed, when Jobs died, the media fell over itself to sing the praises of a world class asshole – he was packaged and sold as a beloved genius who was somehow single-handedly responsible for many of the creations that helped change our lives.
Never mind that many of Jobs’ new inventions had established precedents, and that his real genius consisted in tweaking and capitalizing on other people’s creations. Fame and fortune had their perks in death too, especially when you were the major shareholder (and public face) of a multinational corporation employing some of the best talent money could buy. At least that way you could surround yourself with brilliant people and be officially given credit for their work, as well. Perhaps what is most telling about the media spin was the fact that news outlets had clearly been drinking the Kool-Aid in Apple’s Genius Bar, too.
Part of the reason for all the hyperbole, of course, is that Jobs had helped us interface with a brave new world. It’s no accident that he was on the cover of Time magazine so many times – the charismatic CEO had become one of the faces of the computer age, and genuinely believed in the transformative powers of new technology. Computers weren’t just ‘product’ or ‘devices’ to him: they could be seen as an extension of our own personalities, too.
The ‘I’ (pod, pad, tunes, etc.) had become its own meme, or a self-replicating unit of meaning. Individuals could therefore create and traverse shape-shifting worlds, interacting with (aesthetic) objects as if they were an integral part of the design. Jobs infused passion into new commodities, and Apple products provided their own arguments from teleology. Perhaps that’s why Jobs spoke like a God lording over his creation. Design “is the fundamental soul of a man-made creation that ends up expressing itself in successive outer layers…(so that)…each element plays together” and can be integrated into our whole lives (James B. Stewart, “How Jobs Put Passion Into Products”, 7 October, 2011).
The Apple creator had, of course, a near-death experience once before – he was fired from the company that was amongst the first to spread the gospel about computers, and his ‘second coming’ ensured his place in the pantheon. The rise, fall and rise again of Steve Jobs, nonetheless, raises its own questions about the meaning (or telos) of ‘design’. We ask the following questions not to impugn his accomplishments, but to follow Jobs’ own lead and similarly incorporate the issue of holism (the relation between part and whole) into the trajectory of our own lives. Specifically,
1. How was it possible for a supposedly spiritual person (or Buddhist) to align himself with the values of consumerism and/or help fuse our own identities with material objects and constant cravings?
2. How could Jobs – a person who succeeded because he was interpersonally exploitative and was able to convince other people to see the world through his eyes – not recognize the self-refutation in insisting that other “I’s” not waste their time “living someone else’s life” or “with the results of other people’s thinking”?
3. What are we to make of a) an individual who accumulated so much wealth and power, but (unlike other inspirational figures) was reportedly uninterested in sharing his good fortune with the less fortunate or b) a situation where Jobs’ social standing (and our own pleasure and status seeking) remains implicated in the suffering and exploitation of others?
4. Given that Jobs believed in “karma” and “destiny”, what does his premature death say (if anything) about his life choices and/or Jobs’ own place in the grand design of things?
There was a time when Christopher Hitchens might have been interested in asking such questions. The former left-wing critic liked nothing more than to take on sacred cows, ruffle people’s feathers and provide a corrective to received opinion. Since September 11, however, one of the most terrifying rhetoricians that the world has yet seen had more important things to worry about: like waging war against the enemies of Western civilization and cutting God down to size.
When Hitchens lost the battle to cancer, it was clear how highly many people regarded him: many colleagues made a point of raising their glasses to their former drinking companion and cast him in the most flattering of lights. There were at least two ironies to be seen in the loving afterglow. The first is that Hitchens was unlikely to approve of a media shielding people from unpalatable truths. Indeed, their “safety-first version of public opinion” merely obscured our view of an increasingly dangerous public figure (to quote Christopher Hitchens himself, “Truth and Consequences”, 18 February, 2008). The other irony is that the famed atheist was also famous for his holier-than-thou attitude. Hitchens might not have believed in a God, but he certainly liked to think in absolutes and with unquestionable moral authority. Indeed, his proselytizing for the Iraq war and/or increasing religious intolerance cannot simply be cast aside – they throw into question the status of his own legacy.
The Cultural Landscape as a Metaphorical Killing Field
People might have been shocked and awed to read that Hitchens (like Bin Laden) felt exhilaration when two planes hit the Twin Towers on September 11 and (like Bin Laden) saw the acts as answering a higher calling – 9/11 was the means by which to finally engage perceived enemies in an endless war from the trenches of his writing desk. Mass murder had become the occasion for committing mass murder in turn, but it was for a much better cause of course: exporting secular democracy to the Middle East under false pretenses. Indeed, it had become a good time for war, and the use of cluster bombs on identifiable enemies had a heartening effect upon him.
But wait, there’s more bloodlust and fanaticism in the name of the greater good: widely banned cluster bombs were morally defensible in this war because they could go “straight through somebody and out the other side and through somebody else. And if they’re bearing a Koran over their heart, it’ll go straight through that too” (Adam Shatz, “The Left and 9/11”, 23 September, 2002). Instead of continuing to speak truth to power, Hitchens increasingly spoke to power on the grounds that the ‘liberation’ and occupation of Iraq justified the rising death toll and civilian casualties. We were finally seeing the “tree of liberty being watered in the traditional manner” – with other people’s bloodshed (Christopher Hitchens, “Liberal Hawks Reconsider the Iraq War”, 13 January, 2004).
Hitchens’ argument with God merely revealed his own grandiose thinking: it was narcissism and zealotry writ large. While his attempt to make the world reflect a more rational ‘design’ was not without justification (or irony), such arrangements traditionally had their own boundaries and limits. Indeed, the higher faculty of reason was ideally not to be used to sanctify favored positions or falsify (negate, diminish) alternate viewpoints when more charitable approaches abound. Instead of adopting the principle of charity to ensure our shared humanity, (his) reason became a poisoned chalice to be passed around.
It’s no coincidence that the hitchslap has claimed many victims and will continue to be an enjoyable spectator sport – Hitchens’ gladiatorial approach satisfies the widespread desire to turn our own cultural landscape into a metaphorical killing field. Consequently, he appeared to believe his own publicity and encouraged others to worship at the altar of idolatry. Perhaps that’s why he tended to pick his targets and audiences – unlike (say) the famous debate between Father Copleston and Bertrand Russell, our defender of civilization seemed incapable of having a civil conversation about the existence of God (amongst other things). And therein lies the problem with an increasingly combative and imperial Hitchens’ ‘terrifying’ rhetoric: like his enemies, Hitchens used the power of reason to exalt himself and/or build temples to self-serving beliefs. And not unlike Charlie Sheen, he was all about the ‘winning’ – he might ‘win here’ and he might ‘win there’, and somehow believed that small-minded victories would make a provincial world view omnipresent. Reason had become an instrument of violence and colonization for him, and the ‘force’ of his arguments had little to do with encouraging others to participate in an ongoing dialogue or strengthening their positions. Indeed, Hitchens’ intolerance of competing and/or ‘weaker’ views throws into question his commitment to the homegrown democracies he wanted to export.
First Bloom of Democracy Award
“In the garden, growth has its seasons. First comes spring and summer, and then we have fall and winter. And then we get spring and summer again”. No, these are not the lyrics to the new Rebecca Black single. They’re banal words famously mistaken for illuminating advice in Being There. As our idiot savant Chance Gardner reassures us: “There will (always) be growth in the spring!” Perhaps what is most illuminating about the advice given is the way it was taken. The meaning of Chance’s words grew out of the consciousness of observers, and people merely saw what they wanted to see.
Unfortunately, such simple-mindedness (and opportunism) could be seen in the West’s response to upheavals across North Africa and the Middle East. It’s true, of course, that the Arab uprisings officially had their roots in an argument over the sale of fresh produce. Mel Brooks’ The Producers, however, had already established that ‘springtime’ was a laughable political concept, one deserving of its bad rap in To Be or Not To Be. Western media outlets nonetheless jumped on the bandwagon and ‘branded’ the waves of anger the Arab Spring. The images of Arab people calling for more equality and freedom were obviously seen as a sign of growth in the West. The ‘power of the people’ was taken as vindication of Western values and civilization. It’s not by chance, then, that the Western ‘brand’ had two popular labels: the Facebook and Twitter revolutions.
While recent studies readily confirm the role social media actively played in planting the trees of liberty, many Westerners failed to notice something staring them in the face. The Arab people were already in the process of changing – otherwise it wouldn’t have been possible to cultivate anger online in the first place. Facebook and Twitter registered shifts that had been occurring offline for years. The use of social media fails to explain, for example, why online activism failed to bring about the anticipated spring in Iran’s 2009 Presidential election or why an internet-savvy Bahrain remains rooted in a winter of discontent. The disparate 2011 ‘revolutions’ (however fruitful) had their roots in the more fertile ground of culturally specific dynamics and issues.
As SourceWatch documents, however, the popular brand name predates the 2011 uprisings; namely, ‘Arab Spring’ coincides with the American-led invasion of Iraq, and the subsequent spring cleaning (sic) speaks more to Western aspirations and policies. The reality is that a freedom-loving West was helping to prop up repressive regimes like Iraq, Egypt and Saudi Arabia in the first place. A coalition of powerful democratic nations had been undermining ‘the people’ and their ‘power’ until (or unless) it suited its purposes. A complicit free press was equally disinclined to tell the story of ‘the people’s’ daily struggles and aspirations – which is presumably why the Western media appeared to be similarly caught off guard by the uprisings.
This is not to suggest that the branding should be viewed with complete skepticism. It’s to caution against the Orientalism that continues to inform the West’s selective and/or wishful thinking. We therefore need to resist the tendency to view a wide spectrum of people as the same being there.
That Most Elusive of Ideals
The images that captivated the world, then, failed to convey the bigger picture: what might be going on behind the scenes or within the hearts of the different people similarly crying freedom. While the footage might have suggested a united front, the seemingly uniform voices failed to speak to the specific needs of distinct people across a complex and volatile region. Religious, ethnic and tribal differences temporarily merged with the crowds, and these were bound to emerge again once ‘the people’ returned to their homes or took up arms.
The footage therefore didn’t capture the various groups that would be jockeying for position and/or individuals falling by the wayside in the inevitable power struggles. Indeed, our preference for the wide shot, crowd scenes and English speakers only managed to bring one thing into focus: calls for (more) freedom, equality and justice could only make sense within the context of people’s lives. Witness the recent democratic elections in Tunisia and Egypt: the principle of ‘majority rule’ converged around identity politics, or the question of how to relate to the roots of their own culture. Specifically, ‘the (Arab) people’ chose not to vote for Western (i.e., liberal or secular) values – the majority ruled in favor of their own brand of Islam.
Flowers of Democracy Award: Stems 1 and 2
Democracy is arguably the most elusive of ideals. The ‘rule of the people’ ideally equates to equality of opportunity and the pursuit of individual freedoms. The year 2011 publicly confirmed, however, that not everyone is created equal or is equally free to pursue their dreams. Racial and class differences broke through to the surface, reminding skeptics that democracy appears incapable of leveling the playing field. Many skeptics already know, of course, that corporations and interest groups are the ones with the real power – vested interests game the system and are not interested in transferring power (back) to the people. ‘The people’ might be given the freedom to vote during democratic elections, but their participation seems to consist in being manipulated and controlled by their own representatives.
There were two acts of protest that highlighted feelings of powerlessness in the West: the 2011 English riots and the international Occupy movements. We shall award these protests separately, and view them as two sides of a coin that gained currency in the “Arab Spring”™.
Free At Last! Award
The ‘Arab Spring’™ reportedly grew out of a dispute over fruit and vegetables: a poor street vendor (Mohamed Bouazizi) took his own life in front of a police station when he was told that he didn’t have a permit to support his family. In England, other symbols of freedom came to fruition: thousands of people gave themselves permission to steal trainers, iPods and flat screen TVs after a black man (Mark Duggan) was shot dead by the police. Indeed, the English riots proved to be a real free-for-all as rioters united to “liberate the exploited consumer goods of capitalism from their unjustly priced confinement” (“British Freedom Rioters Liberate TVs, Shoes and Computers”, 10 August, 2011).
It’s worth noting that the initial reaction to Duggan’s death was not to embark on a crime wave. The first public response was a peaceful protest seeking a dialogue and answers. There was an attempt to hold the police to account for what was felt to be routine criminal behavior: unjustified police harassment and unpunished deaths in police custody. When the community wasn’t satisfied with the answers it got, ‘the people’ took the law into their own hands.
The first English riot grew out of an inflammatory situation: the Tottenham area appeared to spontaneously combust and the outrage spread to other areas like wildfire. Perhaps what’s most telling about the anger is the way it was publicly expressed: community anger found expression in status seeking and redressing the status quo. The rioters not only sought out and stole valued consumer goods, they acted out against the stores that sold them (amongst other properties). While it’s difficult to see the riots as a political protest, identity politics was implicated in the rioters’ defiant behavior. Aptly described as the consumer society riots, the four-day rampage may also be compared to a prison riot. This obviously raises the questions: why did these people feel imprisoned by their circumstances, and what was so liberating about such public acts of defiance?
Most of the rioters might have come from working class (and unemployed) areas, but a sense of entitlement appears to know no social bounds. The rioters’ identities were clearly wrapped up in the values of consumerism, which were utilized to enfranchise their characters. Shopping trolleys and carry bags were amongst the rioters’ weapons of choice: it suddenly became necessary to help themselves to non-essential items. The 2011 English riots were a free and unregulated marketplace at its most blatant and civic minded: people competed with each other in places where social value was primarily determined by the principles of supply and demand. Some of the more priceless stories include rioters queuing up to loot retail outlets and exchanging stolen goods with each other. It wasn’t always so ‘civil’ of course – looters occasionally resorted to mugging each other, too.
Indeed, the source of their perceived power appeared to be located in the freedom to make informed consumer choices: which items were worth stealing or would help to brand their individuality? Groupthink remained the order of the day, though – many people gave themselves license to steal (and kill) because many other people could be seen stealing, too. Perhaps that’s why so many of the rioters were so brazen and made little attempt to conceal their (physical) identities: when everyone seems to be doing it, is anyone really (or equally) responsible?
The British government continued to lead by example by similarly passing the buck. It blamed the riots on a “moral collapse” within a “broken society”, and declared “all out war on gangs and gang culture” (“Broken Society a Top Priority”, 15 August, 2011). Instead of acknowledging the riots’ root causes – as being symptomatic of morally bankrupt economic policies or as deviant insofar as the rioting conformed to and perpetuated a socially sanctioned consciousness – the government responded by casting aspersions on the moral character of its own populace. Indeed, they had the CCTV footage and text messages to confirm what everyone should already know: the world has become a violent and dangerous place because many people lack moral fibre and possess internet connections (as opposed to the violence reflecting structural inequalities and shared social values). Gang membership therefore equals criminality (instead of being an indicator of socially dis/placed identities and frustrations).
The rioters delivered themselves into the hands of a government that invariably called for an increase in control and punitive measures. Their own criminal behavior gave ‘the powers that be’ a free pass to capitalize on the obviously false belief that ‘law’ and ‘order’ remain mutually attuned and reciprocally related.
Fight The Power! Award
If the English riots proved anything, it was the extent to which disaffected people had internalized the values of a consumer society. Consumerism has colonized public consciousness to such an extent that the public had effectively become ‘occupied’ (seized and controlled) by it. The International Occupy Movement provided an intriguing mirror image to this pre/occupation: it was a global protest about the way liberal democracies were consuming and spitting out their own. Indeed, the free world appeared to be over/run by razor gangs and robber barons. It should therefore come as no surprise that the occupy movement was initially spearheaded by Adbusters, an anti-consumerist group dedicated to overthrowing the mental environment.
Democracy Now / Democracy In Question
The initial call to Occupy Wall Street (via an email list and hashtag) encouraged people to actively challenge the way democratic institutions (financial services industries, government bodies, the legal system, free press, etc.) had become complicit in the re/production and/or legitimation of social inequalities. We shouldn’t give too much credit (or power) to Adbusters though: there was clearly something in the air. The Occupy movement explicitly has its roots in the Arab Spring™, and it has grown to such an extent that no one can (or does) claim to speak for the 99%. The nascent grassroots movement testifies to the fact that liberal democracies have been sowing seeds of dissent around the world for many years.
While the Arab Spring™ might have been calling for democracy now, people in the West were now calling the entire democratic process (back) into question. For all intents and purposes, democracy appears to have turned into legalized looting on a global scale, with the concentration of power, wealth and resources in the hands of a select few. As Cornel West said to Democracy Now, “we’re talking about a democratic awakening” and “raising political consciousness… so people can begin to see what’s going on through a different lens” and insist on a “transfer of power from oligarchs to everyday people” (“Cornel West on Occupy Wall Street”, 29 September, 2011).
The international Occupy movement has attracted people from all walks of life for obvious reasons: it appeals to general feelings of powerlessness and hopelessness. The movement has thereby become a lightning rod for a range of specific concerns and grievances (ranging from environmental to taxation issues). To some extent, the movement’s all-inclusiveness also threatens to be its Achilles heel. As some tone-deaf observers have mocked, the movement appears to lack a uniform message and recognizable platform. It’s certainly true that it resists easy categorization and does not attempt to secure consent in the form of sound bites. The self-proclaimed 99% have been conveying a complex and multifaceted message – they’ve come together to speak for themselves and to each other, indicating that their ‘awakening’ in the form of demands and objectives remains a work in progress.
Perhaps that’s why it’s been so much easier for corporate media to get prepackaged messages across – like images of police cracking down on perceived public nuisances or commentators engaging in smear campaigns and caricatures.
A particularly angry response comes from popular comic book artist Frank Miller. One wonders what percentage of people (potentially) agree with him when he calls the relatively small band of activists “losers”. Miller is clearly in no mood to awaken from his dogmatic slumbers or help raise the consciousness of a silent (or relatively passive) majority. Being seen as a loser is obviously the real moral failure in society, so no one should be taking their concerns seriously – that would put ‘the people’ on the wrong side of the cultural divide and make them look like big losers, too. According to Miller’s outburst, the true crime is unemployed (and unemployable) people milling about and calling for anarchy.
“Occupy is nothing but a pack of louts, thieves, and rapists, an unruly mob, fed by Woodstock-era nostalgia and putrid false righteousness. These clowns can do nothing but harm” since Occupy is “nothing short of a clumsy, poorly-expressed attempt at anarchy” that has taken the form of “an ugly fashion statement by… spoiled brats who should stop getting in the way of working people and find jobs for themselves” (Frank Miller, “Anarchy”, 7 November, 2011).
The Occupiers obviously have their work cut out for them when the 99% are (thus far) a vocal minority struggling to have their voices heard and/or legitimated by the majority of people. The big job at hand is determining how it’s possible to transfer power from the one percent to the 99 percent the movement purportedly represents. The main difficulty is that general assemblies need to convince the majority of ‘occupied’ consumer societies of at least four things: 1) ‘power back to the people’ is a defensible cause and worth fighting for, 2) the fight for self-empowerment through majority rule can be won through peaceful means and direct participation – assuming that peace should be given a chance and that everyday people want to directly participate in their own governance, 3) ‘the people’ feel that they have more to gain than lose by taking on ‘the powers that be’, and 4) the world will be a better (or morally distinct) place if power and wealth could be distributed in a more equitable manner.
The leaderless movement is attempting to lead by example – participatory democracy (consensus-based decisions reached through open dialogue and non-hierarchical means) ideally points the way forward and can hopefully achieve practical change. Since the majority currently live in representative democracies, the self-proclaimed 99% will need to compete in the marketplace of ideas within a broken system susceptible to one kind of ‘fixing’: convincing other voters every political season that they represent a viable long-term alternative is no small task. The biggest hurdle remains our general mental climate – the environment continues to be seasoned with flavors of the month and sensational media coverage. We’ve come to live in a world that turns on media spin, quick fixes and short attention spans. Indeed, the democratic process is designed to co-opt (manage and defuse) widespread anger, or seek to dis/place it via the latest distractions, material comforts and television seasons.