By arrangement with TomDispatch.
Down the road only a few generations, the millennium of Magna Carta, one of the great events in the establishment of civil and human rights, will arrive. Whether it will be celebrated, mourned, or ignored is not at all clear.
That should be a matter of serious immediate concern. What we do right now, or fail to do, will determine what kind of world will greet that event. It is not an attractive prospect if present tendencies persist—not least, because the Great Charter is being shredded before our eyes.
The first scholarly edition of Magna Carta was published by the eminent jurist William Blackstone. It was not an easy task. There was no good text available. As he wrote, “the body of the charter has been unfortunately gnawn by rats”—a comment that carries grim symbolism today, as we take up the task the rats left unfinished.
Blackstone’s edition actually includes two charters. It was entitled The Great Charter and the Charter of the Forest. The first, the Charter of Liberties, is widely recognized to be the foundation of the fundamental rights of the English-speaking peoples—or as Winston Churchill put it more expansively, “the charter of every self-respecting man at any time in any land.” Churchill was referring specifically to the reaffirmation of the Charter by Parliament in the Petition of Right, imploring King Charles to recognize that the law is sovereign, not the King. Charles agreed briefly, but soon violated his pledge, setting the stage for the murderous Civil War.
After a bitter conflict between King and Parliament, the power of royalty in the person of Charles II was restored. In defeat, Magna Carta was not forgotten. One of the leaders of Parliament, Henry Vane, was beheaded. On the scaffold, he tried to read a speech denouncing the sentence as a violation of Magna Carta, but was drowned out by trumpets to ensure that such scandalous words would not be heard by the cheering crowds. His major crime had been to draft a petition calling the people “the original of all just power” in civil society—not the King, not even God. That was the position that had been strongly advocated by Roger Williams, the founder of the first free society in what is now the state of Rhode Island. His heretical views influenced Milton and Locke, though Williams went much farther, founding the modern doctrine of separation of church and state, still much contested even in the liberal democracies.
As often is the case, apparent defeat nevertheless carried the struggle for freedom and rights forward. Shortly after Vane’s execution, King Charles II granted a Royal Charter to the Rhode Island plantations, declaring that “the form of government is Democratical,” and furthermore that the government could affirm freedom of conscience for Papists, atheists, Jews, Turks—even Quakers, one of the most feared and brutalized of the many sects that were appearing in those turbulent days. All of this was astonishing in the climate of the times.
A few years later, the Charter of Liberties was enriched by the Habeas Corpus Act of 1679, formally entitled “an Act for the better securing the liberty of the subject, and for prevention of imprisonment beyond the seas.” The U.S. Constitution, borrowing from English common law, affirms that “the writ of habeas corpus shall not be suspended” except in case of rebellion or invasion. In a unanimous decision, the U.S. Supreme Court held that the rights guaranteed by this Act were “[c]onsidered by the Founders [of the American Republic] as the highest safeguard of liberty.” All of these words should resonate today.
The Second Charter and the Commons
The significance of the companion charter, the Charter of the Forest, is no less profound and perhaps even more pertinent today—as explored in depth by Peter Linebaugh in his richly documented and stimulating history of Magna Carta and its later trajectory. The Charter of the Forest demanded protection of the commons from external power. The commons were the source of sustenance for the general population: their fuel, their food, their construction materials, whatever was essential for life. The forest was no primitive wilderness. It had been carefully developed over generations, maintained in common, its riches available to all, and preserved for future generations—practices found today primarily in traditional societies that are under threat throughout the world.
The Charter of the Forest imposed limits to privatization. The Robin Hood myths capture the essence of its concerns (and it is not too surprising that the popular TV series of the 1950s, “The Adventures of Robin Hood,” was written anonymously by Hollywood screenwriters blacklisted for leftist convictions). By the seventeenth century, however, this Charter had fallen victim to the rise of the commodity economy and capitalist practice and morality.
With the commons no longer protected for cooperative nurturing and use, the rights of the common people were restricted to what could not be privatized, a category that continues to shrink to virtual invisibility. In Bolivia, the attempt to privatize water was, in the end, beaten back by an uprising that brought the indigenous majority to power for the first time in history. The World Bank has just ruled that the mining multinational Pacific Rim can proceed with a case against El Salvador for trying to preserve lands and communities from highly destructive gold mining. Environmental constraints threaten to deprive the company of future profits, a crime that can be punished under the rules of the investor-rights regime mislabeled as “free trade.” And this is only a tiny sample of struggles underway over much of the world, some involving extreme violence, as in the Eastern Congo, where millions have been killed in recent years to ensure an ample supply of minerals for cell phones and other uses, and of course ample profits.
The rise of capitalist practice and morality brought with it a radical revision of how the commons are treated, and also of how they are conceived. The prevailing view today is captured by Garrett Hardin’s influential argument that “freedom in a commons brings ruin to us all,” the famous “tragedy of the commons”: what is not owned will be destroyed by individual avarice.
An international counterpart was the concept of terra nullius, employed to justify the expulsion of indigenous populations in the settler-colonial societies of the Anglosphere, or their “extermination,” as the founding fathers of the American Republic described what they were doing, sometimes with remorse, after the fact. According to this useful doctrine, the Indians had no property rights since they were just wanderers in an untamed wilderness. And the hard-working colonists could create value where there was none by turning that same wilderness to commercial use.
In reality, the colonists knew better and there were elaborate procedures of purchase and ratification by crown and parliament, later annulled by force when the evil creatures resisted extermination. The doctrine is often attributed to John Locke, but that is dubious. As a colonial administrator, he understood what was happening, and there is no basis for the attribution in his writings, as contemporary scholarship has shown convincingly, notably the work of the Australian scholar Paul Corcoran. (It was in Australia, in fact, that the doctrine has been most brutally employed.)
The grim forecasts of the tragedy of the commons are not without challenge. The late Elinor Ostrom won the Nobel Prize in economics in 2009 for her work showing the superiority of user-managed fish stocks, pastures, woods, lakes, and groundwater basins. But the conventional doctrine has force if we accept its unstated premise: that humans are blindly driven by what American workers, at the dawn of the industrial revolution, bitterly called “the New Spirit of the Age, Gain Wealth forgetting all but Self.”
Like peasants and workers in England before them, American workers denounced this New Spirit, which was being imposed upon them, regarding it as demeaning and destructive, an assault on the very nature of free men and women. And I stress women; among those most active and vocal in condemning the destruction of the rights and dignity of free people by the capitalist industrial system were the “factory girls,” young women from the farms. They, too, were driven into the regime of supervised and controlled wage labor, which was regarded at the time as different from chattel slavery only in that it was temporary. That stand was considered so natural that it became a slogan of the Republican Party, and a banner under which northern workers carried arms during the American Civil War.
Controlling the Desire for Democracy
That was 150 years ago—in England earlier. Huge efforts have been devoted since to inculcating the New Spirit of the Age. Major industries are devoted to the task: public relations, advertising, marketing generally, all of which add up to a very large component of the Gross Domestic Product. They are dedicated to what the great political economist Thorstein Veblen called “fabricating wants.” In the words of business leaders themselves, the task is to direct people to “the superficial things” of life, like “fashionable consumption.” That way people can be atomized, separated from one another, seeking personal gain alone, diverted from dangerous efforts to think for themselves and challenge authority.
The process of shaping opinion, attitudes, and perceptions was termed the “engineering of consent” by one of the founders of the modern public relations industry, Edward Bernays. He was a respected Wilson-Roosevelt-Kennedy progressive, much like his contemporary, journalist Walter Lippmann, the most prominent public intellectual of twentieth century America, who praised “the manufacture of consent” as a “new art” in the practice of democracy.
Both recognized that the public must be “put in its place,” marginalized and controlled—for their own interests of course. They were too “stupid and ignorant” to be allowed to run their own affairs. That task was to be left to the “intelligent minority,” who must be protected from “the trampling and the roar of [the] bewildered herd,” the “ignorant and meddlesome outsiders”—the “rascal multitude” as they were termed by their seventeenth century predecessors. The role of the general population was to be “spectators,” not “participants in action,” in a properly functioning democratic society.
And the spectators must not be allowed to see too much. President Obama has set new standards in safeguarding this principle. He has, in fact, punished more whistleblowers than all previous presidents combined, a real achievement for an administration that came to office promising transparency. WikiLeaks is only the most famous case, with British cooperation.
Among the many topics that are not the business of the bewildered herd is foreign affairs. Anyone who has studied declassified secret documents will have discovered that, to a large extent, their classification was meant to protect public officials from public scrutiny. Domestically, the rabble should not hear the advice given by the courts to major corporations: that they should devote some highly visible efforts to good works, so that an “aroused public” will not discover the enormous benefits provided to them by the nanny state. More generally the U.S. public should not learn that “state policies are overwhelmingly regressive, thus reinforcing and expanding social inequality,” though designed in ways that lead “people to think that the government helps only the undeserving poor, allowing politicians to mobilize and exploit anti-government rhetoric and values even as they continue to funnel support to their better-off constituents”—I’m quoting from the main establishment journal, Foreign Affairs, not from some radical rag.
Over time, as societies became freer and the resort to state violence more constrained, the urge to devise sophisticated methods of control of attitudes and opinion has only grown. It is natural that the immense PR industry should have been created in the most free of societies, the United States and Great Britain. The first modern propaganda agency was the British Ministry of Information a century ago, which secretly defined its task as “to direct the thought of most of the world”—primarily progressive American intellectuals, who had to be mobilized to come to the aid of Britain during World War I.
Its U.S. counterpart, the Committee on Public Information, was formed by Woodrow Wilson to drive a pacifist population to violent hatred of all things German—with remarkable success. American commercial advertising deeply impressed others. Goebbels admired it and adapted it to Nazi propaganda, all too successfully. The Bolshevik leaders tried as well, but their efforts were clumsy and ineffective.
A primary domestic task has always been “to keep [the public] from our throats,” as essayist Ralph Waldo Emerson described the concerns of political leaders when the threat of democracy was becoming harder to suppress in the mid-nineteenth century. More recently, the activism of the 1960s elicited elite concerns about “excessive democracy,” and calls for measures to impose “more moderation” in democracy.
One particular concern was to introduce better controls over the institutions “responsible for the indoctrination of the young”: the schools, the universities, the churches, which were seen as failing that essential task. I’m quoting reactions from the left-liberal end of the mainstream spectrum, the liberal internationalists who later staffed the Carter administration, and their counterparts in other industrial societies. The right wing was much harsher. One of many manifestations of this urge has been the sharp rise in college tuition, not on economic grounds, as is easily shown. The device does, however, trap and control young people by debt, often for the rest of their lives, thus contributing to more effective indoctrination.
The Three-Fifths People
Pursuing these important topics further, we see that the destruction of the Charter of the Forest, and its obliteration from memory, relates rather closely to the continuing efforts to constrain the promise of the Charter of Liberties. The “New Spirit of the Age” cannot tolerate the pre-capitalist conception of the Forest as the shared endowment of the community at large, cared for communally for its own use and for future generations, protected from privatization, from transfer to the hands of private power for service to wealth, not needs. Inculcating the New Spirit is an essential prerequisite for achieving this end, and for preventing the Charter of Liberties from being misused to enable free citizens to determine their own fate.
Popular struggles to bring about a freer and more just society have been resisted by violence and repression, and massive efforts to control opinion and attitudes. Over time, however, they have met with considerable success, even though there is a long way to go and there is often regression. Right now, in fact.
The most famous part of the Charter of Liberties is Article 39, which declares that “no free man” shall be punished in any way, “nor will We proceed against or prosecute him, except by the lawful judgment of his peers and by the law of the land.”
Through many years of struggle, the principle has come to hold more broadly. The U.S. Constitution provides that no “person [shall] be deprived of life, liberty, or property, without due process of law [and] a speedy and public trial” by peers. The basic principle is “presumption of innocence”—what legal historians describe as “the seed of contemporary Anglo-American freedom,” referring to Article 39; and with the Nuremberg Tribunal in mind, a “particularly American brand of legalism: punishment only for those who could be proved to be guilty through a fair trial with a panoply of procedural protections”—even if their guilt for some of the worst crimes in history is not in doubt.
The founders of course did not intend the term “person” to apply to all persons. Native Americans were not persons. Their rights were virtually nil. Women were scarcely persons. Wives were understood to be “covered” under the civil identity of their husbands in much the same way as children were subject to their parents. Blackstone’s principles held that “the very being or legal existence of the woman is suspended during the marriage, or at least is incorporated and consolidated into that of the husband: under whose wing, protection, and cover, she performs every thing.” Women were thus the property of their fathers or husbands. These principles remained in force until very recent years. Until a Supreme Court decision of 1975, women did not even have a legal right to serve on juries. They were not peers. Just two weeks ago, Republican opposition blocked the Paycheck Fairness Act, which would have guaranteed women equal pay for equal work. And it goes far beyond that.
Slaves, of course, were not persons. They were in fact three-fifths human under the Constitution, so as to grant their owners greater voting power. Protection of slavery was no slight concern to the founders: it was one factor leading to the American Revolution. In the 1772 Somerset case, Lord Mansfield determined that slavery is so “odious” that it cannot be tolerated in England, though it continued in British possessions for many years. American slave-owners could see the handwriting on the wall if the colonies remained under British rule. And it should be recalled that the slave states, including Virginia, had the greatest power and influence in the colonies. One can easily appreciate Dr. Johnson’s famous quip that “we hear the loudest yelps for liberty among the drivers of negroes.”
Post-Civil War amendments extended the concept of person to African-Americans, ending slavery. In theory, at least. After about a decade of relative freedom, a condition akin to slavery was reintroduced by a North-South compact permitting the effective criminalization of black life. A black male standing on a street corner could be arrested for vagrancy, or for attempted rape if accused of looking at a white woman the wrong way. And once imprisoned he had few chances of ever escaping the system of “slavery by another name,” the term used by then-Wall Street Journal bureau chief Douglas Blackmon in an arresting study.
This new version of the “peculiar institution” provided much of the basis for the American industrial revolution, with a perfect work force for the steel industry and mining, along with agricultural production in the famous chain gangs: docile, obedient, no strikes, and no need for employers even to sustain their workers, an improvement over slavery. The system lasted in large measure until World War II, when free labor was needed for war production.
The postwar boom offered employment. A black man could get a job in a unionized auto plant, earn a decent salary, buy a house, and maybe send his children to college. That lasted for about 20 years, until the 1970s, when the economy was radically redesigned on newly dominant neoliberal principles, with rapid growth of financialization and the offshoring of production. The black population, now largely superfluous, has been recriminalized.
Until Ronald Reagan’s presidency, incarceration in the U.S. was within the spectrum of industrial societies. By now it is far beyond others. It targets primarily black males, increasingly also black women and Hispanics, largely guilty of victimless crimes under the fraudulent “drug wars.” Meanwhile, the wealth of African-American families has been virtually obliterated by the latest financial crisis, in no small measure thanks to criminal behavior of financial institutions, with impunity for the perpetrators, now richer than ever.
Looking over the history of African-Americans from the first arrival of slaves almost 500 years ago to the present, we see that they have enjoyed the status of authentic persons for only a few decades. There is a long way to go to realize the promise of Magna Carta.
Sacred Persons and Undone Process
The post-Civil War fourteenth amendment granted the rights of persons to former slaves, though mostly in theory. At the same time, it created a new category of persons with rights: corporations. In fact, almost all the cases brought to the courts under the fourteenth amendment had to do with corporate rights, and by a century ago, the courts had determined that these collectivist legal fictions, established and sustained by state power, had the full rights of persons of flesh and blood; in fact, far greater rights, thanks to their scale, immortality, and protections of limited liability. Their rights by now far transcend those of mere humans. Under the “free trade agreements,” Pacific Rim can, for example, sue El Salvador for seeking to protect the environment; individuals cannot do the same. General Motors can claim national rights in Mexico. There is no need to dwell on what would happen if a Mexican demanded national rights in the United States.
Domestically, recent Supreme Court rulings greatly enhance the already enormous political power of corporations and the super-rich, striking further blows against the tottering relics of functioning political democracy.
Meanwhile Magna Carta is under more direct assault. Recall the Habeas Corpus Act of 1679, which barred “imprisonment beyond the seas,” and certainly the far more vicious procedure of imprisonment abroad for the purpose of torture—what is now more politely called “rendition,” as when Tony Blair rendered Libyan dissident Abdel Hakim Belhaj, now a leader of the rebellion, to the mercies of Qaddafi; or when U.S. authorities deported Canadian citizen Maher Arar to his native Syria, for imprisonment and torture, only later conceding that there was never any case against him. And many others, often through Shannon Airport, leading to courageous protests in Ireland.
The concept of due process has been extended under the Obama administration’s international assassination campaign in a way that renders this core element of the Charter of Liberties (and the Constitution) null and void. The Justice Department explained that the constitutional guarantee of due process, tracing to Magna Carta, is now satisfied by internal deliberations in the executive branch alone. The constitutional lawyer in the White House agreed. King John might have nodded with satisfaction.
The issue arose after the presidentially ordered assassination-by-drone of Anwar al-Awlaki, accused of inciting jihad in speech, writing, and unspecified actions. A headline in the New York Times captured the general elite reaction when he was murdered in a drone attack, along with the usual collateral damage. It read: “The West celebrates a cleric’s death.” Some eyebrows were lifted, however, because he was an American citizen, which raised questions about due process—considered irrelevant when non-citizens are murdered at the whim of the chief executive. And irrelevant for citizens, too, under the Obama administration’s due-process legal innovations.
Presumption of innocence has also been given a new and useful interpretation. As the New York Times reported, “Mr. Obama embraced a disputed method for counting civilian casualties that did little to box him in. It in effect counts all military-age males in a strike zone as combatants, according to several administration officials, unless there is explicit intelligence posthumously proving them innocent.” So post-assassination determination of innocence maintains the sacred principle of presumption of innocence.
It would be ungracious to recall the Geneva Conventions, the foundation of modern humanitarian law: they bar “the carrying out of executions without previous judgment pronounced by a regularly constituted court, affording all the judicial guarantees which are recognized as indispensable by civilized peoples.”
The most famous recent case of executive assassination was Osama bin Laden, murdered after he was apprehended by 79 Navy SEALs, defenseless, accompanied only by his wife, his body reportedly dumped at sea without autopsy. Whatever one thinks of him, he was a suspect and nothing more than that. Even the FBI agreed.
Celebration in this case was overwhelming, but there were a few questions raised about the bland rejection of the principle of presumption of innocence, particularly when trial was hardly impossible. These were met with harsh condemnations. The most interesting was by a respected left-liberal political commentator, Matthew Yglesias, who explained that “one of the main functions of the international institutional order is precisely to legitimate the use of deadly military force by western powers,” so it is “amazingly naïve” to suggest that the U.S. should obey international law or other conditions that we righteously demand of the weak.
Only tactical objections can be raised to aggression, assassination, cyberwar, or other actions that the Holy State undertakes in the service of mankind. If the traditional victims see matters somewhat differently, that merely reveals their moral and intellectual backwardness. And the occasional Western critic who fails to comprehend these fundamental truths can be dismissed as “silly,” Yglesias explains—incidentally, referring specifically to me, and I cheerfully confess my guilt.
Executive Terrorist Lists
Perhaps the most striking assault on the foundations of traditional liberties is a little-known case brought to the Supreme Court by the Obama administration, Holder v. Humanitarian Law Project. The Project was condemned for providing “material assistance” to the guerrilla organization PKK, which has fought for Kurdish rights in Turkey for many years and is listed as a terrorist group by the state executive. The “material assistance” was legal advice. The wording of the ruling would appear to apply quite broadly, for example, to discussions and research inquiry, even advice to the PKK to keep to nonviolent means. Again, there was a marginal fringe of criticism, but even those critics accepted the legitimacy of the state terrorist list—arbitrary decisions by the executive, with no recourse.
The record of the terrorist list is of some interest. For example, in 1988 the Reagan administration declared Nelson Mandela’s African National Congress to be one of the world’s “more notorious terrorist groups,” so that Reagan could continue his support for the Apartheid regime and its murderous depredations in South Africa and in neighboring countries, as part of his “war on terror.” Twenty years later Mandela was finally removed from the terrorist list, and can now travel to the U.S. without a special waiver.
Another interesting case is Saddam Hussein, removed from the terrorist list in 1982 so that the Reagan administration could provide him with support for his invasion of Iran. The support continued well after the war ended. In 1989, President Bush I even invited Iraqi nuclear engineers to the U.S. for advanced training in weapons production—more information that must be kept from the eyes of the “ignorant and meddlesome outsiders.”
One of the ugliest examples of the use of the terrorist list has to do with the tortured people of Somalia. Immediately after September 11th, the United States closed down the Somali charitable network Al-Barakaat on grounds that it was financing terror. This achievement was hailed as one of the great successes of the “war on terror.” In contrast, when Washington withdrew the charges as without merit a year later, there was little notice.
Al-Barakaat was responsible for about half the $500 million in remittances to Somalia, “more than it earns from any other economic sector and 10 times the amount of foreign aid [Somalia] receives,” a U.N. review determined. The charity also ran major businesses in Somalia, all destroyed. The leading academic scholar of Bush’s “financial war on terror,” Ibrahim Warde, concludes that apart from devastating the economy, this frivolous attack on a very fragile society “may have played a role in the rise… of Islamic fundamentalists,” another familiar consequence of the “war on terror.”
The very idea that the state should have the authority to make such judgments is a serious offense against the Charter of Liberties, as is the fact that it is considered uncontentious. If the Charter’s fall from grace continues on the path of the past few years, the future of rights and liberties looks dim.
Who Will Have the Last Laugh?
A few final words on the fate of the Charter of the Forest. Its goal was to protect the source of sustenance for the population, the commons, from external power: in the early days, royalty; over the years, enclosure and other forms of privatization by predatory corporations and the state authorities who cooperate with them. These assaults have only accelerated, and are properly rewarded. The damage is very broad.
If we listen to voices from the South today we can learn that “the conversion of public goods into private property through the privatization of our otherwise commonly held natural environment is one way neoliberal institutions remove the fragile threads that hold African nations together. Politics today has been reduced to a lucrative venture where one looks out mainly for returns on investment rather than on what one can contribute to rebuild highly degraded environments, communities, and a nation. This is one of the benefits that structural adjustment programmes inflicted on the continent—the enthronement of corruption.” I’m quoting Nigerian poet and activist Nnimmo Bassey, chair of Friends of the Earth International, in his searing exposé of the ravaging of Africa’s wealth, To Cook a Continent, the latest phase of the Western torture of Africa.
That torture, it should be recognized, has always been planned at the highest level. At the end of World War II, the U.S. held a position of unprecedented global power. Not surprisingly, careful and sophisticated plans were developed about how to organize the world. Each region was assigned its “function” by State Department planners, headed by the distinguished diplomat George Kennan. He determined that the U.S. had no special interest in Africa, so it should be handed over to Europe to “exploit”—his word—for its reconstruction. In the light of history, one might have imagined a different relation between Europe and Africa, but there is no indication that that was ever considered.
More recently, the U.S. has recognized that it, too, must join the game of exploiting Africa, along with new entries like China, which is busily at work compiling one of the worst records in destruction of the environment and oppression of the hapless victims.
It should be unnecessary to dwell on the extreme dangers posed by one central element of the predatory obsessions that are producing calamities all over the world: the reliance on fossil fuels, which courts global disaster, perhaps in the not-too-distant future. Details may be debated, but there is little serious doubt that the problems are serious, if not awesome, and that the longer we delay in addressing them, the more awful will be the legacy left to generations to come. There are some efforts to face reality, but they are far too minimal. The recent Rio+20 Conference opened with meager aspirations and closed with derisory outcomes.
Meanwhile, power concentrations are charging in the opposite direction, led by the richest and most powerful country in world history. Congressional Republicans are dismantling the limited environmental protections initiated by Richard Nixon, who would be something of a dangerous radical in today’s political scene. The major business lobbies openly announce their propaganda campaigns to convince the public that there is no need for undue concern—with some effect, as polls show.
The media cooperate by not even reporting the increasingly dire forecasts of international agencies and even the U.S. Department of Energy. The standard presentation is a debate between alarmists and skeptics: on one side virtually all qualified scientists, on the other a few holdouts. Not part of the debate are a large number of experts, including the climate change program at MIT, who criticize the scientific consensus as too conservative and cautious, arguing that the truth about climate change is far more dire. Not surprisingly, the public is confused.
In his State of the Union speech in January, President Obama hailed the bright prospects of a century of energy self-sufficiency, thanks to new technologies that permit extraction of hydrocarbons from Canadian tar sands, shale, and other previously inaccessible sources. Others agree. The Financial Times forecasts a century of energy independence for the U.S. The report does mention the destructive local impact of the new methods. Unasked in these optimistic forecasts is the question of what kind of world will survive the rapacious onslaught.
In the lead in confronting the crisis throughout the world are indigenous communities, those who have always upheld the Charter of the Forest. The strongest stand has been taken by the one country they govern, Bolivia, the poorest country in South America and for centuries a victim of Western destruction of the rich resources of one of the most advanced societies in the hemisphere before the arrival of Columbus.
After the ignominious collapse of the Copenhagen global climate change summit in 2009, Bolivia organized a People’s Summit with 35,000 participants from 140 countries—not just representatives of governments, but also civil society and activists. It produced a People’s Agreement, which called for very sharp reduction in emissions, and a Universal Declaration on the Rights of Mother Earth. That is a key demand of indigenous communities all over the world. It is ridiculed by sophisticated westerners, but unless we can acquire some of their sensibility, they are likely to have the last laugh—a laugh of grim despair.