Monday, September 29, 2008

Revealing the Clarion Fund's Bigoted 'Swift Boat' Campaign


As many of you no doubt already know, a shadowy pro-McCain group, the Clarion Fund, has been targeting swing states in this year's presidential election by paying for "advertising supplements" to be inserted in over 70 U.S. newspapers and publications, including The Chronicle of Higher Education. Twenty-eight million copies of the bigoted propaganda DVD Obsession have been distributed through these publications. The film includes only interviews with individuals who back its ideological agenda. Background is available here: http://tpmelectioncentral.talkingpointsmemo.com/2008/09/who_is_funding_distribution_of.php and http://www.obsessionwithhate.com/index.php.

See also, "Clarion Fund, Pro-McCain Non-profit Group Fueling Politics of Fear," and "Anti-Muslim Film Produced by Pro-Israel Partisan Boosts McCain."

Professor Omid Safi of the University of North Carolina-Chapel Hill has written an excellent primer, based on his own recent sleuthing and research, that does much to reveal the campaign's thinly veiled political motives and, most importantly, to identify its perpetrators and the ideological groups to which they are tied. This primer is available to read here.

Please read it and distribute it, in the format as sent and as you see fit, to those you think need to know the facts, in order to thwart this latest attempt to distort the truth to win an election.

Is This a 'Victory'? Deconstructing False Claims about Iraq

The Badr Corps of the Supreme Islamic Iraqi Council, a U.S. AND Iranian ally. The SIIC and its paramilitary wing, the Badr Corps (or Badr Brigades/Badr Army), were founded in 1982 and 1983 in Iran by Iraqi Arab exiles, with the support of the Iranian revolutionary government led by Grand Ayatullah Sayyid Ruhollah Khumayni. The Badr Corps was trained and, for a time, led by officers of the Iranian Revolutionary Guard Corps, the "imperial guard" of the Islamic Republic of Iran, tasked with protecting the new revolutionary order during the early 1980s, when regional rebellions threatened to fracture the nation-state of Iran. Only about half of Iranians are ethnic Persians who speak Persian as their mother tongue; Azeris, Turkmen, Kurds, Lurs, Baluchis, Arabs, and other ethnic groups make up the rest of Iran's population. Nomadic pastoralism, along with regional and ethnic aspirations for autonomy and independence, was brutally suppressed first by the Pahlavi shahs and then by the Iranian revolutionary regime.

Is This a 'Victory'?

By Peter W. Galbraith
The New York Review of Books [September 25, 2008]

We hear again and again from Washington that we have turned a corner in Iraq and are on the path to victory. If so, it is a strange victory. Shiite religious parties that are Iran's closest allies in the Middle East control Iraq's central government and the country's oil-rich south. A Sunni militia, known as the Awakening, dominates Iraq's Sunni center. It is led by Baathists, the very people we invaded Iraq in 2003 to remove from power. While the US sees the Awakening as key to defeating al-Qaeda in Iraq, Iraq's Shiite government views it as a mortal enemy and has issued arrest warrants for many of its members. Meanwhile the Shiite-Kurdish alliance that brought stability to parts of Iraq is crumbling. The two sides confronted each other militarily after the Iraqi army entered the Kurdish-administered town of Khanaqin in early September.

John McCain has staked his presidential candidacy on his early advocacy of sending more troops to Iraq. He says he is for victory while Barack Obama is for surrender; and polls suggest that voters trust McCain more on Iraq than they do Obama. In 2006, dissatisfaction with the Iraq war ended Republican control of both the House of Representatives and the Senate. This year, in spite of being burdened with the gravest financial crisis since 1929 and the most unpopular president since the advent of polling, the Republican presidential nominee is running a competitive race.

The US sent more troops into Iraq in 2007 and violence has declined sharply in Anbar, Baghdad, and many other parts of the country. Sectarian killings in Baghdad are a fraction of what they were in 2006, although that city remains one of the world's most dangerous places. In recent months, US casualties have been at their lowest level of the entire war. While it is debatable how much of this is the result of the "surge" in US troop strength, as opposed to other factors, the decline in violence is obviously a welcome development.

Less violence, however, is not the same thing as success. The United States did not go to war in Iraq for the purpose of ending violence between contending sectarian forces. Success has to be measured against US objectives. John McCain proclaims his goal to be victory and says we are now winning in Iraq (a victory that will, of course, be lost if his allegedly pro-surrender opponent wins). He considers victory to be an Iraq that is "a democratic ally." George W. Bush has defined victory as a unified, democratic, and stable Iraq. Neither man has explained how he will transform Iraq's ruling theocrats into democrats, diminish Iran's vast influence in Baghdad, or reconcile Kurds and Sunnis to Iraq's new order. Remarkably, neither the Democrats nor the press has challenged them to do so.

In January 2007, President Bush announced that he was sending 25,000 additional troops to Baghdad and Anbar province. Under a military strategy devised by the newly appointed Iraq commander, General David Petraeus, US troops moved out of their secure bases and embedded themselves among the population. The forces of the surge were intended to provide sufficient protection to the local population so that they would cooperate with the Iraqi army and police and US troops fighting insurgents and subversive Shiite militias. By living with their Iraqi counterparts, the US troops could provide training, advice, and confidence, making the Iraqi forces more capable.

Politically, the surge was intended to provide a breathing space for Iraq's diverse factions to come together on a program of national reconciliation. This was to include revision of a law excluding Baathists from public service, new provincial elections so that Sunnis might be fully represented on the local level, a law for the equitable sharing of oil revenues, and revisions of the Iraqi constitution to create a more powerful central government. Except for a flawed law on de-Baathification, these goals have not been achieved, although the parliament recently passed a law to allow elections in parts of the country. Militarily, however, the surge worked as General Petraeus intended. In Baghdad and other places wracked by sectarian violence, Sunnis and Shiites welcomed the increased presence of US troops.

The surge, however, has not been the main reason for the decline in violence. In 2006, Sunni tribal leaders in Anbar decided that al-Qaeda and like-minded Islamic fundamentalist fighters were a greater threat than the Americans. The fundamentalists were a direct challenge to the local establishment, assassinating sheikhs and raping their daughters (sometimes under the pretext of forced marriage to jihadis). More importantly, the tribal leaders came to realize that the Americans would sooner or later want to leave while the fundamentalists intended to stay and rule. The tribal leaders obtained American money to create their own militias and, in a brief period of time, forced al-Qaeda and its allies out of most of Sunni Iraq. Denied their base in Sunni areas, the fundamentalists have been less able to stage the spectacular attacks on Shiites that helped fuel Iraq's Sunni–Shiite civil war.

Meanwhile, the radical Shiite Moqtada al-Sadr responded to the increased US military deployments by ordering his militia, the Mahdi Army, to stand down. At the time, this seemed like a sensible tactical approach. He, too, realized that the US presence—in particular the surge in troop numbers—was a temporary phenomenon. By not fighting the Americans, he could wait out the surge, recall his troops, and eventually resume battle with the Sunnis and rival Shiite factions.

Al-Sadr's Shiite rivals, however, outfoxed him. In 2006, the support of al-Sadr's parliamentarians enabled Nouri al-Maliki to win the nomination of the Shiite caucus to be prime minister by one vote over Adel Abdul Mehdi, the candidate of Iraq's largest Shiite party, the Supreme Council for the Islamic Revolution in Iraq (SCIRI). In 2008, however, al-Maliki broke his connection to al-Sadr and aligned himself with SCIRI (since renamed the Supreme Islamic Iraqi Council, or SIIC). In March, he used the Iraqi army, a Shiite-dominated institution built around the SIIC's militia, the Badr Corps, to oust the Mahdi Army from much of Basra. Subsequently, the Iraqi army and police have made inroads against the Mahdi Army in its stronghold in Sadr City, Baghdad's sprawling Shiite slum.

Al-Maliki launched the Basra operation without first telling the Americans, and when the Iraqi forces ran into difficulty, he had to ask for American support. Once it became clear that the government and the Americans were bringing substantial resources to both the Basra and Baghdad campaigns, the Mahdi Army chose to negotiate a halt in the fighting rather than engage in full-scale combat.

Thus in 2007 and 2008, both the Sunnis and the Shiites fought civil wars within their communities. Among the Sunnis, the Awakening emerged as the decisive victor over al-Qaeda and the other fundamentalists. Among the Shiites, the ruling Shiite political parties have undercut Moqtada al-Sadr politically and diminished the Mahdi Army militarily. But al-Sadr has not been defeated and has significant residual support.

In both the Shiite and Sunni communities, relative "moderates" have emerged from the intracommunal fighting. This is one key factor in the reduced violence. The Sunni Awakening does not use car bombs against Shiite pilgrims and it has diminished al-Qaeda's ability to do so. The SCIRI-controlled Iraqi Interior Ministry had run its own death squads targeting Sunnis, but they were not as murderous and cruel as the death squads of al-Sadr. The surge had little to do with Sunnis turning against al-Qaeda (although US funds were critical) but it did have a part in undermining the Mahdi Army.

Although the Bush administration would never say so, it has in effect adopted the decentralization strategy long advocated by Senator Joseph Biden and now also supported by Senator Obama. Biden's plan would devolve almost all central government functions—including security—to Sunni or Shiite regions with powers similar to those now exercised by Kurdistan. Until late 2006, the Bush administration tried to defeat al-Qaeda with a US-backed, Shiite-dominated Iraqi army. The approach failed and the US Marines even concluded that Anbar, Iraq's largest Sunni province, was lost to al-Qaeda. While the Sunnis have yet to set up a region (as allowed by Iraq's constitution), they now have, in the Awakening, a Sunni-commanded army. And it has defeated al-Qaeda.

In July, Prime Minister Nouri al-Maliki interjected himself into the US presidential campaign, telling the German magazine Der Spiegel that "US presidential candidate Barack Obama talks about sixteen months. That, we think, would be the right time frame for a withdrawal, with the possibility of slight changes." Al-Maliki's endorsement of the main plank of Obama's Iraq plan undercut both President Bush and Senator McCain. The US embassy prevailed on al-Maliki's spokesman, Ali al-Dabbagh, to say that Der Spiegel had mistranslated his boss. Al-Dabbagh, however, wouldn't issue the statement himself, so it was put out by CENTCOM in his name. A few days later, al-Maliki met the visiting Senator Obama and again endorsed his deadline. This time al-Dabbagh explained that al-Maliki meant it.

Some conservative commentators suggested that al-Maliki had decided Obama was going to win and wanted to have good relations with the next US president. Others suggested that al-Maliki was playing to Iraqi public opinion and didn't mean what he said. Bush loyalists grumbled that al-Maliki was an ingrate.

Few grasped the most obvious explanation: Nouri al-Maliki wants US troops out of Iraq. He leads a Shiite coalition composed of religious parties, including his own Dawa party, which is committed to making Iraq into a Shiite Islamic state. Like his coalition partners, al-Maliki views Iraq's Sunnis with deep—and justifiable—suspicion. For four years after Saddam's fall, Iraqi Sunnis supported an insurgency that branded Shiites as apostates deserving death. Now the Sunnis have thrown their support behind the Awakening, which is portrayed by American politicians, including Senator McCain, as a group of patriotic Iraqis engaged in the fight against al-Qaeda. Iraq's Shiite leaders see the Awakening as a Baathist-led organization that rejects Iraq's new Shiite-led order—an accurate description.

Until 2007, the Americans fought alongside the Shiite-led Iraqi army against the Sunni fundamentalists. The Shiites were more than happy to have the Americans do much of their fighting for them. When the US created and began to finance the Sunni Awakening in 2007, the Shiite perspective on the American presence shifted. Now the United States was backing a military force deeply hostile to Shiite rule. Al-Qaeda could—and did—kill thousands of Shiites but it was no threat to Shiite rule per se. It was a shadowy terrorist organization operating with small cells and unable to mobilize or concentrate large forces. Further, both the US and Iran, the two most important external powers in the Iraqi equation, were certain to support the Shiites against al-Qaeda.

With some 100,000 men under arms, the Awakening is, at least potentially, a strong military force in its own right. Its leaders are not only ideologically linked to Saddam's anti-Shiite Baath regime, but many served in Saddam's army. And most importantly from a Shiite perspective, the Awakening has powerful outside support—from the United States. Al-Qaeda could never take over Iraq, but the Awakening might—or at least so Iraq's Shiite government fears.

Since the US created the Awakening, its goal has been to integrate the Sunni militiamen into Iraq's armed forces. Al-Maliki's government has repeatedly promised the Bush administration that it would do so, and then reneged. (Iraqis learned in the early days of the occupation that President Bush and his team were readily satisfied with promises, regardless of whether any actions followed.) At the end of 2007, General Jim Huggins, who oversaw the Iraqi police in the Sunni belt south of Baghdad, submitted three thousand names—most from the Awakening but also including a few hundred Shiites—to the Iraqi government for incorporation into the security forces. Four hundred were accepted. All were Shiites. As of October 1, the Iraqi government is supposed to take over responsibility for the 54,000 Awakening militiamen in Baghdad, including paying their salaries. By all accounts, the militiamen are deeply skeptical that this will happen, as apparently are their American sponsors. US commanders have been reassuring the Awakening that the US will not abandon them.

As many as one half the members of the Awakening have been insurgents or insurgent sympathizers. While the Sunni militiamen can gain tactical advantage by joining the Iraqi army and police, they are no less hostile to the Shiite-led Iraqi government than when they were planting roadside bombs, ambushing government forces, and executing kidnapped Iraqi army recruits and police. The Shiites understand this and so, apparently, do some of the Americans. As General Huggins told USA Today, if the Sunnis "aren't pulled into the Iraqi security forces, then we have to wonder if we're just arming the next Sunni resistance."

From 2003 until 2007, the Bush administration helped Iraq's most pro-Iranian Shiite religious parties take and consolidate power. Naturally, the Shiites—and their Iranian backers—welcomed the US involvement, at least temporarily. Now the United States is putting heavier pressure on al-Maliki to include the Sunni enemy in Iraq's security forces. It has created a Sunni army that, as long as the US remains in Iraq, can only grow in strength. Al-Maliki and his allies want the US out of Iraq because the American presence has become dangerous.

Without American troops, the Iraqi army and police would be able to move against the Awakening. Should Sunni forces prove too powerful, Iran is always available to help.

In early September, al-Maliki sent Iraqi troops into Khanaqin, a dusty Kurdish town on the Iranian border northeast of Baghdad. While technically not part of the Kurdistan Region, the Kurdistan Regional Government has administered Khanaqin since 2003. The forces of the Kurdish Peshmerga army, who liberated the town from Saddam that April, have provided security. It is widely expected that Khanaqin will formally be incorporated into the Kurdistan Region as part of the process specified in Article 140 of Iraq's constitution for determining Kurdistan's borders. By sending Arab troops to Khanaqin, al-Maliki deliberately picked a fight with the Kurds, who have been the Shiites' partner in governing Iraq since 2003.

Iraq's Kurds have had a very large part in post-Saddam Iraq. Iraq's president, deputy prime minister, foreign minister, and army chief are all Kurds. The Peshmerga fought on the US side in the 2003 war and is the one indigenous Iraqi force that is reliably pro-American. Iraqi Kurds are secular, democratic, and pro-Western. Both militarily and politically, they have supported US policy, even when they have had reservations about its wisdom.

In recent months, al-Maliki has tried to marginalize the Kurds. In ordering troops to Khanaqin, he did not consult Jalal Talabani, Iraq's Kurdish president, and he did not involve General Babakir Zebari, the Kurd who supposedly heads Iraq's army. In order to bypass Hoshyar Zebari, Iraq's Kurdish foreign minister, al-Maliki has appointed his own "special envoys."

President Talabani, who was in the US for medical treatment at the time, helped defuse the Khanaqin crisis by persuading both the Peshmerga and the Iraqi army to withdraw. But the incident has been seen by the Kurds as a danger sign. When Iraq's defense minister proposed acquiring American F-16s for the Iraqi air force, Iraq's neighbors—including Iran and Kuwait—said nothing. But the Kurdish deputy speaker of the Iraqi parliament strongly protested, expressing fear that the planes' most likely target would be Kurdistan. As a condition of the proposed US–Iraq security agreement, the Kurds want assurances that the Iraqi army will not be used in Kurdistan.

The surge was intended to buy time for political reconciliation. In January, Iraq's parliament revised the country's de-Baathification law, thus meeting a long-standing US demand. While the new law restored the rights of some former Baathists, however, it imposed an entirely new set of exclusions on Baathists in so-called sensitive ministries. Iraq's Sunni parliamentarians mostly opposed the law, which was supposed to help them. The Sunnis had demanded early provincial elections since they had boycotted the previous local elections in 2005 and were largely unrepresented on the provincial councils, even in Sunni areas. The Shiite-dominated parliament inserted a poison pill into the election law, a provision that would invalidate the "one man, one vote" principle in the Kirkuk Governorate—the administrative unit that includes the major city of Kirkuk on the Kurdistan border—in favor of a system of equal representation for each of Kirkuk's three communities: Kurds, Arabs, and Turkmen. Naturally, the Kurds, who are a majority both in the Governorate and on the Governorate Council, opposed a system that would give their foes two thirds of council seats.

Talabani vetoed the entire bill and as a result the Kurds were blamed for blocking national elections that the Shiites and some Sunnis also did not want to hold. (The SIIC was afraid it might lose some Governorates it now controls, including Baghdad, to Moqtada al-Sadr, while some Sunni parliamentarians feared the Awakening's electoral strength would underscore the fact that they do not represent the Sunni community.) Recently, the parliament passed a law to allow elections in 2009 in Sunni and Shiite Iraq, but not in Kirkuk or Kurdistan. The maneuverings left the Kurds politically isolated while, as a bonus to the Shiite ruling parties, providing more time for them to deal with al-Sadr. The Shiites are also pursuing changes in Iraq's constitution that would strengthen the central government at the expense of Kurdistan, knowing full well that these changes will be rejected by the Kurds.

Al-Maliki's agenda is transparent. The Kurds and Sunnis are obstacles to the ruling coalition's ambitions for a Shiite Islamic state. Al-Maliki wants to eliminate the Sunni militia and contain the Kurds politically and geographically. America's interest in defeating al-Qaeda is far less important to him than the Shiite interest in not having a powerful Sunni military that could overthrow Iraq's new Shiite order. The Kurds are too secular, too Western, and too pro-American for the Shiites to share power comfortably with them.

This should not be a surprise. Iran, not the US, is the most important ally of Iraq's ruling Shiite political parties. The largest party in al-Maliki's coalition is the SIIC, which was founded by the Ayatollah Khomeini in Iran in 1982. By all accounts, Iran wields enormous influence within Iraq's ruling Shiite coalition and has an effective veto over Iraqi security policies. In 2005, Iran intervened in Iraq's constitutional deliberations to undo a Shiite–Kurdish agreement on Kurdistan's powers, only to relent after Kurdistan President Massoud Barzani made clear that there would be no constitution without the deal; many Iraqis have told me that one reason that the US and Iraq have been unable to agree on a new security arrangement is that Iran opposes anything of the kind.

Nor is al-Maliki a Western-style democrat, in spite of President Bush's attempts to portray him as just that. Rather, he is a Shiite militant from the hard-line Dawa Party. Before returning to Iraq in 2003, he had spent more than twenty years in exile in Iran and Syria. As late as 2002, State Department officials sought to exclude Dawa from a US-sponsored Iraqi opposition conference because of Dawa's historical links to terrorism, including a 1983 suicide bomb attack on the US embassy in Kuwait. (There is no basis for linking al-Maliki or other mainstream Dawa leaders to that attack.)

Al-Maliki is an accidental prime minister, having secured the job only after internecine Shiite rivalries (and Kurdish opposition) derailed more prominent candidates. The Bush administration knew so little about him that it initially had his first name wrong. He had never been considered important enough to meet the many senior US officials traipsing to Baghdad. But President Bush has embraced him as the embodiment of American values and goals in Iraq.

John McCain says that partly because of his persistent support of the surge, we are now winning the Iraq war. He defines victory as an Iraq that is a democratic ally. Yet he advocates continued US military support to an Iraqi government led by Shiite religious parties committed to the establishment of an Islamic republic. He takes a harder line on Iran than President Bush, but supports Iraqi factions that are Iran's closest allies in the Middle East. He praises the Awakening but seems not to have realized that the Iraqi government is intent on crushing it. He has denounced the Obama-Biden plan for a decentralized state but has said nothing about how he would protect Iraq's Kurds, the only committed American allies in the country.

George W. Bush has put the United States on the side of undemocratic Iraqis who are Iran's allies. John McCain would continue the same approach. It is hard to understand how this can be called a success—or a path to victory.

Peter W. Galbraith, a former US Ambassador to Croatia, is Senior Diplomatic Fellow at the Center for Arms Control and a principal at the Windham Resources Group, which has worked in Iraq. His new book, Unintended Consequences: How War in Iraq Strengthened America’s Enemies, has just been released.

Saturday, September 27, 2008

No, Senator Obama, On This One You Were Wrong and McCain Was Right

By Reidar Visser (www.historiae.org)

27 September 2008

Senator Barack Obama to Senator John McCain during yesterday’s presidential debate: “You said that there was no history of violence between Shiite and Sunni. And you were wrong.”

Since this is a forceful claim about Iraqi history, presented during a contest for the position of the world’s most powerful leader, it is worth examining in some further detail. Let’s take a closer look at that “history of violence between Shiite and Sunni” in Iraq. Shiites and Sunnis have coexisted in Iraq since they crystallised as two distinctive religious communities in Baghdad in the tenth century AD, when the struggle for power between various factions of the Islamic caliphate that had been going on since the seventh century became transformed into a theological one with the (Shiite) doctrine of the imamate. In the subsequent centuries, there was certainly tension between these two communities at times (not least because the rivalling ruling elements of the caliphates chose to cultivate links with particular communities to further their own power struggles), but outbreaks of violence on a large scale were extremely rare. In fact, no more than three cases stand out before the late twentieth century, and these were all related to invasion by foreign forces rather than to internal sectarian struggles between the Iraqis.

The first major case of extensive Shiite–Sunni violence was in 1508: A massacre by the invading Persian Safavids of Sunnis and Christians in Baghdad. The Safavids returned a little more than one hundred years later, in 1623, and once more went ahead with a massacre of Sunnis in Baghdad. Later, in the nineteenth century, extremist Sunni Wahhabis from the Arabian Peninsula would regularly overrun the settled areas of Iraq; on one occasion, in 1801, this took on a clearly sectarian nature as Bedouin warriors massacred Shiites in the holy city of Karbala. The list can be completed, down to 2003 and the US invasion, with a series of ugly episodes that took place in the late twentieth century: Between 1969 and 1971, the Sunni-dominated Baathist regime performed mass expulsions of Shiites (including a high number of Fayli Kurds); in 1980 there was another wave of mass deportations of Shiites in the wake of the Iranian revolution; finally, in 1991 there were massacres of Shiites after the failed intifada that followed the Gulf War. (Conversely, some of the other historical episodes that are occasionally described as instances of “sectarian violence” simply do not fit this label. For example, the conflict between the government and the mostly Shiite tribes on the Euphrates in 1935 was interwoven with questions relating to agrarian issues and conscription, and Sunni politicians had ties to both the government and the opposition camps.)

On the one hand, there can be no doubt that this is a grim record: it involves thousands of innocent people who were massacred simply because they belonged to the wrong sect. On the other hand, however, it is important to keep things in perspective. These six cases of widespread sectarian violence took place over a span of more than 1,000 years. Moreover, they were mostly instigated by foreign invaders. The Iraqis themselves repeatedly closed ranks against these aggressions, uniting Shiites and Sunnis against the foreign forces. For example, in 1623, when the Safavid army was about to massacre the Sunni population of Baghdad, Shiites of Karbala intervened to save Sunnis from Shiite aggression. Similarly, in 1801, when Sunni Wahhabis sacked Karbala, the Sunni pasha of Baghdad punished the Sunni governor of Karbala for having failed to prevent the attack on the Shiites. Also, in none of these cases did the victims propose separatist solutions. Never in Iraqi history has there been any call for a small Sunni state. And with the exception of a feeble and short-lived attempt by some low-ranking clerics and notables of Baghdad in 1927, the Shiites have also consistently shied away from calling for a small Shiite breakaway state. None of the major upheavals of twentieth-century Iraqi history – 1920 and 1958 – featured sectarian conflict as the main mode of political action.

The accumulation of cases of sectarian violence during the decades of Baathist rule calls for special comment. True, the measures taken against the Shiites in the early 1980s and in 1991 were extremely repressive, and in a one-off episode in the immediate wake of the 1991 uprising they turned into fully-fledged explicit sectarianism through a series of chauvinist Sunni editorials in the Thawra newspaper in which the Arabness of the Shiites was questioned. However, subsequent developments in Iraqi politics show that the “Sunni” character of the Baathist regime was not its real core and that it was first and foremost an authoritarian regime built on relations between patrons and clients: In the mid-1990s the dominant political trend in Iraq was intra-Sunni struggles, as tribe after tribe challenged Saddam Hussein, who ended up executing people from his hometown Tikrit and his own family. When the defector Husayn Kamil Hasan al-Majid returned from Jordan in 1996, he was put to death just like many Shiite rebels had been after the 1991 uprising.

Perhaps most importantly in the context of the US elections, this record needs to be compared with that of the country Barack Obama himself represents. Did not the Civil War cause some 600,000 deaths between 1861 and 1865? How many thousands of African Americans have been killed in KKK violence? What about the Native Americans? The numbers here are clearly higher than the number of deaths caused by sectarian violence in Iraq, and yet few are prepared to question the viability of the United States as a political project. So where did Senator Obama really want to go with those comments?

There is a problem in Democratic discourse on Iraq that consists of always trying to put the actions of the Bush administration in the worst possible light, even in situations when this forces the Democrats to twist reality. It seems reasonable to criticise the Iraq War on several grounds: there were no weapons of mass destruction, no al-Qaida link and no 9/11 relationship, and the unilateral action without a UN mandate created yet another dangerous precedent in international affairs. But Democrats go further than this: they frequently claim that the sectarian problems seen in Iraq since 2003, and especially in the wake of the Samarra bombing in 2006, were a “natural” expression of Iraqi politics, and that the high degree of Iranian influence seen in today’s Iraq is somehow a “natural” phenomenon in a country with a large Shiite population. The argument is that the Bush administration should have known that any tampering with the authoritarian structures of Baathist Iraq would automatically have prompted a civil war with a strong Iranian role among the Shiites. It is also a way of ultimately blaming the Iraqis themselves for all the problems they are currently going through.

This is to ignore the historical record of coexistence between Shiites and Sunnis in Iraq and the fervent anti-Iranian attitudes among large sections of Iraq’s Shiites. Of course, on this latter point, John McCain is off the mark just like Obama: by suggesting that “the consequences of defeat would have been increased Iranian influence” he overlooks the fact that some of America’s best friends in the Maliki government have extremely close ties to Iran and that Iran’s interests so far have been well served by the Republican “victory” project and Washington’s peculiar choice of alliance partners among Iraq’s Shiites. But on the whole, Republicans, to a greater degree than Democrats, at least seem to recognise the historical roots of Iraqi national unity. From the Iraqi point of view, it simply seems more dangerous to have a US president who, based on some extremely superficial reading, pretends to know something about the divide between Sunnis and Shiites than to have one who reportedly is completely ignorant about the subject.

The bottom line is that there is nothing in Iraq’s history that should prevent the country from reverting to its natural role as one of the world’s great nations. Those who try to suggest otherwise either ignore the empirical record or do not care for the well-being of the Iraqi people. It is said that Obama has several top-notch, progressive and knowledgeable advisers who know about all these things, and who are unlikely to look to soft partitionist Joe Biden when it comes to actual policy-making. But unless these voices can have a real impact on what their candidate says in front of millions of Americans in prime-time televised debates, their usefulness seems unclear. If yesterday’s unfounded attack on Iraq's record of coexistence is the most inspirational thing Obama can come up with on Iraq then it is hard for a non-American observer to see any fundamental difference between his candidacy and that of all the others before him.

Reidar Visser is a research fellow at the Norwegian Institute of International Affairs and a noted expert on Iraqi politics and history. He completed an undergraduate degree in history and comparative politics at the University of Bergen and a Ph.D. in Middle Eastern studies at Oxford University.

Friday, September 26, 2008

Yawm al-Quds (Jerusalem Day)


Yawm al-Quds (Jerusalem Day)

Remember the Palestinians.

Jerusalem (al-Quds) is a city for all Muslims and for people of every faith.

Jerusalem is the city that lies at the heart of the Arab world.

"Glory be to Him Who carried His servant by night from the Sacred Mosque to the Farthest Mosque, whose surroundings We have blessed, that We might show him some of Our signs. Indeed, He is the All-Hearing, the All-Seeing."

-
Surat al-Isra (Qur'an 17:1)

Wednesday, September 24, 2008

In Photos: Observing Ramadan Across the Globe

The al-Zaim family of Duxbury, Massachusetts, gathers for their Iftar dinner, the meal that breaks the fast, after 7 p.m. on September 14.

A young boy sleeps in a mosque on September 1 while waiting to break his fast in Makassar, Indonesia.

Symbolizing the faith of Islam, the crescent moon is seen at sunset on top of the Faisal Mosque in Islamabad, Pakistan on September 16.

A seller of traditional Syrian sweets calls out for customers in the Meidan Quarter of Damascus on September 2.

A Palestinian man reads from the Qur'an in a mosque in the West Bank city of Jenin on September 11.

Palestinian women lead young girls through the Israeli-controlled Kalandia checkpoint, on the outskirts of the West Bank city of Ramallah, as they cross into Jerusalem to attend Friday prayers at the sacred al-Aqsa Mosque compound on September 19.

A Palestinian girl prays inside a mosque in the West Bank city of Ramallah on September 17.

A Palestinian boy holds a homemade sparkler firework after breaking his fast at the end of the second day of Ramadan in the West Bank city of Ramallah on September 2.

A Pakistani man offers Friday prayers atop a mosque roof in Peshawar, Pakistan on September 5.

Israeli border police at the Kalandia checkpoint outside of Ramallah try to hold back Palestinians on their way to pray at the sacred al-Aqsa Mosque compound in Jerusalem's Old City. Notice the infamous Israeli wall.

A young boy prepares food for the Iftar, the evening meal that breaks the daily fast, on the first day of Ramadan for many Muslims at the Memon Mosque in Karachi, Pakistan on September 2.

Kashmiri Muslims pray inside the Great Mosque in Srinagar, Kashmir on the first Friday of Ramadan, September 5. Note the emotions while praying to God.

A stall worker prepares roast chicken wings to be sold at a Ramadan bazaar in downtown Kuala Lumpur, Malaysia for the breaking of the fast on September 5.

Indonesian men attend Friday prayers at the Istiqlal Mosque, the biggest in Southeast Asia, in Jakarta, Indonesia on Friday, September 12.

A small Kashmiri boy stands amidst praying men inside the Grand Mosque in Srinagar, Kashmir on September 5.

THE SIZE OF THE PHOTOS HERE DOES NOT CAPTURE THEIR TRUE BEAUTY AND POWER. VIEW THEM AT FULL SIZE HERE.

Tuesday, September 23, 2008

Islamic Finance, Interest, and Efficiency: An Analytical Overview

Riba...

Iqtisaduna (Our Economics), a book by Grand Ayatollah Sayyid Muhammad Baqir al-Sadr, the martyr

Riba, Efficiency, and Prudential Regulation: Preliminary Thoughts

By Mohammad Fadel
University of Toronto, Faculty of Law

Wisconsin International Law Journal (forthcoming 2008)
Islamic Law and Law of the Muslim World Paper

ABSTRACT: Recent years have witnessed the rapid growth of 'Islamic finance.' Islamic finance distinguishes itself from conventional finance through its adherence to distinctively Islamic commercial prohibitions, most famously the prohibition against riba. Although riba is commonly equated with interest, Islamic law does not condemn all types of interest. It is the ambiguous relationship of riba to interest that explains the paradoxical nature of Islamic finance: even as it condemns lending with interest, it endorses transactions that replicate the economics of interest-based lending, giving rise to the phenomenon of shari'a arbitrage. Islamic finance is thus a systematic strategy to exploit unresolved tensions within Islamic law regarding riba. This paper explores the legal puzzles that arise out of the various doctrines of riba in Islamic law and suggests that the riba-based prohibitions can only be understood against a background rule that generally privileged market pricing mechanisms. From the perspective of contractual freedom, it is possible to break down riba into two sets of doctrines: ex ante prohibitions and ex post prohibitions. Only the prohibitions that deal with bankrupt debtors should be understood as categorical, while the ex ante riba-based prohibitions are best understood as prophylactic or prudential measures that function as price-fixing measures in times of scarcity, tending to reinforce a baseline distribution of entitlements guaranteed by the system of zakat, a tax-and-transfer system that guaranteed all individuals a year's worth of provisions. Because the prohibition against interest-based lending is also a type of ex ante restriction on market pricing mechanisms, it follows that it should also be viewed as a prudential rule rather than a categorical one, thereby vitiating the need to engage in complex restructuring of conventional financial instruments to assure their consistency with Islamic law.

Mohammad Fadel is a professor at the University of Toronto Faculty of Law. He earned a B.A. in government and foreign affairs, a Ph.D. in Near Eastern Languages and Civilizations from the University of Chicago, and a J.D. from the University of Virginia. He is a past articles development editor of the Virginia Law Review and a former John M. Olin Law and Economics Scholar at the University of Virginia. He also served as a law clerk for Judge Paul V. Niemeyer of the United States Court of Appeals for the Fourth Circuit.


READ THE FULL ARTICLE HERE

Monday, September 22, 2008

Revealing the Real Zardari: Pakistan's New President Nicknamed "Mr. 10%" for Corruption

Asif 'Ali Zardari is the widower of Benazir Bhutto, the former prime minister of Pakistan who was assassinated in December 2007. He is the latest in a long dynastic line stretching from his wife back to her father, Zulfiqar 'Ali Bhutto, another former prime minister. Contrary to many baseless claims made in the U.S. media following Benazir Bhutto's death, she and her family are far from examples of "democracy." The Bhutto family's political party, the so-called "Pakistan People's Party," is known for its corruption and undemocratic leadership structure. Case in point: there has NEVER been an internal election of any kind to decide the party leader. The leadership began with Zulfiqar, passed to his daughter Benazir after his overthrow in 1977 and execution in April 1979, and now rests with her son, Bilawal Bhutto Zardari, and her husband/his father, Asif 'Ali Zardari.

"Mr. Ten Percent:" Bhutto's Party Picks Her Widower, Son as Successor

MotherJones Blog, December 2007

Here's the NY Times ten years ago, on January 8, 1998, on the alleged extraordinary corruption of Benazir Bhutto's husband, Asif Ali Zardari, named today as the caretaker co-chair of the Pakistan People's Party until their 19-year-old son Bilawal is old enough to take over.

A decade after she led this impoverished nation from military rule to democracy, Benazir Bhutto is at the heart of a widening corruption inquiry that Pakistani investigators say has traced more than $100 million to foreign bank accounts and properties controlled by Ms. Bhutto's family.
Starting from a cache of Bhutto family documents bought for $1 million from a shadowy intermediary, the investigators have detailed a pattern of secret payments by foreign companies that sought favors during Ms. Bhutto's two terms as Prime Minister.
The documents leave uncertain the degree of involvement by Ms. Bhutto, a Harvard graduate whose rise to power in 1988 made her the first woman to lead a Muslim country. But they trace the pervasive role of her husband, Asif Ali Zardari, who turned his marriage to Ms. Bhutto into a source of virtually unchallengeable power.
In 1995, a leading French military contractor, Dassault Aviation, agreed to pay Mr. Zardari and a Pakistani partner $200 million for a $4 billion jet fighter deal that fell apart only when Ms. Bhutto's Government was dismissed. In another deal, a leading Swiss company hired to curb customs fraud in Pakistan paid millions of dollars between 1994 and 1996 to offshore companies controlled by Mr. Zardari and Ms. Bhutto's widowed mother, Nusrat.
In the largest single payment investigators have discovered, a gold bullion dealer in the Middle East was shown to have deposited at least $10 million into an account controlled by Mr. Zardari after the Bhutto Government gave him a monopoly on gold imports that sustained Pakistan's jewelry industry. The money was deposited into a Citibank account in the United Arab Emirate of Dubai, one of several Citibank accounts for companies owned by Mr. Zardari.
Together, the documents provided an extraordinarily detailed look at high-level corruption in Pakistan, a nation so poor that perhaps 70 percent of its 130 million people are illiterate, and millions have no proper shelter, no schools, no hospitals, not even safe drinking water. During Ms. Bhutto's five years in power, the economy became so enfeebled that she spent much of her time negotiating new foreign loans to stave off default on $62 billion in public debt. ...
The documents [obtained by the NYT] included: statements for several accounts in Switzerland, including the Citibank accounts in Dubai and Geneva; letters from executives promising payoffs, with details of the percentage payments to be made; memorandums detailing meetings at which these "commissions" and "remunerations" were agreed on, and certificates incorporating the offshore companies used as fronts in the deals, many registered in the British Virgin Islands.
The documents also revealed the crucial role played by Western institutions. Apart from the companies that made payoffs, and the network of banks that handled the money -- which included Barclay's Bank and Union Bank of Switzerland as well as Citibank -- the arrangements made by the Bhutto family for their wealth relied on Western property companies, Western lawyers and a network of Western friends. ...

"Mr. 10% running it until the boy king comes of age," notes a former US official who served in Pakistan. "So much for democracy."

Sunday, September 21, 2008

Is Google Making Us Stupid?


Is Google Making Us Stupid? What the Internet is Doing to Our Brains

By Nicholas Carr
The Atlantic Monthly [July/August 2008]

"Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial “ brain. “Dave, my mind is going,” HAL says, forlornly. “I can feel it. I can feel it.”

I can feel it, too. Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)

For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”

Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”

Anecdotes alone don’t prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited. They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it. The authors of the study report:

It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.

Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.

Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.

But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “‘thoughts’ in music and language often depend on the quality of pen and paper.”

“You are right,” Nietzsche replied, “our writing equipment takes part in the forming of our thoughts.” Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.”

The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”

As we use what the sociologist Daniel Bell has called our “intellectual technologies”—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.”

The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.” In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.

The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.

The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that’s what we’re seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.

When the Net absorbs a medium, that medium is re-created in the Net’s image. It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration.

The Net’s influence doesn’t end at the edges of a computer screen, either. As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the “shortcuts” would give harried readers a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.

Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that’s been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure.

About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant’s machinists. With the approval of Midvale’s owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an “algorithm,” we might say today—for how each worker should work. Midvale’s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared.

More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor’s tight industrial choreography—his “system,” as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.” Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. “In the past the man has been first,” he declared; “in the future the system must be first.”

Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method”—the perfect algorithm—to carry out every mental movement of what we’ve come to describe as “knowledge work.”

Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.

The company has declared that its mission is “to organize the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.” In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers.

Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. “The ultimate search engine is something as smart as people—or smarter,” Page said in a speech a few years back. “For us, working on search is a way to work on artificial intelligence.” In a 2004 interview with Newsweek, Brin said, “Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.” Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.”

Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt’s words, “to solve problems that have never been solved before,” and artificial intelligence is the hardest problem out there. Why wouldn’t Brin and Page want to be the ones to crack it?

Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.

The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.

Maybe I’m just a worrywart. Just as there’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine. In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).

The arrival of Gutenberg’s printing press, in the 15th century, set off another round of teeth gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men “less studious” and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, “Most of the arguments made against the printing press were correct, even prescient.” But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.

So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.

If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake:

I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and “cathedral-like” structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the “instantly available.”

As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”

I’m haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer’s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—“I can feel it. I can feel it. I’m afraid”—and its final reversion to what can only be called a state of innocence. HAL’s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

Wednesday, September 17, 2008

The New Humanitarian Order: Darfur


The New Humanitarian Order

By Mahmoud Mamdani
The Nation [September 10, 2008]

Excerpt from Prof. Mamdani's book, Saviors and Survivors: Darfur, Politics and the War on Terror, forthcoming from Pantheon in January 2009.

On July 14, after much advance publicity and fanfare, the prosecutor of the International Criminal Court applied for an arrest warrant for the president of Sudan, Omar Hassan Ahmad al-Bashir, on charges that included genocide, crimes against humanity and war crimes. Important questions of fact arise from the application as presented by the prosecutor. But even more important is the light this case sheds on the politics of the "new humanitarian order."

The conflict in Darfur began as a civil war in 1987-89, before Bashir and his group came to power. It was marked by indiscriminate killing and mass slaughter on both sides. The language of genocide was first employed in that conflict. The Fur representative at the May 1989 reconciliation conference in El Fasher pointed to their adversaries and claimed that "the aim is a total holocaust and no less than the complete annihilation of the Fur people and all things Fur." In response the Arab representative traced the origin of the conflict to "the end of the '70s when...the Arabs were depicted as foreigners who should be evicted from this area of Dar Fur."

The ICC prosecutor, Luis Moreno-Ocampo, has uncritically taken on the point of view of one side in this conflict, a side that was speaking of a "holocaust" before Bashir came to power, and he attributes far too much responsibility for the killing to Bashir alone. He goes on to speak of "new settlers" in today's Darfur, suggesting that he has internalized this partisan perspective.

At the same time, the prosecutor speaks in ignorance of history: "AL BASHIR...promoted the idea of a polarization between tribes aligned with him, whom he labeled 'Arabs' and...the Fur, Masalit and Zaghawa...derogatory [sic] referred to as 'Zurgas' or 'Africans'." The racialization of identities in Darfur has its roots in the British colonial period. As early as the late 1920s, the British tried to organize two confederations in Darfur: one Arab, the other black (Zurga). Racialized identities were incorporated into the census and provided the frame for government policy. It is not out of the blue that the two sides in the 1987-89 civil war described themselves as Arab and Zurga. If anything, the evidence shows that successive Sudanese governments--Bashir's included--looked down on all Darfuris, non-Arab Zurga as well as Arab nomads.

Having falsely attributed to Bashir the racialization of the conflict, Moreno-Ocampo focuses on two consequences of the conflict in Darfur: ethnic cleansing through land-grabbing and atrocities in the camps. He attributes both to Bashir. He is again wrong. The land-grabbing has been a consequence of three different, if related, causes. The first is the colonial system, which reorganized Darfur as a series of tribal homelands, designating the largest for settled peasant tribes and none for fully nomadic tribes. The second is environmental degradation: according to the United Nations Environment Program, the Sahara expanded by 100 kilometers in four decades; this process reached a critical point in the mid-1980s, pushing all tribes of North Darfur, Arab and non-Arab, farther south, onto more fertile Fur and Masalit lands. This in turn led to a conflict between tribes with homelands and those without them. The imperative of sheer survival explains in part the unprecedented brutality of the violence in every successive war since 1987-89. The third cause came last: the brutal counterinsurgency unleashed by the Bashir regime in 2003-04 in response to an insurgency backed up by peasant tribes.

It is not just the early history of the conflict that the prosecutor is poorly informed about. In his eagerness to build a case, Moreno-Ocampo glosses over recent history as well. He charges Bashir with following up the mass slaughter of 2003-04 with attrition by other means in the camps: "He did not need bullets. He used other weapons: rape, hunger and fear." This claim flies in the face of evidence from UN sources in Darfur, quoted by Julie Flint in the London Independent, that the death rate in the camps came down to around 200 a month from early 2005, less than in South Sudan or in the poor suburbs of Khartoum.

The point of the prosecutor's case is to connect all consequences in Darfur to a single cause: Bashir. Moreno-Ocampo told journalists in The Hague, "What happened in Darfur is a consequence of Bashir's will." The prosecution of Bashir comes across as politicized justice. As such, it will undermine the legitimacy of the ICC and almost certainly will not help solve the crisis in Darfur. It is perhaps understandable that a prosecutor in a rush would gloss over all evidence that might undermine his case. But we must not. A workable solution to the conflict requires that all its causes be understood in their full complexity.

Darfur was the site of mass deaths in 2003-04. World Health Organization sources--still the most reliable available information on mortality levels then--trace these deaths to two major causes: roughly 80 percent to drought-related diarrhea and 20 percent to direct violence. There is no doubt that the perpetrators of violence should be held accountable, but when and how are political decisions that cannot belong to the ICC prosecutor. More than the innocence or guilt of the president of Sudan, it is the relationship between law and politics--including the politicization of the ICC--that poses a wider issue, one of greatest concern to African governments and peoples.

The New Humanitarian Order

When World War II broke out, the international order could be divided into two unequal parts: one privileged, the other subjugated; one a system of sovereign states in the Western Hemisphere, the other a colonial system in most of Africa, Asia and the Middle East.

Postwar decolonization recognized former colonies as states, thereby expanding state sovereignty as a global principle of relations between states. The end of the cold war has led to another basic shift, heralding an international humanitarian order that promises to hold state sovereignty accountable to an international human rights standard. Many believe that we are in the throes of a systemic transition in international relations.

The standard of responsibility is no longer international law; it has shifted, fatefully, from law to rights. As the Bush Administration made patently clear at the time of the invasion of Iraq, humanitarian intervention does not need to abide by the law. Indeed, its defining characteristic is that it is beyond the law. It is this feature that makes humanitarian intervention the twin of the "war on terror."

This new humanitarian order, officially adopted at the UN's 2005 World Summit, claims responsibility for the protection of vulnerable populations. That responsibility is said to belong to "the international community," to be exercised in practice by the UN, and in particular by the Security Council, whose permanent members are the great powers. This new order is sanctioned in a language that departs markedly from the older language of law and citizenship. It describes as "human" the populations to be protected and as "humanitarian" the crisis they suffer from, the intervention that promises to rescue them and the agencies that seek to carry out intervention. Whereas the language of sovereignty is profoundly political, that of humanitarian intervention is profoundly apolitical, and sometimes even antipolitical. Looked at closely and critically, what we are witnessing is not a global but a partial transition. The transition from the old system of sovereignty to a new humanitarian order is confined to those states defined as "failed" or "rogue" states. The result is once again a bifurcated system, whereby state sovereignty obtains in large parts of the world but is suspended in more and more countries in Africa and the Middle East.

The Westphalian coin of state sovereignty is still the effective currency in the international system. It is worth looking at both sides of this coin: sovereignty and citizenship. If "sovereignty" remains the password to enter the passageway of international relations, "citizenship" still confers membership in the sovereign national political (state) community. Sovereignty and citizenship are not opposites; they go together. The state, after all, embodies the key political right of citizens: the right of collective self-determination.

The international humanitarian order, in contrast, does not acknowledge citizenship. Instead, it turns citizens into wards. The language of humanitarian intervention has cut its ties with the language of citizen rights. To the extent the global humanitarian order claims to stand for rights, these are residual rights of the human and not the full range of rights of the citizen. If the rights of the citizen are pointedly political, the rights of the human pertain to sheer survival; they are summed up in one word: protection. The new language refers to its subjects not as bearers of rights--and thus active agents in their emancipation--but as passive beneficiaries of an external "responsibility to protect." Rather than rights-bearing citizens, beneficiaries of the humanitarian order are akin to recipients of charity. Humanitarianism does not claim to reinforce agency, only to sustain bare life. If anything, its tendency is to promote dependence. Humanitarianism heralds a system of trusteeship.

It takes no great intellectual effort to recognize that the responsibility to protect has always been the sovereign's obligation. It is not that a new principle has been introduced; rather, its terms have been radically altered. To grasp this shift, we need to ask: who has the responsibility to protect whom, under what conditions and toward what end?

The era of the international humanitarian order is not entirely new. It draws on the history of modern Western colonialism. At the outset of colonial expansion in the eighteenth and nineteenth centuries, leading Western powers--Britain, France, Russia--claimed to protect "vulnerable groups." When it came to countries controlled by rival powers, such as the Ottoman Empire, Western powers claimed to protect populations they considered vulnerable, mainly religious minorities like specific Christian denominations and Jews. In lands not yet colonized by any power, like South Asia and large parts of Africa, they highlighted local atrocities--such as female infanticide and suttee in India, and slavery in Africa--and pledged to protect victims from their rulers.

From this history was born the international regime of trusteeship exercised under the League of Nations. The League's trust territories were mainly in Africa and the Middle East. They were created at the end of World War I, when colonies of defeated imperial powers (the Ottoman Empire, Germany and Italy) were handed over to the victorious powers, who pledged to administer them as guardians would administer wards, under the watchful eye of the League of Nations.

One of these trust territories was Rwanda, administered as a trust of Belgium until the 1959 Hutu Revolution. It was under the benevolent eye of the League of Nations that Belgium hardened Hutu and Tutsi into racialized identities, using the force of law to institutionalize an official system of discrimination between them. Thereby, Belgian colonialism laid the institutional groundwork for the genocide that followed half a century later. The Western powers that constituted the League of Nations could not hold Belgium accountable for the way it exercised an international trust, for one simple reason: to do so would have been to hold a mirror up to their own colonial record. Belgian rule in Rwanda was but a harder version of the indirect rule practiced to one degree or another by all Western powers in Africa. This system did not simply deny sovereignty to its colonies; it redesigned the administrative and political life of colonies by bringing each under a regime of group identity and rights. Belgian rule in Rwanda may have been an extreme version of colonialism, but it certainly was not exceptional.

Given the record of the League of Nations, it is worth asking how the new international regime of trusteeship would differ from the old one. What are the likely implications of the absence of citizenship rights at the core of this new system? Why would a regime of trusteeship not degenerate yet again into one of lack of accountability and responsibility?

On the face of it, these two systems--one defined by sovereignty and citizenship, the other by trusteeship and wardship--would seem to be contradictory rather than complementary. In practice, however, they are two parts of a bifurcated international system. One may ask how this bifurcated order is reproduced without the contradiction being flagrantly obvious, without it appearing like a contemporary version of the old colonial system of trusteeship. A part of the explanation lies in how power has managed to subvert the language of violence and war to serve its own claims.

Subverting the Language of Genocide

War has long ceased to be a direct confrontation between the armed forces of two states. As became clear during the confrontation between the Allied and the Axis powers in World War II, in America's Indochina War in the 1960s and '70s, its Gulf War in 1991 and then again in its 2003 invasion of Iraq, states do not just target the armed forces of adversary states; they target society itself: war-related industry and infrastructure, economy and work force, and sometimes, as in the aerial bombardment of cities, the civilian population in general. The trend is for political violence to become generalized and indiscriminate. Modern war is total war.

This development in the nature of modern war has tended to follow an earlier development of counterinsurgency in colonial contexts. Faced with insurgent guerrillas who were simply armed civilians, colonial powers targeted the populations of occupied territories. When Mao Zedong wrote that guerrillas must be as fish in water, American counterinsurgency theorist Samuel Huntington, writing during the Vietnam War, responded that the object of counterinsurgency must be to drain the water and isolate the fish. But the practice is older than post-World War II counterinsurgency. It dates back to the earliest days of modernity, to settler-colonial wars against American Indians in the decades and centuries that followed 1492. Settler America pioneered the practice of interning civilian populations in what Americans called "reservations" and the British called "reserves," a technology the Nazis would later develop into an extreme form called concentration camps. Often thought of as a British innovation put into effect during the late-nineteenth-century Boer War in South Africa, the practice of concentrating and interning populations in colonial wars was in origin an American settler contribution to the development of modern war.

The regime identified with the international humanitarian order makes a sharp distinction between genocide and other kinds of mass violence. The tendency is to be permissive of insurgency (liberation war), counterinsurgency (suppression of civil war or of rebel/revolutionary movements) and inter-state war as integral to the exercise of national sovereignty. Increasingly, they are taken as an inevitable if regrettable part of defending or asserting national sovereignty, domestically or internationally--but not genocide.

What, then, is the distinguishing feature of genocide? It is clearly not extreme violence against civilians, for that is very much a feature of both counterinsurgency and interstate war in these times. Only when extreme violence targets for annihilation a civilian population that is marked off as different "on grounds of race, ethnicity or religion" is that violence termed genocide. It is this aspect of the legal definition that has allowed "genocide" to be instrumentalized by big powers so as to target those newly independent states that they find unruly and want to discipline. More and more, universal condemnation is reserved for only one form of mass violence--genocide--as the ultimate crime, so much so that counterinsurgency and war appear to be normal developments. It is genocide that is violence run amok, amoral, evil. Counterinsurgency and war, in other words, count as normal violence; genocide alone is the bad violence. Thus the tendency to call for "humanitarian intervention" only where mass slaughter is named "genocide."

Given that the nature of twentieth-century "indirect rule" colonialism shaped the nature of administrative power along "tribal" (or ethnic) lines, it is not surprising that the exercise of power and responses to it tend to take "tribal" forms in newly independent states. From this point of view, there is little to distinguish between mass violence unleashed against civilians in Congo, northern Uganda, Mozambique, Angola, Darfur, Sierra Leone, Liberia, Ivory Coast and so on. So which ones are to be named "genocide" and which ones are not? Most important, who decides?

There is nothing new in legal concepts being used to serve the expedience of great powers. What is new about the "war on terror" is that action against certain forms of violence is simultaneously being moralized and legally deregulated. Is it then surprising that these very developments have led to violence run amok, as in Iraq after 2003 or, indeed, in Bashir's own little war on terror in Darfur in 2003-04? As the new humanitarian order does away with legal limits to pre-emptive war--thus, to the global war on terror--it should not be surprising that counterinsurgency defines itself as a local war on terror.

The year 2003 saw the unfolding of two counterinsurgencies. One was in Iraq, and it grew out of foreign invasion. The other was in Darfur, and it grew as a response to an internal insurgency. The former involved a liberation war against a foreign occupation; the latter, a civil war in an independent state. True, if you were an Iraqi or a Darfuri, there was little difference between the brutality of the violence unleashed in either instance. Yet much energy has been invested in how to define the brutality in each instance: whether as counterinsurgency or as genocide. We have the astonishing spectacle of the state that has perpetrated the violence in Iraq, the United States, branding as genocidal the violence perpetrated by an adversary state, Sudan, in Darfur. Even more astonishing, we had a citizens' movement in America calling for a humanitarian intervention in Darfur while keeping mum about the violence in Iraq.

The International Criminal Court

The emphasis on big powers as the protectors of rights internationally is increasingly being twinned with an emphasis on big powers as enforcers of justice internationally. This much is clear from a critical look at the short history of the International Criminal Court.

The ICC was set up by treaty in Rome in 1998 to try the world's most heinous crimes: mass murder and other systematic abuses. The relationship between the ICC and successive US administrations is instructive: it began with Washington criticizing the ICC and then turning it into a useful tool. The effort has been bipartisan: the first attempts to weaken the ICC and to create US exemptions from an emerging regime of international justice were made by leading Democrats during the Clinton Administration.

Washington's concerns were spelled out in detail by a subsequent Republican ambassador to the UN, John Bolton: "Our main concern should be for our country's top civilian and military leaders, those responsible for our defense and foreign policy." Bolton went on to ask "whether the United States was guilty of war crimes for its aerial bombing campaigns over Germany and Japan in World War II" and answered in the affirmative: "Indeed, if anything, a straightforward reading of the language probably indicates that the court would find the United States guilty. A fortiori, these provisions seem to imply that the United States would have been guilty of a war crime for dropping atomic bombs on Hiroshima and Nagasaki. This is intolerable and unacceptable." He also aired the concerns of America's principal ally in the Middle East, Israel: "Thus, Israel justifiably feared in Rome that its preemptive strike in the Six-Day War almost certainly would have provoked a proceeding against top Israeli officials. Moreover, there is no doubt that Israel will be the target of a complaint concerning conditions and practices by the Israeli military in the West Bank and Gaza."

When it came to signing the treaty, Washington balked. Once it was clear that it would not be able to keep the ICC from becoming a reality, the Bush Administration changed tactics and began signing bilateral agreements with countries whereby both signatories would pledge not to hand over each other's nationals--even those accused of crimes against humanity--to the ICC. By mid-June 2003, the United States had signed such agreements with thirty-seven countries, starting with Sierra Leone, a site of massive atrocities.

The Bush Administration's next move was accommodation, made possible by the kind of pragmatism practiced by the ICC's leadership. The fact of mutual accommodation between the world's only superpower and an international institution still struggling to find its feet is clear if we take into account the four countries where the ICC has launched its investigations: Sudan, Uganda, Central African Republic and Congo. All are places where the United States has no major objection to the course charted by ICC investigations. Its name notwithstanding, the ICC is rapidly turning into a Western court to try African crimes against humanity. It has targeted governments that are US adversaries and ignored actions the United States doesn't oppose, like those of Uganda and Rwanda in eastern Congo, effectively conferring impunity on them.

If the ICC is accountable, it is to the Security Council, not the General Assembly. It is this relationship that India objected to when it--like the United States, China and Sudan--refused to sign the Rome Statute. India's primary objection was summed up by the Hindu, India's leading political daily, which argued that "granting powers to the Security Council to refer cases to the ICC, or to block them, was unacceptable, especially if its members were not all signatories to the treaty," for it "provided escape routes for those accused of serious crimes but with clout in the U.N. body." At the same time, "giving the Security Council power to refer cases from a non-signatory country to the ICC was against the Law of Treaties under which no country can be bound by the provisions of a treaty it has not signed."

The absence of formal political accountability has led to the informal politicization of the ICC. No one should be surprised that the United States used its position as the leading power in the Security Council to advance its bid to capture the ICC. This is how the Hindu summed up the US relationship to the court: "The wheeling-dealing by which the U.S. has managed to maintain its exceptionalism to the ICC while assisting 'to end the climate of impunity in Sudan' makes a complete mockery of the ideals that informed the setting up of a permanent international criminal court to try perpetrators of the gravest of crimes against humanity."

Law and Politics in Transitional Societies

Human rights fundamentalists argue for an international legal standard regardless of the political context of the country in question. Their point of view is bolstered by the widespread and understandable popular outrage, not just in the West but throughout Africa, against the impunity with which a growing number of regimes have been resorting to slaughter to brutalize their populations into silence. The realization that the ICC has tended to focus only on African crimes, and mainly on crimes committed by adversaries of the United States, has introduced a note of sobriety into the African discussion, raising concerns about a politicized justice and wider questions about the relationship between law and politics.

In no country is the distinction between legal and political issues self-evident. In a democracy, the domain of the legal is defined through the political process. What would happen if we privileged the legal over the political, regardless of context? The experience of a range of transitional societies--post-Soviet, postapartheid and postcolonial--suggests that such a fundamentalism would call into question their political existence. Several post-Soviet societies of Eastern Europe with a history of extensive informing, spying and compromising have decided either not to fully open secret police and Communist Party files or to do so at a snail's pace. Societies torn apart by civil war, like post-Franco Spain, have chosen amnesia over truth, for the simple reason that they have prioritized the need to forge a future over agreeing on the past. The contrast is provided by Bosnia and Rwanda, where the administration of justice became an international responsibility and the decision to detach war crimes from the underlying political reality has turned justice into a regime for settling scores.

Those who face human rights as the language of an externally driven "humanitarian intervention" have to contend with a legal regime where the content of human rights law is defined outside a political process--whether democratic or not--that includes them as formal participants. Particularly for those in Africa, the ICC heralds a regime of legal and political dependence, much as the postwar Bretton Woods institutions began to pioneer an international regime of economic dependence in the 1980s and '90s. The real danger of detaching the legal from the political regime and handing it over to human rights fundamentalists is that it will turn the pursuit of justice into revenge-seeking, thereby obstructing the search for reconciliation and a durable peace. Does that mean that the very notion of justice must be postponed as disruptive of peace? No.

Survivors' Justice

If peace and justice are to be complementary rather than conflicting objectives, we must distinguish victors' justice from survivors' justice: if one insists on distinguishing right from wrong, the other seeks to reconcile different rights. In a situation where there is no winner and thus no possibility of victors' justice, survivors' justice may indeed be the only form of justice possible.

If Nuremberg is the paradigm for victors' justice, South Africa's postapartheid transition is the paradigm for survivors' justice. The end of apartheid was driven by a key principle: forgive but do not forget. The first part of the compact was that the new power would forgive all past transgressions so long as they were publicly acknowledged as wrongs. There would be no prosecutions. The second was that there would be no forgetting and that henceforth rules of conduct must change, thereby ensuring a transition to a postapartheid order. It was South Africa's good fortune that its transition was in the main internally driven.

South Africa is not a solitary example but a prototype for conflicts raging across Africa about the shape of postcolonial political communities and the definition of membership in them. The agreement that ended the South Sudan war combined impunity for all participants with political reform. The same was true of the settlement ending Mozambique's civil war. Had the ICC been involved in these conflicts in the way it is now in Darfur, it is doubtful there would be peace in either place.


Mahmoud Mamdani is Herbert Lehman Professor of Government and Professor of Anthropology at Columbia University. Born in Uganda to South Asian parents, he is a noted expert on African studies and Muslim societies in Africa. He was recently named one of the top 100 most influential public intellectuals by Foreign Policy magazine.