
Archive for the ‘Guerra Fría’ Category

(Image: Library of Congress)

Obama: Ike Redivivus?

by Victor Davis Hanson

National Review Online March 11, 2014
In critique of the George W. Bush administration, and in praise of the perceived foreign-policy restraint of Obama’s first five years in the White House, a persistent myth has arisen that Obama is reminiscent of Eisenhower — in the sense of being a president who kept America out of other nations’ affairs and did not waste blood and treasure chasing imaginary enemies.

Doris Kearns Goodwin, Andrew Bacevich, Fareed Zakaria (“Why Barack Is like Ike”), and a host of others have made such romantic, but quite misleading, arguments about the good old days under the man they consider the last good Republican president.

Ike was no doubt a superb president. Yet while he could be sober and judicious in deploying American forces abroad, he was hardly the non-interventionist of our present fantasies, who is so frequently used and abused to score partisan political points.

There is a strange disconnect between Eisenhower’s supposed policy of restraint, especially in reference to the Middle East, and his liberal use of the CIA in covert operations. While romanticizing Ike, we often deplore the 1953 coup in Iran and the role of the CIA, but seem to forget that it was Ike who ordered the CIA intervention that helped to lead to the ouster of Mossadegh and to bring the Shah to absolute power. Ike thought that he saw threats to Western oil supplies, believed that Mossadegh was both unstable and a closet Communist, sensed the covert hand of the Soviet Union at work, was won over by the arguments of British oil politics, and therefore simply decided Mossadegh should go — and he did.

Ike likewise ordered the CIA-orchestrated removal of the leaders of Guatemala and the Congo. He bequeathed to JFK the plans for the Bay of Pigs invasion, which had been born on the former’s watch. His bare-faced lie that a U-2 spy plane had not been shot down in Russia did terrible damage to U.S. credibility at the time.

The Eisenhower administration formulated the domino theory, and Ike was quite logically the first U.S. president to insert American advisers into Southeast Asia, a move followed by a formal SEATO defense treaty to protect most of Southeast Asia from Communist aggression — one of the most interventionist commitments of the entire Cold War, which ended with over 58,000 Americans dead in Vietnam and helicopters fleeing from the rooftop of the U.S. embassy in Saigon.

Eisenhower’s “New Look” foreign policy of placing greater reliance on threats to use nuclear weapons, unleashing the CIA, and crafting new entangling alliances may have fulfilled its short-term aims of curbing the politically unpopular and costly use of conventional American troops overseas. Its long-term ramifications, however, became all too clear in the 1960s and 1970s. Mostly, Ike turned to reliance on nuke-rattling because of campaign promises to curb spending and balance the budget by cutting conventional defense forces — which earned him the fury of Generals Omar Bradley, Douglas MacArthur, and Matthew Ridgway.

In many ways, Eisenhower’s Mideast policy lapsed into incoherency, notably in the loud condemnation of the 1956 British-French operations in Suez (after Nasser had nationalized the Suez Canal), which otherwise might have weakened or toppled Nasser. This stance of Eisenhower’s (who was up for reelection) may have also contradicted prior tacit assurances to the British that the U.S. would in fact look the other way.

The unexpected American opposition eroded transatlantic relations for years and helped to topple the Eden government in Britain. Somehow all at once the U.S. found itself humiliating its two closest allies, empowering Nasser, and throwing its lot in with the Soviet Union and the oil blackmailers of Saudi Arabia — with ramifications for the ensuing decades.

Yet just two years later, Ike ordered 15,000 troops into Lebanon to prevent a coup and the establishment of an anti-Western government — precisely those anti-American forces that had been emboldened by the recent Suez victory of the pan-Arabist Nasser. We forget that Ike was nominated not just in opposition to the non-interventionist policies of Robert Taft, but also as an antidote to the purportedly milquetoast Truman administration, which had supposedly failed to confront global Communism and thereby “lost” much of Asia.

Eisenhower gave wonderful speeches about the need to curtail costly conventional forces and to avoid overseas commitments, but much of his defense strategy was predicated on a certain inflexible and dangerous reliance on nuclear brinksmanship. In 1952 he ran to the right of the departing Harry Truman on the Korean War, and unleashed Nixon to make the argument of Democratic neo-appeasement in failing to get China out of Korea. Yet when he assumed office, Eisenhower soon learned that hinting at the use of nuclear weapons did not change the deadlock near the 38th Parallel. Over 3,400 casualties (including perhaps over 800 dead) were incurred during the Eisenhower administration’s first six months. Yet the July 1953 ceasefire ended the war with roughly the same battlefield positions as when Ike entered office. Pork Chop Hill — long before John Kerry’s baleful notion about the last man to die in Vietnam — became emblematic of a futile battle on the eve of a negotiated stalemate.

Ike’s occasional opportunism certainly turned off more gifted field generals like Matthew Ridgway, who found it ironic that candidate Ike had cited a lack of American resolve to finish the Korean War with an American victory, only to institutionalize Ridgway’s much-criticized but understandable restraint after his near-miraculous restoration of South Korea. In addition, Ridgway deplored the dangerous false economy of believing that postwar conventional forces could be pruned while the U.S. could rely instead on threatening the use of nuclear weapons. He almost alone foresaw rightly that an emerging concept of mutually assured destruction would make the conventional Army and Marines as essential as ever.

As a footnote, Eisenhower helped to marginalize the career of Ridgway, the most gifted U.S. battlefield commander of his era. Ike bore grudges and was petty enough to write, quite untruthfully, that General James Van Fleet, not Ridgway, had recaptured Seoul — even though the former had not even yet arrived in the Korean theater. That unnecessary snub was reminiscent of another to his former patron George Marshall during the campaign of 1952. Ridgway, remember, would later talk Eisenhower out of putting more advisers into Vietnam.

The problem with the Obama administration is not that it does or does not intervene, given the differing contours of each crisis, but rather that it persists in giving loud sermons that bear no relationship to the actions that do or do not follow: red lines in Syria followed by Hamlet-like deliberations and acceptance of Putin’s bogus WMD removal plan; flip-flop-flip in Egypt; in Libya, lead from behind followed by Benghazi and chaos; deadlines and sanctions to no deadlines and no sanctions with Iran; reset reset with Russia; constant public scapegoating of his predecessors, especially Bush; missile defense and then no missile defense in Eastern Europe; Guantanamo, renditions, drones, and preventive detentions all bad in 2008 and apparently essential in 2009; civilian trials for terrorists and then not; and Orwellian new terms like overseas contingency operations, workplace violence, man-caused disasters, a secular Muslim Brotherhood, jihad as a personal journey, and a chief NASA mission being outreach to Muslims. We forget that the non-interventionist policies of Jimmy Carter abruptly ended with his bellicose “Carter Doctrine” — birthed after the Soviets invaded Afghanistan, American hostages were taken in Tehran and Khomeinists had taken power, China went into Vietnam, and Communist insurgencies swept Central America.

As for Dwight Eisenhower, of course he was an admirable and successful president who squared the circle of trying to contain expansionary Soviet and Chinese Communism at a time when the postwar American public was rightly tired of war, while balancing three budgets, building infrastructure, attempting to deal with civil rights, and promoting economic growth. Yet the Republican Ike continued for six months the identical Korean War policies of his unpopular Democratic predecessor Harry Truman, and helped to lay the foundation for the Vietnam interventions of his successors, Democrats John F. Kennedy and Lyndon Johnson. That the initial blow-ups in Korea and Vietnam bookended his own administration may have been a matter of luck, given his own similar interventionist Cold War policies.

Bush was probably no Ike (few are), and certainly Obama is not either. But to score contemporary political points against one and for the other by reinventing Eisenhower into a model non-interventionist is a complete distortion of history. So should we laugh or cry at the fantasies offered by Andrew Bacevich? He writes: “Remember the disorder that followed the Korean War? It was called the Eisenhower era, when budgets balanced, jobs were plentiful and no American soldiers died in needless wars.”

In fact, the post–Korean War “Eisenhower era” was characterized by only three balanced budgets (in at least one case with some budget gimmickry) out of the remaining seven Eisenhower years. In 1958 the unemployment rate spiked at over 7 percent for a steady six months. Bacevich’s simplistic notion that “jobs were plentiful” best applies to the first six months of 1953, when Ike entered office and, for the only time during his entire tenure, the jobless rate was below 3 percent — coinciding roughly with the last six months of fighting the Korean War. This was an age, remember, when we had not yet seen the West German, South Korean, and Japanese democratic and economic miracles (all eventually due to U.S. interventions and occupations), China and Russia were in ruins, Western Europe was still recovering from the war, Britain had gone on a nationalizing binge, and for a brief time the U.S. was largely resupplying the world, and mostly alone — almost entirely with its own oil, gas, and coal. Eisenhower’s term was characterized by intervention in Lebanon, fighting for stalemate in Korea, CIA-led coups and assassinations, the insertion of military advisers into Vietnam, new anti-Communist treaty entanglements to protect Southeast Asian countries, a complete falling out with our European allies, abject lies about spy flights over the Soviet Union, serial nuclear saber-rattling, and Curtis LeMay’s nuclear-armed overflights of the Soviet Union — in other words, the not-so-abnormal stuff of a Cold War presidency.

And the idea that, to quote from Doris Kearns Goodwin, Eisenhower “could then take enormous pride in the fact that not a single soldier had died in combat during his time” is, well, unhinged.

National Review Online contributor Victor Davis Hanson is a senior fellow at the Hoover Institution and the author, most recently, of The Savior Generals.


The New Yorker | January 23, 2014

(Image: still from “Dr. Strangelove”)

This month marks the fiftieth anniversary of Stanley Kubrick’s black comedy about nuclear weapons, “Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb.” Released on January 29, 1964, the film caused a good deal of controversy. Its plot suggested that a mentally deranged American general could order a nuclear attack on the Soviet Union, without consulting the President. One reviewer described the film as “dangerous … an evil thing about an evil thing.” Another compared it to Soviet propaganda. Although “Strangelove” was clearly a farce, with the comedian Peter Sellers playing three roles, it was criticized for being implausible. An expert at the Institute for Strategic Studies called the events in the film “impossible on a dozen counts.” A former Deputy Secretary of Defense dismissed the idea that someone could authorize the use of a nuclear weapon without the President’s approval: “Nothing, in fact, could be further from the truth.” (See a compendium of clips from the film.) When “Fail-Safe”—a Hollywood thriller with a similar plot, directed by Sidney Lumet—opened, later that year, it was criticized in much the same way. “The incidents in ‘Fail-Safe’ are deliberate lies!” General Curtis LeMay, the Air Force chief of staff, said. “Nothing like that could happen.”

The first casualty of every war is the truth—and the Cold War was no exception to that dictum. Half a century after Kubrick’s mad general, Jack D. Ripper, launched a nuclear strike on the Soviets to defend the purity of “our precious bodily fluids” from Communist subversion, we now know that American officers did indeed have the ability to start a Third World War on their own. And despite the introduction of rigorous safeguards in the years since then, the risk of an accidental or unauthorized nuclear detonation hasn’t been completely eliminated.

The command and control of nuclear weapons has long been plagued by an “always/never” dilemma. The administrative and technological systems that are necessary to insure that nuclear weapons are always available for use in wartime may be quite different from those necessary to guarantee that such weapons can never be used, without proper authorization, in peacetime. During the nineteen-fifties and sixties, the “always” in American war planning was given far greater precedence than the “never.” Through two terms in office, beginning in 1953, President Dwight D. Eisenhower struggled with this dilemma. He wanted to retain Presidential control of nuclear weapons while defending America and its allies from attack. But, in a crisis, those two goals might prove contradictory, raising all sorts of difficult questions. What if Soviet bombers were en route to the United States but the President somehow couldn’t be reached? What if Soviet tanks were rolling into West Germany but a communications breakdown prevented NATO officers from contacting the White House? What if the President were killed during a surprise attack on Washington, D.C., along with the rest of the nation’s civilian leadership? Who would order a nuclear retaliation then?

With great reluctance, Eisenhower agreed to let American officers use their nuclear weapons, in an emergency, if there were no time or no means to contact the President. Air Force pilots were allowed to fire their nuclear anti-aircraft rockets to shoot down Soviet bombers heading toward the United States. And about half a dozen high-level American commanders were allowed to use far more powerful nuclear weapons, without contacting the White House first, when their forces were under attack and “the urgency of time and circumstances clearly does not permit a specific decision by the President, or other person empowered to act in his stead.” Eisenhower worried that providing that sort of authorization in advance could make it possible for someone to do “something foolish down the chain of command” and start an all-out nuclear war. But the alternative—allowing an attack on the United States to go unanswered or NATO forces to be overrun—seemed a lot worse. Aware that his decision might create public unease about who really controlled America’s nuclear arsenal, Eisenhower insisted that his delegation of Presidential authority be kept secret. At a meeting with the Joint Chiefs of Staff, he confessed to being “very fearful of having written papers on this matter.”

President John F. Kennedy was surprised to learn, just a few weeks after taking office, about this secret delegation of power. “A subordinate commander faced with a substantial military action,” Kennedy was told in a top-secret memo, “could start the thermonuclear holocaust on his own initiative if he could not reach you.” Kennedy and his national-security advisers were shocked not only by the wide latitude given to American officers but also by the loose custody of the roughly three thousand American nuclear weapons stored in Europe. Few of the weapons had locks on them. Anyone who got hold of them could detonate them. And there was little to prevent NATO officers from Turkey, Holland, Italy, Great Britain, and Germany from using them without the approval of the United States.

In December, 1960, fifteen members of Congress serving on the Joint Committee on Atomic Energy had toured NATO bases to investigate how American nuclear weapons were being deployed. They found that the weapons—some of them about a hundred times more powerful than the bomb that destroyed Hiroshima—were routinely guarded, transported, and handled by foreign military personnel. American control of the weapons was practically nonexistent. Harold Agnew, a Los Alamos physicist who accompanied the group, was especially concerned to see German pilots sitting in German planes that were decorated with Iron Crosses—and carrying American atomic bombs. Agnew, in his own words, “nearly wet his pants” when he realized that a lone American sentry with a rifle was all that prevented someone from taking off in one of those planes and bombing the Soviet Union.

* * *

The Kennedy Administration soon decided to put locking devices inside NATO’s nuclear weapons. The coded electromechanical switches, known as “permissive action links” (PALs), would be placed on the arming lines. The weapons would be inoperable without the proper code—and that code would be shared with NATO allies only when the White House was prepared to fight the Soviets. The American military didn’t like the idea of these coded switches, fearing that mechanical devices installed to improve weapon safety would diminish weapon reliability. A top-secret State Department memo summarized the view of the Joint Chiefs of Staff in 1961: “all is well with the atomic stockpile program and there is no need for any changes.”

After a crash program to develop the new control technology, during the mid-nineteen-sixties, permissive action links were finally placed inside most of the nuclear weapons deployed by NATO forces. But Kennedy’s directive applied only to the NATO arsenal. For years, the Air Force and the Navy blocked attempts to add coded switches to the weapons solely in their custody. During a national emergency, they argued, the consequences of not receiving the proper code from the White House might be disastrous. And locked weapons might play into the hands of Communist saboteurs. “The very existence of the lock capability,” a top Air Force general claimed, “would create a fail-disable potential for knowledgeable agents to ‘dud’ the entire Minuteman [missile] force.” The Joint Chiefs thought that strict military discipline was the best safeguard against an unauthorized nuclear strike. A two-man rule was instituted to make it more difficult for someone to use a nuclear weapon without permission. And a new screening program, the Human Reliability Program, was created to stop people with emotional, psychological, and substance-abuse problems from gaining access to nuclear weapons.

Despite public assurances that everything was fully under control, in the winter of 1964, while “Dr. Strangelove” was playing in theatres and being condemned as Soviet propaganda, there was nothing to prevent an American bomber crew or missile launch crew from using their weapons against the Soviets. Kubrick had researched the subject for years, consulted experts, and worked closely with a former R.A.F. pilot, Peter George, on the screenplay of the film. George’s novel about the risk of accidental nuclear war, “Red Alert,” was the source for most of “Strangelove” ’s plot. Unbeknownst to both Kubrick and George, a top official at the Department of Defense had already sent a copy of “Red Alert” to every member of the Pentagon’s Scientific Advisory Committee for Ballistic Missiles. At the Pentagon, the book was taken seriously as a cautionary tale about what might go wrong. Even Secretary of Defense Robert S. McNamara privately worried that an accident, a mistake, or a rogue American officer could start a nuclear war.

Coded switches to prevent the unauthorized use of nuclear weapons were finally added to the control systems of American missiles and bombers in the early nineteen-seventies. The Air Force was not pleased, and considered the new security measures to be an insult, a lack of confidence in its personnel. Although the Air Force now denies this claim, according to more than one source I contacted, the code necessary to launch a missile was set to be the same at every Minuteman site: 00000000.

* * *

The early permissive action links were rudimentary. Placed in NATO weapons during the nineteen-sixties and known as Category A PALs, the switches relied on a split four-digit code, with ten thousand possible combinations. If the United States went to war, two people would be necessary to unlock a nuclear weapon, each of them provided with half the code. Category A PALs were useful mainly to delay unauthorized use, to buy time after a weapon had been taken or to thwart an individual psychotic hoping to cause a large explosion. A skilled technician could open a stolen weapon and unlock it within a few hours. Today’s Category D PALs, installed in the Air Force’s hydrogen bombs, are more sophisticated. They require a six-digit code, with a million possible combinations, and have a limited-try feature that disables a weapon when the wrong code is repeatedly entered.

The Air Force’s land-based Minuteman III missiles and the Navy’s submarine-based Trident II missiles now require an eight-digit code—which is no longer 00000000—in order to be launched. The Minuteman crews receive the code via underground cables or an aboveground radio antenna. Sending the launch code to submarines deep underwater presents a greater challenge. Trident submarines contain two safes. One holds the keys necessary to launch a missile; the other holds the combination to the safe with the keys; and the combination to the safe holding the combination must be transmitted to the sub by very-low-frequency or extremely-low-frequency radio. In a pinch, if Washington, D.C., has been destroyed and the launch code doesn’t arrive, the sub’s crew can open the safes with a blowtorch.

The security measures now used to control America’s nuclear weapons are a vast improvement over those of 1964. But, like all human endeavors, they are inherently flawed. The Department of Defense’s Personnel Reliability Program is supposed to keep people with serious emotional or psychological issues away from nuclear weapons—and yet two of the nation’s top nuclear commanders were recently removed from their posts. Neither appears to be the sort of calm, stable person you want with a finger on the button. In fact, their misbehavior seems straight out of “Strangelove.”

Vice Admiral Tim Giardina, the second-highest-ranking officer at the U.S. Strategic Command—the organization responsible for all of America’s nuclear forces—was investigated last summer for allegedly using counterfeit gambling chips at the Horseshoe Casino in Council Bluffs, Iowa. According to the Iowa Division of Criminal Investigation, “a significant monetary amount” of counterfeit chips was involved. Giardina was relieved of his command on October 3, 2013. A few days later, Major General Michael Carey, the Air Force commander in charge of America’s intercontinental ballistic missiles, was fired for conduct “unbecoming an officer and a gentleman.” According to a report by the Inspector General of the Air Force, Carey had consumed too much alcohol during an official trip to Russia, behaved rudely toward Russian officers, spent time with “suspect” young foreign women in Moscow, loudly discussed sensitive information in a public hotel lounge there, and drunkenly pleaded to get onstage and sing with a Beatles cover band at La Cantina, a Mexican restaurant near Red Square. Despite his requests, the band wouldn’t let Carey onstage to sing or to play the guitar.

While drinking beer in the executive lounge at Moscow’s Marriott Aurora during that visit, General Carey made an admission with serious public-policy implications. He off-handedly told a delegation of U.S. national-security officials that his missile-launch officers have the “worst morale in the Air Force.” Recent events suggest that may be true. In the spring of 2013, nineteen launch officers at Minot Air Force Base in North Dakota were decertified for violating safety rules and poor discipline. In August, 2013, the entire missile wing at Malmstrom Air Force Base in Montana failed its safety inspection. Last week, the Air Force revealed that thirty-four launch officers at Malmstrom had been decertified for cheating on proficiency exams—and that at least three launch officers are being investigated for illegal drug use. The findings of a report by the RAND Corporation, leaked to the A.P., were equally disturbing. The study found that the rates of spousal abuse and courts-martial among Air Force personnel with nuclear responsibilities are much higher than those among people with other jobs in the Air Force. “We don’t care if things go properly,” a launch officer told RAND. “We just don’t want to get in trouble.”

The most unlikely and absurd plot element in “Strangelove” is the existence of a Soviet “Doomsday Machine.” The device would trigger itself, automatically, if the Soviet Union were attacked with nuclear weapons. It was meant to be the ultimate deterrent, a threat to destroy the world in order to prevent an American nuclear strike. But the failure of the Soviets to tell the United States about the contraption defeats its purpose and, at the end of the film, inadvertently causes a nuclear Armageddon. “The whole point of the Doomsday Machine is lost,” Dr. Strangelove, the President’s science adviser, explains to the Soviet Ambassador, “if you keep it a secret!”

A decade after the release of “Strangelove,” the Soviet Union began work on the Perimeter system—a network of sensors and computers that could allow junior military officials to launch missiles without oversight from the Soviet leadership. Perhaps nobody at the Kremlin had seen the film. Completed in 1985, the system was known as the Dead Hand. Once it was activated, Perimeter would order the launch of long-range missiles at the United States if it detected nuclear detonations on Soviet soil and Soviet leaders couldn’t be reached. Like the Doomsday Machine in “Strangelove,” Perimeter was kept secret from the United States; its existence was not revealed until years after the Cold War ended.

In retrospect, Kubrick’s black comedy provided a far more accurate description of the dangers inherent in nuclear command-and-control systems than the ones that the American people got from the White House, the Pentagon, and the mainstream media.

“This is absolute madness, Ambassador,” President Merkin Muffley says in the film, after being told about the Soviets’ automated retaliatory system. “Why should you build such a thing?” Fifty years later, that question remains unanswered, and “Strangelove” seems all the more brilliant, bleak, and terrifyingly on the mark.

You can read Eric Schlosser’s guide to the long-secret documents that help explain the risks America took with its nuclear arsenal, and watch and read his deconstruction of clips from “Dr. Strangelove” and from a little-seen film about permissive action links.

Eric Schlosser is the author of “Command and Control.”


John F. Kennedy’s America

By Julián Casanova

El País | November 21, 2013


John F. Kennedy and his wife, Jackie, in Dallas moments before the assassination. / ken features

Martin Luther King wrote it in his autobiography: “While the question ‘Who killed President Kennedy?’ is important, the question ‘What killed him?’ is more important.”

In fact, 1963 was a year of numerous political assassinations in the United States, most of them of black leaders. And later in that decade Malcolm X was assassinated in Harlem, New York, on February 21, 1965, by one of his former followers, at a moment when he was breaking with the most radical leaders of his movement. On April 4, 1968, on the balcony outside his room at the Lorraine Motel in Memphis, Tennessee, a single shot ended the life of Martin Luther King. Two months later, on June 6, after a triumphant speech in California during his campaign for the Democratic presidential nomination, another assassin took the life of Senator Robert F. Kennedy. “I won’t vote,” one black New Yorker told a pollster. “They kill all the good men we have.”

It all happened very quickly, in a decade of mass protest and civil disobedience that preceded JFK’s assassination. The United States was then the world’s leading military and economic power, yet one in which racism still prevailed, a legacy of slavery that so rich and democratic a society had not managed to eliminate. Millions of Americans of races other than white ran up against acute discrimination in everyday life: at work, in education, in politics, and in the granting of legal rights.

Montgomery, Alabama, the former capital of the Confederacy during the Civil War of the 1860s, where King moved in October 1954 to take up his first job as pastor and preacher of a Baptist church, was an excellent example of how black life was governed by the arbitrary whims and wishes of white power. Most of its 50,000 black residents worked as servants for the white community of 70,000, and barely 2,000 of them could exercise the right to vote in elections. There, in Montgomery, in that small city of the Deep South where nothing seemed to move, things began to change on December 1, 1955.


Rosa Parks on a Montgomery bus. / AP

That afternoon, Rosa Parks, a 42-year-old seamstress, took the bus home from work, sat down in the seats reserved by law for whites, and, when the driver ordered her to give up her seat to a white man who was standing, refused. She said no because, as Martin Luther King later recalled, she could not bear any more humiliation; that is what “her sense of dignity and self-respect” demanded of her. Rosa Parks was arrested, and a spontaneous boycott began against the segregationist system that governed the city’s buses. One of its organizers, E. D. Nixon, asked the young Baptist pastor, still nearly new to the city, to join the protest. That was Martin Luther King’s baptism as leader of the civil-rights movement. A few days later, in a church packed with people, King stepped up to the pulpit and began “the most decisive speech” of his life. He told them that they were there because they were American citizens and loved democracy, that black people were tired “of being trampled by the iron feet of oppression,” and that they were prepared to fight and struggle “until justice rolls down like water.”

The thirteen months that the boycott lasted gave birth to a new social movement. Although its leaders were black preachers and, later, university students, its real strength came from its capacity to mobilize tens of thousands of black workers. A racial minority, dominated and nearly invisible, led a broad repertoire of protests (boycotts, marches on the jails, peaceful occupations of buildings) that exposed the hypocrisy of segregation and opened the way to a more democratic civic culture. Winning the vote for blacks would be, as Martin Luther King perceived from the start, “the key to the complete solution of the problem of the South.”

But freedom and dignity for millions of blacks could not be won without a fundamental challenge to the existing distribution of power. The strategy of nonviolent civil disobedience, preached and put into practice by Martin Luther King until his death, met many obstacles.


King addresses the crowd at the March on Washington, August 28, 1963. / france press

For John Fitzgerald Kennedy, winner of the presidential election of November 1960, the recognition of civil rights created numerous problems with white Southern congressmen, and he tried by every means to keep it from becoming the dominant issue of national politics. He did not succeed: before he was assassinated in Dallas, Texas, on November 22, 1963, the movement had spread to the major cities of the North and had staged a massive march on Washington in August of that year, the largest political demonstration in the history of the United States.

It was not all a bed of roses. The battle against racism filled with rancor and hatred, leaving hundreds dead and thousands wounded. Racial violence was not a new phenomenon in American society. But until the end of the Second World War that violence had been the work of armed white groups that attacked blacks, and of the Ku Klux Klan, the terrorist organization established in the South precisely to prevent black citizens from being granted legal rights. In the disturbances of the 1960s, by contrast, many blacks responded to discrimination and police repression by attacking white-owned property, setting fires, and looting. Official accounts and many newspapers blamed the violence and bloodshed on small groups of radical agitators, although later investigations revealed that most of the victims were blacks shot dead by government forces.

Amid so much violence, Martin Luther King’s peaceful strategy seemed to be faltering. And new black leaders with alternative visions arose to challenge it. The most charismatic was a man named Malcolm X, who as a child had watched the Ku Klux Klan burn down his house and kill his father, a Baptist preacher, and who had converted to Islam after a long spell in prison. He criticized the civil-rights movement, scorned the strategy of nonviolence, and carried on a bitter dispute with Martin Luther King, whom he called a “traitor to the black people.” King deplored his “demagogic oratory” and said he was convinced that it was that sick, deep racism that fed figures like Malcolm X. When Malcolm X was assassinated, King repeated once more that “violence and hatred only breed violence and hatred.”

Blacks knew very well what political assassinations were. When he came to power, John F. Kennedy did not know many black people. But he had to confront the problem, the most pressing one in American society. There were two Kennedys, as King also recalled: the one pressured and harried, during his first two years in office, by the uncertainty created by the hard-fought election campaign and his narrow margin of victory over Richard Nixon in 1960; and the one who, from 1963 on, had the courage to become a defender of civil rights.


Marines crossing a river in Vietnam, October 30, 1968. / keystone agency

But if all those conflicts over civil rights revealed some of the ailments of that society, foreign policy, from the Cuban missile crisis to the Vietnam War, brought to the surface the tensions inherent in Kennedy’s efforts to manage the empire. Kennedy decided to demonstrate American power to the world and began to turn Vietnam into the ideal territory on which to destroy the enemy. Kennedy did not live to see it, but the war that followed his death was the greatest disaster in twentieth-century U.S. history.

“We have created an atmosphere in which violence and hatred have become popular pastimes,” King wrote in the epitaph he dedicated to the president. Kennedy’s assassination killed not just a man but a host of “illusions.” When news of his death spread, in many places, amid the general mourning, the Dance of the Blessed Spirits was played. When King was assassinated, almost five years later, rage and violence spread in riots through more than a hundred cities, the bitter end of an era of dreams and hopes. His father, the Baptist preacher who had instilled in him the values of dignity and justice, said it: “It was hatred in this land that took my son from me.”

Julián Casanova, Professor of Contemporary History at the University of Zaragoza, holds, like Eric J. Hobsbawm, that historians are “the professional remembrancers of what their fellow-citizens wish to forget.” He is the author of some twenty books on anarchism, the Spanish Civil War, and the twentieth century.


Another point of view on JFK

Joseph Nye

El País, November 20, 2013

November 22 will mark 50 years since the assassination of President John F. Kennedy. It was one of those events so shocking that people who lived through it remember where they were when they heard the news. I was stepping off a train in Nairobi when I saw the dramatic headline. Kennedy was only 46 years old when Lee Harvey Oswald murdered him in Dallas. Oswald was a disgruntled former Marine who had defected to the Soviet Union. Although his life was riddled with illness, Kennedy projected an image of youth and vigor that made his death all the more dramatic and poignant.

Kennedy’s martyrdom led many Americans to elevate him to the rank of great presidents such as George Washington and Abraham Lincoln, but historians are more guarded in their assessments. His critics point to his sometimes reckless sexual conduct, his thin legislative record, and his failure to follow through on his words. Although Kennedy spoke of civil rights, tax cuts, and fighting poverty, it was his successor, Lyndon Johnson, who used Kennedy’s martyrdom (together with his own far superior political skills) to pass historic legislation on those issues.

In a 2009 poll of 65 specialists on the American presidency, JFK was ranked the sixth most important president, while in a recent survey by British experts in American politics Kennedy came in fifteenth. Such rankings are remarkable for a president who held office for less than three years. But what did Kennedy really achieve, and how different would history have been had he survived?

In my book Presidential Leadership and the Creation of the American Era, I classify presidents in two categories: transformational leaders, who define sweeping objectives and pursue a grand vision of major change; and transactional leaders, who focus above all on “practical” matters, making sure that things run smoothly (and correctly). As an activist and a gifted communicator with an inspirational style, Kennedy seemed to be a transformational president. He campaigned in 1960 on the promise to “get the country moving again.”

In his inaugural address, Kennedy called on Americans to sacrifice (“Ask not what your country can do for you; ask what you can do for your country”). He created programs such as the Peace Corps and the Alliance for Progress for Latin America, and he committed the country to putting a man on the moon by the end of the 1960s. Yet despite his activism and his rhetoric, Kennedy’s was a cautious rather than an ideological temperament. As the presidential historian Fred Greenstein noted, “Kennedy had very little global perspective.”

Rather than criticize Kennedy for not living up to his words, we should be grateful that in difficult situations he acted prudently and pragmatically rather than ideologically and transformationally. His most important achievement in his brief term was his handling of the 1962 Cuban missile crisis and the defusing of what was probably the most dangerous episode since the dawn of the nuclear age.

Kennedy can certainly be blamed for the disastrous Bay of Pigs invasion of Cuba and the subsequent Operation Mongoose, the CIA’s covert effort against the Castro regime, which convinced the Soviet Union that its ally was under threat. But Kennedy learned from his Bay of Pigs failure and created a careful process for managing the crisis that followed the Soviet Union’s placement of nuclear missiles in Cuba.

Many of Kennedy’s advisers, as well as American military leaders, wanted an air strike and an invasion, which we now know could have led Soviet commanders on the ground to use their tactical nuclear weapons. Instead, Kennedy bought time and kept his options open while negotiating a way out of the crisis with the Soviet leader, Nikita Khrushchev. Judging from the hawkish comments of his vice president at the time, Lyndon Johnson, the outcome would have been far worse had Kennedy not been president.

Kennedy also learned from the Cuban missile crisis itself: on June 10, 1963, he gave a speech aimed at easing Cold War tensions. “I speak of peace, therefore,” he said, “as the necessary rational end of rational men.” While a presidential vision of peace was nothing new, Kennedy followed through on it by negotiating the first nuclear arms-control agreement, the Partial Nuclear Test Ban Treaty.

The great unanswered question about Kennedy’s presidency, and about how his assassination affected American foreign policy, is what he would have done about the war in Vietnam. When Kennedy became president, the United States had a few hundred advisers in South Vietnam; that number grew to 16,000. Johnson ultimately raised American troop levels to more than 500,000.

Many Kennedy supporters maintain that he would never have made that mistake. Yet he had backed a coup to replace the president of South Vietnam, Ngo Dinh Diem, and he left Johnson a deteriorating situation and a group of advisers who recommended against withdrawal. Some fervent Kennedy loyalists (the historian Arthur Schlesinger and Kennedy’s speechwriter Theodore Sorensen, for example) have claimed that Kennedy planned to withdraw from Vietnam after winning re-election in 1964, and that he had mentioned the plan to Senator Mike Mansfield. Skeptics note, however, that Kennedy always spoke publicly of the need to stay in Vietnam. The question remains open.

In my view, Kennedy was a good president but not an extraordinary one. What set him apart was not only his ability to inspire others but his caution in making complex foreign-policy decisions. We were fortunate that, in foreign policy, he was more pragmatic than transformational. Our misfortune was to lose him after only a thousand days.

Joseph S. Nye is a professor at Harvard University and the author of Presidential Leadership and the Creation of the American Era.

Spanish translation by Kena Nequiz


Click here for a Spanish-language version of this article.

TomDispatch

Naming Our Nameless War 

How Many Years Will It Be?
By Andrew J. Bacevich

For well over a decade now the United States has been “a nation at war.” Does that war have a name?

It did at the outset.  After 9/11, George W. Bush’s administration wasted no time in announcing that the U.S. was engaged in a Global War on Terrorism, or GWOT.  With few dissenters, the media quickly embraced the term. The GWOT promised to be a gargantuan, transformative enterprise. The conflict begun on 9/11 would define the age. In neoconservative circles, it was known as World War IV.

Upon succeeding to the presidency in 2009, however, Barack Obama without fanfare junked Bush’s formulation (as he did again in a speech at the National Defense University last week).  Yet if the appellation went away, the conflict itself, shorn of identifying marks, continued.

Does it matter that ours has become and remains a nameless war? Very much so.

Names bestow meaning.  When it comes to war, a name attached to a date can shape our understanding of what the conflict was all about.  To specify when a war began and when it ended is to privilege certain explanations of its significance while discrediting others. Let me provide a few illustrations.

With rare exceptions, Americans today characterize the horrendous fraternal bloodletting of 1861-1865 as the Civil War.  Yet not many decades ago, diehard supporters of the Lost Cause insisted on referring to that conflict as the War Between the States or the War for Southern Independence (or even the War of Northern Aggression).  The South may have gone down in defeat, but the purposes for which Southerners had fought — preserving a distinctive way of life and the principle of states’ rights — had been worthy, even noble.  So at least they professed to believe, with their preferred names for the war reflecting that belief.

Schoolbooks tell us that the Spanish-American War began in April 1898 and ended in August of that same year.  The name and dates fit nicely with a widespread inclination from President William McKinley’s day to our own to frame U.S. intervention in Cuba as an altruistic effort to liberate that island from Spanish oppression.

Yet the Cubans were not exactly bystanders in that drama.  By 1898, they had been fighting for years to oust their colonial overlords.  And although hostilities in Cuba itself ended on August 12th, they dragged on in the Philippines, another Spanish colony that the United States had seized for reasons only remotely related to liberating Cubans.  Notably, U.S. troops occupying the Philippines waged a brutal war not against Spaniards but against Filipino nationalists no more inclined to accept colonial rule by Washington than by Madrid.  So widen the aperture to include this Cuban prelude and the Filipino postlude and you end up with something like this:  The Spanish-American-Cuban-Philippines War of 1895-1902.  Too clunky?  How about the War for the American Empire?  This much is for sure: rather than illuminating, the commonplace textbook descriptor serves chiefly to conceal.

Strange as it may seem, Europeans once referred to the calamitous events of 1914-1918 as the Great War.  When Woodrow Wilson decided in 1917 to send an army of doughboys to fight alongside the Allies, he went beyond Great.  According to the president, the Great War was going to be the War To End All Wars.  Alas, things did not pan out as he expected.  Perhaps anticipating the demise of his vision of permanent peace, War Department General Order 115, issued on October 7, 1919, formally declared that, at least as far as the United States was concerned, the recently concluded hostilities would be known simply as the World War.

In September 1939 — presto chango! — the World War suddenly became the First World War, the Nazi invasion of Poland having inaugurated a Second World War, also known as World War II or more cryptically WWII.  To be sure, Soviet dictator Josef Stalin preferred the Great Patriotic War. Although this found instant — almost unanimous — favor among Soviet citizens, it did not catch on elsewhere.

Does World War II accurately capture the events it purports to encompass?  With the crusade against the Axis now ranking alongside the crusade against slavery as a myth-enshrouded chapter in U.S. history to which all must pay homage, Americans are no more inclined to consider that question than to consider why a playoff to determine the professional baseball championship of North America constitutes a “World Series.”

In fact, however convenient and familiar, World War II is misleading and not especially useful.  The period in question saw at least two wars, each only tenuously connected to the other, each having distinctive origins, each yielding a different outcome.  To separate them is to transform the historical landscape.

On the one hand, there was the Pacific War, pitting the United States against Japan.  Formally initiated by the December 7, 1941, attack on Pearl Harbor, it had in fact begun a decade earlier when Japan embarked upon a policy of armed conquest in Manchuria.  At stake was the question of who would dominate East Asia.  Japan’s crushing defeat at the hands of the United States, sealed by two atomic bombs in 1945, answered that question (at least for a time).

Then there was the European War, pitting Nazi Germany first against Great Britain and France, but ultimately against a grand alliance led by the United States, the Soviet Union, and a fast fading British Empire.  At stake was the question of who would dominate Europe.  Germany’s defeat resolved that issue (at least for a time): no one would.  To prevent any single power from controlling Europe, two outside powers divided it.

This division served as the basis for the ensuing Cold War, which wasn’t actually cold, but also (thankfully) wasn’t World War III, the retrospective insistence of bellicose neoconservatives notwithstanding.  But when did the Cold War begin?  Was it in early 1947, when President Harry Truman decided that Stalin’s Russia posed a looming threat and committed the United States to a strategy of containment?  Or was it in 1919, when Vladimir Lenin decided that Winston Churchill’s vow to “strangle Bolshevism in its cradle” posed a looming threat to the Russian Revolution, with an ongoing Anglo-American military intervention evincing a determination to make good on that vow?

Separating the war against Nazi Germany from the war against Imperial Japan opens up another interpretive possibility.  If you incorporate the European conflict of 1914-1918 and the European conflict of 1939-1945 into a single narrative, you get a Second Thirty Years War (the first having occurred from 1618-1648) — not so much a contest of good against evil, as a mindless exercise in self-destruction that represented the ultimate expression of European folly.

So, yes, it matters what we choose to call the military enterprise we’ve been waging not only in Iraq and Afghanistan, but also in any number of other countries scattered hither and yon across the Islamic world.  Although the Obama administration appears no more interested than the Bush administration in saying when that enterprise will actually end, the date we choose as its starting point also matters.

Although Washington seems in no hurry to name its nameless war — and will no doubt settle on something self-serving or anodyne if it ever finally addresses the issue — perhaps we should jump-start the process.  Let’s consider some possible options, names that might actually explain what’s going on.

The Long War: Coined not long after 9/11 by senior officers in the Pentagon, this formulation never gained traction with either civilian officials or the general public.  Yet the Long War deserves consideration, even though — or perhaps because — it has lost its luster with the passage of time.

At the outset, it connoted grand ambitions buoyed by extreme confidence in the efficacy of American military might.  This was going to be one for the ages, a multi-generational conflict yielding sweeping results.

The Long War did begin on a hopeful note.  The initial entry into Afghanistan and then into Iraq seemed to herald “home by Christmas” triumphal parades.  Yet this soon proved an illusion as victory slipped from Washington’s grasp.  By 2005 at the latest, events in the field had dashed the neo-Wilsonian expectations nurtured back home.

With the conflicts in Iraq and Afghanistan dragging on, “long” lost its original connotation.  Instead of “really important,” it became a synonym for “interminable.”  Today, the Long War does succinctly capture the experience of American soldiers who have endured multiple combat deployments to Iraq and Afghanistan.

For Long War combatants, the object of the exercise has become to persist.  As for winning, it’s not in the cards. The Long War just might conclude by the end of 2014 if President Obama keeps his pledge to end the U.S. combat role in Afghanistan and if he avoids getting sucked into Syria’s civil war.  So the troops may hope.

The War Against Al-Qaeda: It began in August 1996 when Osama bin Laden issued a “Declaration of War against the Americans Occupying the Land of the Two Holy Places,” i.e., Saudi Arabia.  In February 1998, a second bin Laden manifesto announced that killing Americans, military and civilian alike, had become “an individual duty for every Muslim who can do it in any country in which it is possible to do it.”

Although President Bill Clinton took notice, the U.S. response to bin Laden’s provocations was limited and ineffectual.  Only after 9/11 did Washington take this threat seriously.  Since then, apart from a pointless excursion into Iraq (where, in Saddam Hussein’s day, al-Qaeda did not exist), U.S. attention has been focused on Afghanistan, where U.S. troops have waged the longest war in American history, and on Pakistan’s tribal borderlands, where a CIA drone campaign is ongoing.  By the end of President Obama’s first term, U.S. intelligence agencies were reporting that a combined CIA/military campaign had largely destroyed bin Laden’s organization.  Bin Laden himself, of course, was dead.

Could the United States have declared victory in its unnamed war at this point?  Perhaps, but it gave little thought to doing so.  Instead, the national security apparatus had already trained its sights on various al-Qaeda “franchises” and wannabes, militant groups claiming the bin Laden brand and waging their own version of jihad.  These offshoots emerged in the Maghreb, Yemen, Somalia, Nigeria, and — wouldn’t you know it — post-Saddam Iraq, among other places.  The question as to whether they actually posed a danger to the United States got, at best, passing attention — the label “al-Qaeda” eliciting the same sort of Pavlovian response that the word “communist” once did.

Americans should not expect this war to end anytime soon.  Indeed, the Pentagon’s impresario of special operations recently speculated — by no means unhappily — that it would continue globally for “at least 10 to 20 years.”   Freely translated, his statement undoubtedly means: “No one really knows, but we’re planning to keep at it for one helluva long time.”

The War For/Against/About Israel: It began in 1948.  For many Jews, the founding of the state of Israel signified an ancient hope fulfilled.  For many Christians, conscious of the sin of anti-Semitism that had culminated in the Holocaust, it offered a way to ease guilty consciences, albeit mostly at others’ expense.  For many Muslims, especially Arabs, and most acutely Arabs who had been living in Palestine, the founding of the Jewish state represented a grave injustice.  It was yet another unwelcome intrusion engineered by the West — colonialism by another name.

Recounting the ensuing struggle without appearing to take sides is almost impossible.  Yet one thing seems clear: in terms of military involvement, the United States attempted in the late 1940s and 1950s to keep its distance.  Over the course of the 1960s, this changed.  The U.S. became Israel’s principal patron, committed to maintaining (and indeed increasing) its military superiority over its neighbors.

In the decades that followed, the two countries forged a multifaceted “strategic relationship.”  A compliant Congress provided Israel with weapons and other assistance worth many billions of dollars, testifying to what has become an unambiguous and irrevocable U.S. commitment to the safety and well-being of the Jewish state.  The two countries share technology and intelligence.  Meanwhile, just as Israel had disregarded U.S. concerns when it came to developing nuclear weapons, it ignored persistent U.S. requests that it refrain from colonizing territory that it has conquered.

When it comes to identifying the minimal essential requirements of Israeli security and the terms that will define any Palestinian-Israeli peace deal, the United States defers to Israel.  That may qualify as an overstatement, but only slightly.  Given the Israeli perspective on those requirements and those terms — permanent military supremacy and a permanently demilitarized Palestine allowed limited sovereignty — the War For/Against/About Israel is unlikely to end anytime soon either.  Whether the United States benefits from the perpetuation of this war is difficult to say, but we are in it for the long haul.

The War for the Greater Middle East: I confess that this is the name I would choose for Washington’s unnamed war and is, in fact, the title of a course I teach.  (A tempting alternative is the Second Hundred Years War, the “first” having begun in 1337 and ended in 1453.)

This war is about to hit the century mark, its opening chapter coinciding with the onset of World War I.  Not long after the fighting on the Western Front in Europe had settled into a stalemate, the British government, looking for ways to gain the upper hand, set out to dismantle the Ottoman Empire whose rulers had foolishly thrown in their lot with the German Reich against the Allies.

By the time the war ended with Germany and the Turks on the losing side, Great Britain had already begun to draw up new boundaries, invent states, and install rulers to suit its predilections, while also issuing mutually contradictory promises to groups inhabiting these new precincts of its empire.  Toward what end?  Simply put, the British were intent on calling the shots from Egypt to India, whether by governing through intermediaries or ruling directly.  The result was a new Middle East and a total mess.

London presided over this mess, albeit with considerable difficulty, until the end of World War II.  At that point, by abandoning efforts to keep Arabs and Zionists from one another’s throats in Palestine and by accepting the partition of India, the British signaled their intention to throw in the towel.  Alas, Washington proved more than willing to assume Britain’s role.  The lure of oil was strong.  So too were the fears, however overwrought, of the Soviets extending their influence into the region.

Unfortunately, the Americans enjoyed no more success in promoting long-term, pro-Western stability than had the British.  In some respects, they only made things worse, with the joint CIA-MI6 overthrow of a democratically elected government in Iran in 1953 offering a prime example of a “success” that, to this day, has never stopped breeding disaster.

Only after 1980 did things get really interesting, however.  The Carter Doctrine promulgated that year designated the Persian Gulf a vital national security interest and opened the door to greatly increased U.S. military activity not just in the Gulf, but also throughout the Greater Middle East (GME).  Between 1945 and 1980, considerable numbers of American soldiers lost their lives fighting in Asia and elsewhere.  During that period, virtually none were killed fighting in the GME.  Since 1990, in contrast, virtually none have been killed fighting anywhere except in the GME.

What does the United States hope to achieve in its inherited and unending War for the Greater Middle East?  To pacify the region?  To remake it in our image?  To drain its stocks of petroleum?  Or just to keep the lid on?  However you define the war’s aims, things have not gone well, which once again suggests that, in some form, it will continue for some time to come.  If there’s any good news here, it’s the prospect of having ever more material for my seminar, which may soon expand into a two-semester course.

The War Against Islam: This war began nearly 1,000 years ago and continued for centuries, a storied collision between Christendom and the Muslim ummah.  For a couple of hundred years, periodic eruptions of large-scale violence occurred until the conflict finally petered out with the last crusade sometime in the fourteenth century.

In those days, many people had deemed religion something worth fighting for, a proposition to which the more sophisticated present-day inhabitants of Christendom no longer subscribe.  Yet could that religious war have resumed in our own day?  Professor Samuel Huntington thought so, although he styled the conflict a “clash of civilizations.”  Some militant radical Islamists agree with Professor Huntington, citing as evidence the unwelcome meddling of “infidels,” mostly wearing American uniforms, in various parts of the Muslim world.  Some militant evangelical Christians endorse this proposition, even if they take a more favorable view of U.S. troops occupying and drones targeting Muslim countries.

In explaining the position of the United States government, religious scholars like George W. Bush and Barack (Hussein!) Obama have consistently expressed a contrary view.  Islam is a religion of peace, they declare, part of the great Abrahamic triad.  That the other elements of that triad are likewise committed to peace is a proposition that Bush, Obama, and most Americans take for granted, evidence not required.  There should be no reason why Christians, Jews, and Muslims can’t live together in harmony.

Still, remember back in 2001 when, in an unscripted moment, President Bush described the war barely begun as a “crusade”?  That was just a slip of the tongue, right?  If not, we just might end up calling this one the Eternal War.

Andrew J. Bacevich is a professor of history and international relations at Boston University and a TomDispatch regular. His next book, Breach of Trust: How Americans Failed Their Soldiers and Their Country, will appear in September.


View this story online at: http://www.tomdispatch.com/blog/175704/

Read Full Post »

The fourth issue of the online journal Huellas de los Estados Unidos. Estudios, perspectivas y debates desde América Latina has just come out. This issue is devoted to analyzing the recent U.S. presidential election, with contributions by Tom Engelhardt (creator of the famous and valuable TomDispatch.com), the Puerto Rican scholar Raúl L. Cotto Serrano, Valeria L. Carbone, and Fabio Nigra, among others.

Rounding out the issue is a group of essays on topics as varied as slavery, race, and ideology; the Cold War and crime in Puerto Rico; and an analysis of the 1960 film The Alamo.

Our congratulations to the editors of Huellas de los Estados Unidos for another valuable contribution to the study of the United States in Latin America.

Norberto Barreto Velázquez, PhD

Lima, Peru, March 18, 2013

Read Full Post »

Stephen M. Walt

American exceptionalism remains a topic of debate in the U.S. media thanks to the attacks by Republican presidential hopefuls on Obama for his supposed rejection of American exceptionality. One of the most interesting contributions to this discussion is an article by Dr. Stephen M. Walt published in the November 2011 issue of Foreign Policy. Dr. Walt is a professor at Harvard University’s Kennedy School of Government and, together with John Mearsheimer, co-author of the controversial and important book The Israel Lobby and U.S. Foreign Policy (2007), which analyzes the influence of pro-Israel pressure groups on American foreign policy.

Titled “The Myth of American Exceptionalism,” Walt’s article critically examines the alleged exceptionality of the United States. The first thing the author does is acknowledge the historical and, above all, political weight of this idea: for more than two hundred years, American leaders and politicians have made use of the idea of exceptionalism, hence the criticism Obama receives from Republicans for his alleged abandonment of the exceptionalist creed.

This cornerstone of American national identity rests, according to Walt, on the idea that the values, history, and political system of the United States are not only unique but also universal. The author acknowledges that this idea is tied to the vision of the United States as a nation destined to play a special and positive role, captured neatly in the famous phrase of former Secretary of State Madeleine Albright, who in 1998 declared that the United States was the indispensable nation (“we are the indispensable nation”).

For Walt, the main problem with the idea of exceptionalism is that it is a myth: the international behavior of the United States has been determined not by its alleged uniqueness but by its power and by what the author calls the inherently competitive nature of international politics (Walt compares international politics to a “contact sport”). Moreover, the belief in exceptionality keeps Americans from seeing themselves as they really are: very similar to every other powerful nation in history. The predominance of this distorted self-image also prevents Americans from understanding how other countries see them, and from grasping the criticism of U.S. hypocrisy on issues such as nuclear weapons, democracy promotion, and others. All of this makes American foreign policy less effective.

As part of his analysis, Walt identifies and examines five myths of American exceptionalism:

  1. There is nothing exceptional about American exceptionalism: Contrary to what many Americans believe, their country’s behavior has not been very different from that of other world powers. According to Walt, the United States has not faced unique responsibilities that obliged it to shoulder special burdens; in other words, the United States has not been an indispensable nation, as Mrs. Albright claimed. Nor have claims of moral superiority and good intentions been exclusive to Americans, as the British “white man’s burden,” the French “mission civilisatrice,” and the Portuguese “missão civilizadora” attest. All of them, I would add, served to justify colonialism as a civilizing enterprise.
  2. Moral superiority: Those who believe in American exceptionality claim that the United States is a virtuous nation, one that promotes freedom, loves peace, and respects the law and human rights; in other words, morally superior and always guided by noble, higher purposes. Walt argues that the United States may not be the most brutal nation in history, but neither is it the moral beacon that some of its citizens imagine. To make his point he lists some of the “sins” committed by the United States: the extermination and subjugation of the original American peoples in the course of continental expansion, the thousands killed in the Philippine-American War at the start of the twentieth century, the bombings that killed thousands of Germans and Japanese during the Second World War, the more than six million tons of explosives dropped on Indochina in the 1960s and 1970s, the more than 30,000 Nicaraguans killed in the 1980s in the campaign against the Sandinistas, and the thousands of deaths caused by the invasion of Iraq. To this list the author adds the refusal to sign human rights treaties, the rejection of the International Court of Justice, the support of human-rights-violating dictatorships in defense of geopolitical interests, Abu Ghraib, waterboarding, and extraordinary rendition.
  3. The special genius of Americans: Believers in exceptionality have explained American development and power as confirmation of the superiority and uniqueness of the United States; on this view, the country’s success is owed to the special genius of its people. For Walt, American power has been the product of luck rather than moral superiority or genius: the luck of possessing a vast territory with abundant natural resources; the luck of being located far from the troubles and wars of the European powers; the luck that those powers were at odds with one another and did not check the continental expansion of the United States; the luck that two world wars devastated its competitors.
  4. The United States as the source of “most of the good in the world”: Defenders of exceptionality see the United States as a positive force in the world. According to Walt, it is true that the United States has contributed to world peace and stability through actions such as the Marshall Plan, the Bretton Woods agreements, and its rhetoric in favor of human rights and democracy. But it is wrong to assume that American actions are good by default. The author argues that Americans need to acknowledge the role other countries played in ending the Cold War and in advancing civil rights, criminal justice, economic justice, and so on. They must also recognize their “weak spots,” such as their country’s role as a leading emitter of greenhouse gases, the U.S. government’s support for the racist regime in South Africa, its unconditional support for Israel, and more.
  5. “God is on our side”: A crucial element of American exceptionalism is the idea that the United States is a people chosen by God, with a divine plan to follow. For the author, believing that one holds a divine mandate is very dangerous, because it leads to a belief in one’s own infallibility and to the risk of falling prey to incompetent or unscrupulous rulers, as happened to Napoleonic France and imperial Japan. Moreover, an examination of American history over the last decade makes the country’s weaknesses and failures plain: an “ill-advised tax cut,” two disastrous wars, and a financial crisis born of corruption and greed. For Walt, Americans should ask themselves, following Lincoln, whether their nation is on God’s side, not whether God is on theirs.

Walt concludes by noting that, given the problems the United States faces, it is not surprising that the patriotism of exceptionalism is invoked so fervently. Such patriotism may have its benefits, but it leads to a mistaken understanding of the international role the United States plays and to bad decision-making. In Walt’s words,

  Ironically, U.S. foreign policy would probably be more effective if Americans were less convinced of their own unique virtues and less eager to proclaim them. What we need, in short, is a more realistic and critical assessment of America’s true character and contributions.

This analysis of the elements that make up the discourse of American exceptionalism is a brave and honest effort that deserves all my sympathy and respect. In a society as ideologized as the American one, where the levels of ignorance and recklessness are so high, voices like Stephen M. Walt’s are indispensable. Americans unquestionably need to shed the ideological blinders that keep them from seeing themselves as they are rather than as they imagine themselves to be. The whole world would benefit from such a process.

Norberto Barreto Velázquez, PhD

Lima, December 14, 2011

Read Full Post »

In mid-April 1961, a group of Cuban exiles armed and trained by the United States government landed in Cuba with the intention of overthrowing the Cuban revolutionary government. The famous Bay of Pigs invasion is, without question, one of the great fiascoes of the Cold War, and much has been written and said about this singular event. Now, fifty years later, we have a new account of very special value, because it was written by the historians of the very agency that planned and executed the invasion, the Central Intelligence Agency (CIA).

The Freedom of Information Act is a U.S. law that entitles American citizens to request from the U.S. government the full or partial disclosure of documents and information in its possession. Invoking this law, the National Security Archive, a research institute based at George Washington University, obtained the public release of the CIA’s Official History of the Bay of Pigs Operation.

(Image: Invaders captured by Cuban revolutionary forces)

This history, written by Jack Pfeiffer, the CIA’s official historian, reveals that the invasion was an even greater failure than we had known. Among other things, it makes clear that the invaders, unable to distinguish between the aircraft of the Cuban air force and those of the CIA, fired on their own air support. Another interesting detail is that funds earmarked for the invasion were used to hire professional hitmen to assassinate Fidel Castro. The direct participation of Americans in the invasion is another significant point: according to this history, four American pilots died at the Bay of Pigs, but it was not until 1976 that they received posthumous medals for their service.

It seems to me beyond dispute that this official version will enrich the analysis of a defining moment of the Cold War.

Norberto Barreto Velázquez, PhD

Lima, Peru, August 15, 2011

Read Full Post »

Thanks to the good offices of my friend Cristina Hinojosa of the library of the Pontificia Universidad Católica del Perú, I have been able to examine a book that struck me as not only very interesting but also of great pedagogical potential: Huellas imperiales: Historia de los Estados Unidos de América: De la crisis de acumulación a la globalización capitalista, 1929-2000 (Buenos Aires: Ediciones Imago Mundi, 2003). Compiled by Drs. Pablo Pozzi and Fabio G. Nigra, professors of the Chair of United States History at the Universidad de Buenos Aires, Huellas imperiales is a collection of essays analyzing American history from the crisis of the Great Depression to the dawn of the twenty-first century.

(Image: Pablo Pozzi)

Divided into a general introductory section and four chronological groups, the twenty-eight essays that make up this book cover an interesting variety of topics. The first chronological group spans the years 1929 to 1945 and consists of essays analyzing the rise of a consumer culture, the crisis of 1929, and other topics related to the New Deal; curiously, not a single piece addresses the Second World War. The second chronological group is defined by the first decades of the Cold War (1945-1961), so it is no surprise that most of the essays in this section deal in some way with the Soviet-American conflict. Also noteworthy are two pieces on civil rights (one by Analía Martí and another by María Graciela Abarca) and an essay by Claudio González Chiaramonte analyzing the development of U.S. foreign policy in the twentieth century.

The third section of the book focuses on American historical development in the 1960s and 1970s; an essay by Howard Zinn on civil disobedience stands out here. The last part of Huellas imperiales is devoted to the final twenty-five years of the twentieth century, with a notable piece by Márgara Averbach analyzing the content of the animated films produced by Disney in the 1990s.

(Image: Fabio Nigra)

This book has been a great discovery for me, since one of my great frustrations in teaching United States history to Spanish-speaking students has been the difficulty of finding quality readings in Spanish on the important topics. Professors Pozzi and Nigra have made a major contribution to the teaching of American history by compiling a significant number of essays, written in or translated into Spanish, that span a wide thematic field and open up a world of possibilities for analyzing and discussing the historical development of the United States of America. My thanks go to them.

Norberto Barreto Velázquez, PhD
Lima, July 25, 2009

Read Full Post »
