Fair Use Blog

Archive for the ‘Quotes’ Category

Over My Shoulder #15: Robert Whitaker (2002), Mad in America

You know the rules; here’s the quote. This is again delayed (this time, by the belated Tyrannicide Day celebration of going to see V for Vendetta on opening night; in case you’re wondering, it’s very good, but you should read the comic book, too, or you’ll miss out on a lot of good stuff). This week’s reading is from the bus on the way to work: a long passage from the first chapter of Robert Whitaker’s Mad in America: Bad Science, Bad Medicine, and the Enduring Mistreatment of the Mentally Ill (2002). Whitaker is explaining the historical backdrop of Benjamin Rush’s European medical training:

One of the first English physicians to write extensively on madness, its nature, and the proper treatment for it was Thomas Willis. He was highly admired for his investigations into the nervous system, and his 1684 text on insanity set the tone for the many medical guides that would be written over the next 100 years by English mad-doctors. The book’s title neatly summed up his views of the mad: The Practice of Physick: Two Discourses Concerning the Soul of Brutes. His belief—that the insane were animal-like in kind—reflected prevailing conceptions about the nature of man. The great English scientists and philosophers of the seventeenth century—Francis Bacon, Isaac Newton, John Locke, and others—had all argued that reason was the faculty that elevated humankind above the animals. This was the form of intelligence that enabled man to scientifically know his world, and to create a civilized society. Thus the insane, by virtue of having lost their reason, were seen as having descended to a brutish state. They were, Willis explained, fierce creatures who enjoyed superhuman strength. They can break cords and chains, break down doors or walls … they are almost never tired … they bear cold, heat, watching, fasting, strokes, and wounds, without any sensible hurt. The mad, he added, if they were to be cured, needed to hold their physicians in awe and think of them as their tormentors.

Discipline, threats, fetters, and blows are needed as much as medical treatment … Truly nothing is more necessary and more effective for the recovery of these people than forcing them to respect and fear intimidation. By this method, the mind, held back by restraint is induced to give up its arrogance and wild ideas and it soon becomes meek and orderly. This is why maniacs often recover much sooner if they are treated with tortures and torments in a hovel instead of with medicaments.

A medical paradigm for treating the mad had been born, and eighteenth-century English medical texts regularly repeated this basic wisdom. In 1751, Richard Mead explained that the madman was a brute who could be expected to attack his fellow creatures with fury like a wild beast and thus needed to be tied down and even beat, to prevent his doing mischief to himself or others. Thomas Bakewell told of how a maniac bellowed like a wild beast, and shook his chain almost constantly for several days and nights … I therefore got up, took a hand whip, and gave him a few smart stripes upon the shoulders… He disturbed me no more. Physician Charles Bell, in his book Essays on the Anatomy of Expression in Painting, advised artists wishing to depict madmen to learn the character of the human countenance when devoid of expression, and reduced to the state of lower animals.

Like all wild animals, lunatics needed to be dominated and broken. The primary treatments advocated by English physicians were those that physically weakened the mad—bleeding to the point of fainting and the regular use of powerful purges, emetics, and nausea-inducing agents. All of this could quickly reduce even the strongest maniac to a pitiful, whimpering state. William Cullen, reviewing bleeding practices, noted that some advised cutting into the jugular vein. Purges and emetics, which would make the mad patient violently sick, were to be repeatedly administered over an extended period. John Monro, superintendent of Bethlehem Asylum, gave one of his patients sixty-one vomit-inducing emetics in six months, including strong doses on eighteen successive nights. Mercury and other chemical agents, meanwhile, were used to induce nausea so fierce that the patient could not hope to have the mental strength to rant and rave. While nausea lasts, George Man Burrows advised, hallucinations of long adherence will be suspended, and sometimes be perfectly removed, or perhaps exchanged for others, and the most furious will become tranquil and obedient. It was, he added, far safer to reduce the patient by nauseating him than by depleting him.

A near-starvation diet was another recommendation for robbing the madman of his strength. The various depleting remedies—bleedings, purgings, emetics, and nausea-inducing agents—were also said to be therapeutic because they inflicted considerable pain, and thus the madman’s mind became focused on this sensation rather than on his usual raving thoughts. Blistering was another treatment useful for stirring great bodily pain. Mustard powders could be rubbed on a shaved scalp, and once the blisters formed, a caustic rubbed into the blisters to further irritate and infect the scalp. The suffering that attends the formation of these pustules is often indescribable, wrote one physician. The madman’s pain could be expected to increase as he rubbed his hands in the caustic and touched his genitals, a pain that would enable the patient to regain consciousness of his true self, to wake from his supersensual slumber and to stay awake.

All of these physically depleting, painful therapies also had a psychological value: They were feared by the lunatics, and thus the mere threat of their employment could get the lunatics to behave in a better manner. Together with liberal use of restraints and an occasional beating, the mad would learn to cower before their doctors and attendants. In most cases it has appeared to be necessary to employ a very constant impression of fear; and therefore to inspire them with the awe and dread of some particular persons, especially of those who are to be constantly near them, Cullen wrote. This awe and dread is therefore, by one means or other, to be acquired; in the first place by their being the authors of all the restraints that may be occasionally proper; but sometimes it may be necessary to acquire it even by stripes and blows. The former, although having the appearance of more severity, are much safer than strokes or blows about the head.

Such were the writings of English mad-doctors in the 1700s. The mad were to be tamed. But were such treatments really curative? In the beginning, the mad-doctors were hesitant to make that claim. But gradually they began to change their tune, and they did so for a simple reason: It gave them a leg up in the profitable madhouse business.

In eighteenth-century England, the London asylum Bethlehem was almost entirely a place for the poor insane. The well-to-do in London shipped their family lunatics to private madhouses, a trade that had begun to emerge in the first part of the century. These boarding houses also served as convenient dumping grounds for relatives who were simply annoying or unwanted. Men could get free from their wives in this manner—had not their noisome, bothersome spouses gone quite daft in the head? A physician who would attest to this fact could earn a nice sum—a fee for the consultation and a referral fee from the madhouse owner. Doctors who owned madhouses made out particularly well. William Battie, who operated madhouses in Islington and Clerkenwell, left an estate valued at between £100,000 and £200,000, a fabulous sum for the time, which was derived largely from this trade.

Even though most of the mad and not-so-mad committed to the private madhouses came from better families, they could still expect neglect and the harsh flicker of the whip. As reformer Daniel Defoe protested in 1728, Is it not enough to make any one mad to be suddenly clap’d up, stripp’d, whipp’d, ill fed, and worse us’d? In the face of such public criticism, the madhouse operators protested that their methods, while seemingly harsh, were remedies that could restore the mad to their senses. They weren’t just methods for managing lunatics, but curative medical treatments. In 1758, Battie wrote: Madness is, contrary to the opinion of some unthinking persons, as manageable as many other distempers, which are equally dreadful and obstinate. He devoted a full three chapters to cures.

In 1774, the English mad trade got a boost with the passage of the Act for Regulating Madhouses, Licensings, and Inspection. The new law prevented the commitment of a person to a madhouse unless a physician had certified the person as insane (which is the origin of the term certifiably insane). Physicians were now the sole arbiters of insanity, a legal authority that made the mad-doctoring trade more profitable than ever. Then, in 1788, King George III suffered a bout of madness, and his recovery provided the mad-doctors with public proof of their curative ways.

Francis Willis, the prominent London physician called upon by the queen to treat King George, was bold in proclaiming his powers. He boasted to the English Parliament that he could reliably cure nine out of ten mad patients and that he rarely missed curing any [patients] that I had so early under my care: I mean radically cured. On December 5, 1788, he arrived at the king’s residence in Kew with an assistant, three keepers, a straight waistcoat, and the belief that a madman needed to be broken like a horse in a manège. King George III was so appalled by the sight of the keepers and the straight waistcoat that he flew into a rage—a reaction that caused Willis to immediately put him into the confining garment.

As was his custom, Willis quickly strove to assert his dominance over his patient. When the king resisted or protested in any way, Willis had him clapped into the straight-waistcoat, often with a band across his chest, and his legs tied to the bed. Blisters were raised on the king’s legs and quickly became infected, the king pleading that the pustules burned and tortured him—a complaint that earned him yet another turn in the straight waistcoat. Soon his legs were so painful and sore that he couldn’t walk, his mind now wondering how a king lay in this damned confined condition. He was repeatedly bled, with leeches placed on his temples, and sedated with opium pills. Willis also surreptitiously laced his food with emetics, which made the king so violently sick that, on one occasion, he knelt on his chair and prayed that God would be pleased either to restore Him to his Senses, or permit that He might die directly.

In the first month of 1789, the battle between the patient and doctor became ever more fierce. King George III—bled, purged, blistered, restrained, and sedated, his food secretly sprinkled with a tartar emetic to make him sick—sought to escape, offering a bribe to his keepers. He would give them annuities for life if they would just free him from the mad-doctor. Willis responded by bringing in a new piece of medical equipment—a restraint chair that bound him more tightly than the straight waistcoat—and by replacing his pages with strangers. The king would no longer be allowed the sight of familiar faces, which he took as evidence that Willis’s men meant to murder him.

In late February, the king made an apparently miraculous recovery. His agitation and delusions abated, and he soon resumed his royal duties. Historians today believe that King George III, rather than being mad, suffered from a rare genetic disorder, called porphyria, which can lead to high levels of toxic substances in the body that cause temporary delirium. He might have recovered more quickly, they believe, if Willis’s medical treatments had not so weakened him and aggravated the underlying condition. But in 1789, the return of the king’s sanity was, for the mad-doctors, a medical triumph of the most visible sort.

In the wake of the king’s recovery, a number of English physicians raced to exploit the commercial opportunity at hand by publishing their novel methods for curing insanity. Their marketing message was often as neat as a twentieth-century sound bite: Insanity proved curable. One operator of a madhouse in Chelsea, Benjamin Faulkner, even offered a money-back guarantee: Unless patients were cured within six months, all board, lodging, and medical treatments would be provided free of all expence whatever. The mad trade in England flourished. The number of private madhouses in the London area increased from twenty-two in 1788 to double that number by 1820, growth so stunning that many began to worry that insanity was a malady particularly common to the English.

In this era of medical optimism, English physicians—and their counterparts in other European countries—developed an ever more innovative array of therapeutics. Dunking the patient in water became quite popular—a therapy intended both to cool the patient’s scalp and to provoke terror. Physicians advised pouring buckets of water on the patient from a great height or placing the patient under a waterfall; they also devised machines and pumps that could pummel the patient with a torrent of water. The painful blasts of water were effective as a remedy and a punishment, one that made patients complain of pain as if the lateral lobes of the cerebrum were split asunder. The Bath of Surprise became a staple of many asylums: The lunatic, often while being led blindfolded across a room, would suddenly be dropped through a trapdoor into a tub of cold water—the unexpected plunge hopefully inducing such terror that the patient’s senses might be dramatically restored. Cullen found this approach particularly valuable:

Maniacs have often been relieved, and sometimes entirely cured, by the use of cold bathing, especially when administered in a certain manner. This seems to consist, in throwing the madman in the cold water by surprise; by detaining him in it for some length of time; and pouring water frequently upon the head, while the whole of the body except the head is immersed in the water; and thus managing the whole process, so as that, with the assistance of some fear, a refrigerant effect may be produced. This, I can affirm, has been often useful.

The most extreme form of water therapy involved temporarily drowning the patient. This practice had its roots in a recommendation made by the renowned clinician of Leyden, Hermann Boerhaave. The greatest remedy for [mania] is to throw the Patient unwarily into the Sea, and to keep him under Water as long as he can possibly bear without being quite stifled. Burrows, reviewing this practice in 1828, said it was designed to create the effect of asphyxia, or suspension of vital as well as of all intellectual operations, so far as safety would permit. Boerhaave’s advice led mad-doctors to concoct various methods for simulating drowning, such as placing the patient into a box drilled with holes and then submerging it underwater. Joseph Guislain built an elaborate mechanism for drowning the patient, which he called The Chinese Temple. The maniac would be locked into an iron cage that would be mechanically lowered, much in the manner of an elevator car, into a pond. To expose the madman to the action of this device, Guislain explained, he is led into the interior of this cage: one servant shuts the door from the outside while the other releases a brake which, by this maneuver, causes the patient to sink down, shut up in the cage, under the water. Having produced the desired effect, one raises the machine again.

The most common mechanical device to be employed in European asylums during this period was a swinging chair. Invented by Englishman Joseph Mason Cox, the chair could, in one fell swoop, physically weaken the patient, inflict great pain, and invoke terror—all effects perceived as therapeutic for the mad. The chair, hung from a wooden frame, would be rotated rapidly by an operator to induce in the patient fatigue, exhaustion, pallor, horripilatio [goose bumps], vertigo, etc, thereby producing new associations and trains of thoughts. In the hands of a skilled operator, able to rapidly alter the directional motion of the swing, it could reliably produce nausea, vomiting, and violent convulsions. Patients would also involuntarily urinate and defecate, and plead for the machine to be stopped. The treatment was so powerful, said one nineteenth-century physician, that if the swing didn’t make a mad person obedient, nothing would.

Once Cox’s swing had been introduced, asylum doctors tried many variations on the theme—spinning beds, spinning stools, and spinning boards were all introduced. In this spirit of innovation and medical advance, one inventor built a swing that could twirl four patients at once, at revolutions up to 100 per minute. Cox’s swing and other twirling devices, however, were eventually banned by several European governments, the protective laws spurred by a public repulsed by the apparent cruelty of such therapeutics. This governmental intrusion into medical affairs caused Burrows, a madhouse owner who claimed that he cured 91 percent of his patients, to complain that an ignorant public would instruct us that patient endurance and kindliness of heart are the only effectual remedies for insanity!

Even the more mainstream treatments—the Bath of Surprise, the swinging chair, the painful blistering—might have given a compassionate physician like Rush pause. But mad-doctors were advised not to let their sentiments keep them from doing their duty. It was the highest form of cruelty, one eighteenth-century physician advised, not to be bold in the Administration of Medicine. Even those who urged that the insane, in general, should be treated with kindness, saw a need for such heroic treatments to knock down mania. Certain cases of mania seem to require a boldness of practice, which a young physician of sensibility may feel a reluctance to adopt, wrote Thomas Percival, setting forth ethical guidelines for physicians. On such occasions he must not yield to timidity, but fortify his mind by the councils of his more experienced brethren of the faculty.

—Robert Whitaker (2002), Mad in America, pp. 6–13.

This book is one of the only things I’ve read that ever made me cry.

Further reading

Over My Shoulder #14: Robin Morgan (1981), Blood Types: An Anatomy of Kin

You know the rules; here’s the quote. This one has been delayed from Friday to Saturday by the government attacks on women at an International Women’s Day commemoration in Tehran. So in commemoration of those women, and of what they put their bodies on the line for, here’s something on the theme of feminist internationalism, women, and governments. This is bus reading, collected in Robin Morgan’s The Word of a Woman: Feminist Dispatches 1968–1992 (ISBN 0-393-03427-5): specifically, Blood Types: An Anatomy of Kin, a meditative discussion on family, identity, sex, and race, written in 1981.

Mary Daly’s turn-the-concept-inside-out phrase, The Sisterhood of Man seems not only a hope but a dynamic actuality—since it’s grounded not in abstract notions of cooperation but in survival need, not in static posture but in active gesture, not in vague sentiments of similarity but in concrete experience shared to an astonishing degree, despite cultural, historical, linguistic, and other barriers. Labor contractions feel the same everywhere. So does rape and battery. I don’t necessarily always agree with many feminists that women have access to some mysteriously inherent biological nexus, but I do believe that Elizabeth Cady Stanton was onto something when she signed letters, Thine in the bonds of oppressed womanhood (italics mine). Let us hope—and act to ensure—that as women break those bonds of oppression, the process of freeing the majority of humanity will so transform human consciousness that women will not use our freedom to be isolatedly individuated as men have done. In the meanwhile, the bonds do exist; let’s use them creatively.

Not that the mechanistic universe inhabited by the family of Man takes notice of this quarky interrelationship between the hardly visible subparticles that merely serve to keep Man and his [sic] family alive. No, such particles are unimportant, fantastical, charming perhaps (as quarks or the fair sex tend to be). But they are to be taken no more seriously than fairytales.

Yet if Hans Christian Andersen characters so diverse as the Little Mermaid, the Robber Girl, the Snow Queen, and the Little Match Girl had convened a meeting to discuss ways of bettering their condition, one could imagine that the world press would cover that as a big story. When something even more extraordinary, because more real, happened in Andersen’s own city for three weeks during July 1980, it barely made the news.

Approximately ten thousand women from all over the planet began arriving in Copenhagen, Denmark, even before the formal opening on July 14 of the United Nations Mid-Decade World Conference for women. The conference was to become a great, sprawling, rollicking, sometimes quarrelsome, highly emotional, unashamedly idealistic, unabashedly pragmatic, visionary family reunion. In 1975, the U.N. had voted to pay some attention to the female more-than-half of the human population for one year—International Women’s Year—but extended the time to a decade after the indignant outcry of women who had been living, literally, in the International Men’s Year for approximately ten millennia of patriarchy. Still, here we were, in the middle of our decade, in Copenhagen. We came in saris and caftans, in blue jeans and chadors, in African geles, pants-suits, and dresses. We were women with different priorities, ideologies, political analyses, cultural backgrounds, and styles of communication. The few reports that made it into the U.S. press emphasized those differences, thereby overlooking the big story—that these women forged new and strong connections.

There were two overlapping meetings in Copenhagen. One was the official U.N. conference—which many feminists accurately had prophesied would be more a meeting of governments than of women. Its delegates were chosen by governments of U.N. member states to psittaceously repeat national priorities—as defined by men.

The official conference reflected the government orientation: many delegations were headed by men and many more were led by safe women whose governments were certain wouldn’t make waves. This is not to say that there weren’t some real feminists tucked away even in the formal delegations, trying gallantly to influence their respective bureaucracies towards more human concern with actions that really could better women’s lives. But the talents of these sisters within were frequently ignored or abused by their own delegations for political reasons.

A case in point was the U.S. delegation, which availed itself greedily of all the brilliant and unique expertise of Koryne Horbal (then U.S. representative to the U.N. Commission on the Status of Women), and of all the groundwork she had done on the conference for the preceding two years—including being the architect of CEDAW, the Convention to Eliminate All Forms of Discrimination Against Women—but denied her press visibility and most simple courtesies because she had been critical of the Carter administration and its official policies on women. But Horbal wasn’t the only feminist within. There were New Zealand’s member of Parliament, the dynamic twenty-eight-year-old Marilyn Waring, and good-humored Maria de Lourdes Pintasilgo, former prime minister of Portugal, and clever Elizabeth Reid of Australia—all of them feminists skilled in the labyrinthian ways of national and international politics, but with priority commitment to populist means of working for women—who still managed to be effective inside and outside the structures of their governments.

The other conference, semiofficially under U.N. aegis, was the NGO (Non-Governmental Organization) Forum. It was to the Forum that ordinary folks came, having raised the travel fare via their local women’s organizations, feminist alternative media, or women’s religious, health, and community groups. Panels, workshops, kaffeeklatsches, cultural events, and informal sessions abounded.

Statements emerged and petitions were eagerly signed: supporting the prostitutes in São Paulo, Brazil, who that very week, in an attempt to organize for their human rights, were being jailed, tortured, and, in one case, accidentally executed; supporting Arab and African women organizing against the practice of female genital mutilation; supporting U.S. women recently stunned by the 1980 Supreme Court decision permitting federal and state denial of funds for medical aid to poor women who need safe, legal abortions—thus denying the basic human right of reproductive freedom; supporting South African women trying to keep families together under the maniacal system of apartheid; supporting newly exiled feminist writers and activists from the U.S.S.R.; supporting women refugees from Afghanistan, Campuchea [Cambodia], Palestine, Cuba, and elsewhere.

Protocol aside, the excitement among women at both conference sites was electric. If, for instance, you came from Senegal with a specific concern about rural development, you would focus on workshops about that, and exchange experiences and how-to’s with women from Peru, India—and Montana. After one health panel, a Chinese gynecologist continued talking animatedly with her scientific colleague from the Soviet Union—Sino-Soviet saber-rattling forgotten or transcended.

Comparisons developed in workshops on banking and credit between European and U.S. economists and the influential market women of Africa. The list of planned meetings about Women’s Studies ran to three pages, yet additional workshops on the subject were created spontaneously. Meanwhile, at the International Women’s Art Festival, there was a sharing of films, plays, poetry readings, concerts, mime shows, exhibits of painting and sculpture and batik and weaving, the interchanging of art techniques and of survival techniques. Exchange subscriptions were pledged between feminist magazines in New Delhi and Boston and Tokyo, Maryland and Sri Lanka and Australia. And everywhere the conversations and laughter of recognition and newfound friendships spilled over into the sidewalks of Copenhagen, often until dawn.

We ate, snacked, munched—and traded diets—like neighbor women, or family. A well-equipped Argentinian supplied a shy Korean with a tampon in an emergency. A Canadian went into labor a week earlier than she’d expected, and kept laughing hilariously between the contractions, as she was barraged with loving advice on how to breathe, where to rub, how to sit (or stand or squat), and even what to sing—in a chorus of five languages, while waiting for the prompt Danish ambulance. North American women from diverse ethnic ancestries talked intimately with women who still lived in the cities, towns, and villages from which their own grandmothers had emigrated to the New World. We slept little, stopped caring about washing our hair, sat on the floor, and felt at home with one another.

Certainly, there were problems. Simultaneous translation facilities, present everywhere at the official conference, were rarely available at the grass-roots forum. This exacerbated certain sore spots, like the much-ballyhooed Palestinian-Israeli conflict, since many Arab women present spoke Arabic or French but not English—the dominant language at the forum. That conflict—played out by male leadership at both the official conference and the forum, using women as pawns in the game—was disheartening, but not as bad as many of us had feared.

The widely reported walkout of Arab women during Madam Jihan Sadat’s speech at the conference was actually a group of perhaps twenty women tiptoeing quietly to the exit. This took place in a huge room packed with delegates who—during all the speeches—were sitting, standing, and walking about to lobby loudly as if on the floor of the U.S. Congress (no one actually listens to the speeches; they’re for the record).

Meanwhile, back at the forum, there was our own invaluable former U.S. congresswoman Bella Abzug (officially unrecognized by the Carter-appointed delegation but recognized and greeted with love by women from all over the world). Bella, working on coalition building, was shuttling between Israelis and Arabs. At that time, Iran was still holding the fifty-two U.S. hostages, but Bella accomplished the major miracle of getting a pledge from the Iranian women that if U.S. mothers would demonstrate in Washington for the shah’s ill-gotten millions to be returned to the Iranian people (for the fight against women’s illiteracy and children’s malnutrition), then the Iranian women would march simultaneously in Teheran for the hostages to be returned home to their mothers. Bella’s sensitivity and cheerful, persistent nudging on this issue caused one Iranian woman to throw up her hands, shrug, and laugh to me, What is with this Bella honey person? She’s wonderful. She’s impossible. She’s just like my mother.

The conference, the forum, and the arts festival finally came to an end. Most of the official resolutions were predictably bland by the time they were presented, much less voted on. Most of the governments will act on them sparingly, if at all. Consequently, those women who went naively trusting that the formal U.N. procedures would be drastically altered by such a conference were bitterly disappointed. But those of us who went with no such illusions, and who put not our trust in patriarchs, were elated. Because what did not end at the closing sessions is that incredible networking—the echoes of all those conversations, the exchanged addresses—and what that will continue to accomplish.

—Robin Morgan (1981): Blood Types: An Anatomy of Kin, reprinted in The Word of a Woman: Feminist Dispatches 1968–1992, pp. 115–120.

Over My Shoulder #13: Jill Lepore’s New York Burning: Liberty, Slavery, and Conspiracy in Eighteenth-Century Manhattan

You know the rules; here’s the quote. Lucky #13 was either airplane reading or bus reading; I don’t recall precisely what I was reading when. In either case, though, it’s from the Preface to Jill Lepore’s new book, New York Burning: Liberty, Slavery, and Conspiracy in Eighteenth-Century Manhattan. It’s the story of something that many of us know about, and some other things that almost all of us have forgotten, but need to remember. Thus:

This book tells the story of how one kind of slavery made another kind of liberty possible in eighteenth-century New York, a place whose past has long been buried. It was a beautiful city, a crisscross of crooked cobblestone streets boasting both grand and petty charms: a grassy park at the Bowling Green, the stone arches at City Hall, beech trees shading Broadway like so many parasols, and, off rocky beaches, the best oysters anywhere. I found it extremely pleasant to walk the town, one visitor wrote in 1748, for it seemed like a garden. But on this granite island poking out like a sharp tooth between the Hudson and East rivers, one in five inhabitants was enslaved, making Manhattan second only to Charleston, South Carolina, in a wretched calculus of urban unfreedom.

New York was a slave city. Its most infamous episode is hardly known today: over a few short weeks in 1741, ten fires blazed across the city. Nearly two hundred slaves were suspected of conspiring to burn every building and murder every white. Tried and convicted before the colony’s Supreme Court, thirteen black men were burned at the stake. Seventeen more were hanged, two of their dead bodies chained to posts not far from the Negroes Burial Ground, left to bloat and rot. One jailed man cut his own throat. Another eighty-four men and women were sold into yet more miserable, bone-crushing slavery in the Caribbean. Two white men and two white women, the alleged ringleaders, were hanged, one of them in chains; seven more white men were pardoned on the condition that they never set foot in New York again.

What happened in New York in 1741 is so horrifying—Bonfires of the Negroes, one colonist called it—that it’s easy to be blinded by the brightness of the flames. But step back, let the fires flicker in the distance, and they cast their light not only on the 1741 slave conspiracy but on the American paradox, illuminating a far better known episode in New York’s past: the 1735 trial of the printer John Peter Zenger.

In 1732, a forty-two-year-old English gentleman named William Cosby arrived in New York, having been appointed governor by the king. New Yorkers soon learned, to their dismay, that their new governor ruled by a three-word philosophy: God damn ye. Rage at Cosby’s ill-considered appointment grew with his every abuse of the governorship. Determined to oust Cosby from power, James Alexander, a prominent lawyer, hired Zenger, a German immigrant, to publish an opposition newspaper. Alexander supplied scathing, unsigned editorials criticizing the governor’s administration; Zenger set the type. The first issue of Zenger’s New-York Weekly Journal was printed in November 1733. Cosby could not, would not abide it. He assigned Daniel Horsmanden, an ambitious forty-year-old Englishman new to the city, to a committee charged with pointing out the particular Seditious paragraphs in Zenger’s newspaper. The governor then ordered the incendiary issues of Zenger’s newspaper burned, and had Zenger arrested for libel.

Zenger was tried before the province’s Supreme Court in 1735. His attorney did not deny that Cosby was the object of the editorials in the New-York Weekly Journal. Instead, he argued, first, that Zenger was innocent because what he printed was true, and second, that freedom of the press was especially necessary in the colonies, where other checks against governors’ powers were weakened by their distance from England. It was an almost impossibly brilliant defense, which at once defied legal precedent—before the Zenger case, truth had never been a defense against libel—and had the effect of putting the governor on trial, just what Zenger’s attorney wanted, since William Cosby, God damn him, was a man no jury could love. Zenger was acquitted. The next year, James Alexander prepared and Zenger printed A Brief Narrative of the Case and Trial of John Peter Zenger, which was soon after reprinted in Boston and London. It made Zenger famous.

But the trial of John Peter Zenger is merely the best-known episode in the political maelstrom that was early eighteenth-century New York. We are in the midst of Party flames, Daniel Horsmanden wryly observed in 1734, as Cosby’s high-handedness ignited the city. Horsmanden wrote in an age when political parties were considered sinister, invidious, and destructive of good government. As Alexander Pope put it in 1727, Party is the madness of many, for the gain of a few. Or, as Viscount St. John Bolingbroke remarked in his 1733 Dissertation upon Parties: The spirit of party … inspires animosity and breeds rancour. Nor did the distaste for parties diminish over the course of the century. In 1789, Thomas Jefferson wrote: If I could not go to heaven but with a party, I would not go there at all.

Parties they may have despised, but, with William Cosby in the governor’s office, New Yorkers formed them, dividing themselves between the opposition Country Party and the Court Party, loyal to the governor. Even Cosby’s death in March 1736 failed to extinguish New York’s Party flames. Alexander and his allies challenged the authority of Cosby’s successor, George Clarke, and established a rival government. Warned of a plot to seize his person or kill him in the Attempt, Clarke retreated to Fort George, at the southern tip of Manhattan, & put the place in a posture of Defence. In the eyes of one New Yorker, we had all the appearance of a civil War.

And then: nothing. No shots were fired. Nor was any peace ever brokered: the crisis did not so much resolve as it dissipated. Soon after barricading himself in Fort George, Clarke received orders from London confirming his appointment. The rival government was disbanded. By the end of 1736, Daniel Horsmanden could boast, Zenger is perfectly silent as to polliticks. Meanwhile, Clarke rewarded party loyalists: in 1737 he appointed Horsmanden to a vacant seat on the Supreme Court. But Clarke proved a more moderate man than his predecessor. By 1739, under his stewardship, the colony quieted.

What happened in New York City in the 1730s was much more than a dispute over the freedom of the press. It was a dispute about the nature of political opposition, during which New Yorkers briefly entertained the heretical idea that parties were not only necessary in free Government, but of great Service to the Public. As even a supporter of Cosby wrote in 1734, Parties are a check upon one another, and by keeping the Ambition of one another within Bounds, serve to maintain the public Liberty. And it was, equally, a debate about the power of governors, the nature of empire, and the role of the law in defending Americans against arbitrary authority—the kind of authority that constituted tyranny, the kind of authority that made men slaves. James Alexander saw himself as a defender of the rule of law in a world that, because of its very great distance from England, had come to be ruled by men. His opposition was not so much a failure as a particularly spectacular stretch of road along a bumpy, crooked path full of detours that, over the course of the century, led to American independence. Because of it, New York became infamous for its unruly spirit of independency. Clarke, shocked, reported to his superiors in England that New Yorkers believe if a Governor misbehave himself they may depose him and set up an other. The leaders of the Country Party trod very near to what, in the 1730s, went by the name of treason. A generation later, their sons would call it revolution.

In early 1741, less than two years after Clarke calmed the province, ten fires swept through the city. Fort George was nearly destroyed; Clarke’s own mansion, inside the fort, burned to the ground. Daniel Horsmanden was convinced that the fires had been set on Foot by some villainous Confederacy of latent Enemies amongst us, a confederacy that sounded a good deal like a violent political party. But which enemies? No longer fearful that Country Party agitators were attempting to take his life, Clarke, at Horsmanden’s urging, turned his suspicion on the city’s slaves. With each new fire, panicked white New Yorkers cried from street corners, The Negroes are rising! Early evidence collected by a grand jury appointed by the Supreme Court hinted at a vast and elaborate conspiracy: on the outskirts of the city, in a tavern owned by a poor and obscure English cobbler named John Hughson, tens and possibly hundreds of black men had been meeting secretly, gathering weapons and plotting to burn the city, murder every white man, appoint Hughson their king, and elect a slave named Caesar governor.

This political opposition was far more dangerous than anything led by James Alexander. The slave plot to depose one governor and set up another—a black governor—involved not newspapers and petitions but arson and murder. It had to be stopped. In the spring and summer of 1741, New York magistrates arrested 20 whites and 152 blacks. To Horsmanden, it seemed very probable that most of the Negroes in Town were corrupted. Eighty black men and one black woman confessed and named names, sending still more to the gallows and the stake.

That summer, a New Englander wrote an anonymous letter to New York. I am a stranger to you & to New York, he began. But he had heard of the bloody Tragedy afflicting the city: the relentless cycle of arrests, accusations, hasty trials, executions, and more arrests. This puts me in mind of our New England Witchcraft in the year 1692, he remarked, Which if I dont mistake New York justly reproached us for, & mockt at our Credulity about.

Here was no idle observation. The 1741 New York conspiracy trials and the 1692 Salem witchcraft trials had much in common. Except that what happened in New York in 1741 was worse, and has been almost entirely forgotten. In Salem, twenty people were executed, compared to New York’s thirty-four, and none were burned at the stake. However much it looks like Salem in 1692, what happened in New York in 1741 had more to do with revolution than witchcraft. And it is inseparable from the wrenching crisis of the 1730s, not least because the fires in 1741 included attacks on property owned by key members of the Court Party; lawyers from both sides of the aisle in the legal battles of the 1730s joined together to prosecute slaves in 1741; and slaves owned by prominent members of the Country Party proved especially vulnerable to prosecution.

But the threads that tie together the crises of the 1730s and 1741 are longer than the list of participants. The 1741 conspiracy and the 1730s opposition party were two faces of the same coin. By the standards of the day, both faces were ugly, disfigured, deformed; they threatened the order of things. But one was very much more dangerous than the other: Alexander’s political party plotted to depose the governor; the city’s slaves, allegedly, plotted to kill him. The difference made Alexander’s opposition seem, relative to slave rebellion, harmless, and in doing so made the world safer for democracy, or at least, and less grandly, both more amenable to and more anxious about the gradual and halting rise of political parties.

Whether enslaved men and women actually conspired in New York in 1741 is a question whose answer lies buried deep in the evidence, if it survives at all. It is worth excavating carefully. But even the specter of a slave conspiracy cast a dark shadow across the political landscape. Slavery was, always and everywhere, a political issue, but what happened in New York suggests that it exerted a more powerful influence on political life: slaves suspected of conspiracy constituted both a phantom political party and an ever-threatening revolution. In the 1730s and ’40s, the American Revolution was years away and the real emergence of political parties in the new United States, a fitful process at best, would have to wait until the last decade of the eighteenth century. (Indeed, one reason that colonists only embraced revolution with ambivalence and accepted parties by fits and starts may be that slavery alternately ignited and extinguished party flames: the threat of black rebellion made white political opposition palatable, even as it established its limits and helped heal the divisions it created.) But during those fateful months in the spring and summer of 1741, New York’s Court Party, still reeling from the Country Party’s experiments in political opposition, attempted to douse party flames by burning black men at the stake. New York is not America, but what happened in that eighteenth-century slave city tells one story, and a profoundly troubling one, of how slavery destabilized—and created—American politics.

— Jill Lepore (2005), New York Burning: Liberty, Slavery, and Conspiracy in Eighteenth-Century Manhattan (ISBN 1400040299). xii–xviii.

Over My Shoulder #12: Michael Fellman (2002), The Making of Robert E. Lee

You know the rules. Here’s the quote. This is from Chapter 4 (Race and Slavery) of Michael Fellman’s The Making of Robert E. Lee (2000). Of course I’ve written about this before, in GT 2005-01-03: Robert E. Lee owned slaves and defended slavery. I picked up Fellman’s book as another source to consult over the relevant sections of WikiPedia:Robert E. Lee. The passage contains some new material that I hadn’t been aware of before. It also contains a couple of minor factual errors; see below.

No historian has established how many slaves Lee actually owned before 1857, or how much income he derived from this source. The more general point is that to some extent he was personally involved in slave owning his whole adult life, as was the norm for better-off Southerners, even those who did not own plantations. Unlike many other slaveholders in Baltimore, for example, he did not manumit his personal slaves while he lived in that city and, indeed, recoiled at the thought of losing them. He carried them back with him when he returned to Virginia.

When his father-in-law died, late in 1857, Lee was left with the job of supervising Arlington and the various other Custis estates, perhaps as many as three others. Moreover, the Custis will specified that these slaves be freed by January 1, 1863 {sic—see below —RG}; therefore Lee had the dual tasks of managing these slaves in the interim and then freeing them, immersing him in the contradictions of owning, protecting, and exploiting people of a different and despised race. It was very likely that the Custis slaves knew that they were to be freed, which could have only made Lee’s efforts to succor, discipline, and extract labor from them in the meantime considerably more difficult.

Faced with this set of problems, Lee attempted to hire an overseer. He wrote to his cousin Edward C. Turner, I am no farmer myself & do not expect to be always here. I wish to get an energetic honest farmer, who while he will be considerate & kind to the negroes, will be firm & make them do their duty. Such help was difficult to find or to retain, and despite himself Lee had to take a leave of absence from the army for two years to become a slave manager himself, one who doubtless tried to combine kindness with firmness but whose experience was altogether unhappy. Any illusions he may have had about becoming a great planter, which apparently were at least intermittent, dissipated dramatically as he wrestled with workers who were far less submissive to his authority than were enlisted men in the army. The coordination and discipline central to Lee’s role in the army proved less compatible with his role as manager of slaves than he must have expected.

Sometimes, the carrot and the stick both worked ineffectively. On May 30, 1858, Lee wrote his son Rooney, I have had some trouble with some of the people. Reuben, Parks & Edward, in the beginning of the previous week, rebelled against my authority—refused to obey my orders, & said they were as free as I was, etc., etc.—I succeeded in capturing them & lodged them in jail. They resisted till overpowered & called upon the other people to rescue them. Enlightened masters in the upper South often sent their rebellious slaves to jail, where the sheriff would whip them, presumably dispassionately, rather than apply whippings themselves. Whatever happened in the Alexandria jail after this event, less than two months later Lee sent these three men down under lock and key to the Richmond slave trader William Overton Winston, with instructions to keep them in jail until Winston could hire them out to good & responsible men in Virginia, for a term lasting until December 31, 1862, by which time the Custis will stipulated that they be freed. Lee also noted to Winston, in a rather unusual fashion, I do not wish these men returned here during the usual holy days, but to be retained until called for. He hoped to quarantine his remaining slaves against these three men, to whom the deprivation of the customary Christmas visits would be a rather cruel exile, though well short, of course, of being sold to the cotton fields of the Deep South. At the same time, Lee sent along three women house slaves to Winston, adding, I cannot recommend them for honesty. Lee was packing off the worst malcontents. More generally, as he wrote in exasperation to Rooney, who was managing one of the other Custis estates at the time, so few of the Custis slaves had been broken to hard work in their youth that it would be accidental to fall in with a good one.

This sort of snide commentary about inherent slave dishonesty and laziness was the language with which Lee expressed his racism; anything more vituperative and crudely expressed would have diminished his gentlemanliness. Well-bred men expressed caste superiority with detached irony, not with brutal oaths about niggers.

The following summer, Lee conducted another housecleaning of recalcitrant slaves, hiring out six more to lower Virginia. Two, George Wesley and Mary Norris {sic—see below —RG}, had absconded that spring but had been recaptured in Maryland as they tried to reach freedom in Pennsylvania.

As if this were not problem enough, on June 24, 1859, the New York Tribune published two letters that accused Lee—while calling him heir to the Father of this free country—of cruelty to Wesley and Norris {sic—see below —RG}. They had not proceeded far [north] before their progress was intercepted by some brute in human form, who suspected them to be fugitives. They were transported back, taken in a barn, stripped, and the men [sic] received thirty and nine lashes each [sic], from the hands of the slave-whipper … when he refused to whip the girl … Mr. Lee himself administered the thirty and nine lashes to her. They were then sent to the Richmond jail. Lee did not deign to respond to this public calumny. All he said at that time was to Rooney: The N.Y. Tribune has attacked me for the treatment of your grandfather’s slaves, but I shall not reply. He has left me an unpleasant legacy. Remaining in dignified silence then, Lee continued to be agonized by this accusation for the rest of his life. Indeed, in 1866, when the Baltimore American reprinted this old story, Lee replied in a letter that might have been intended for publication, the statement is not true; but I have not thought proper to publish a contradiction, being unwilling to be drawn into a newspaper discussion, believing that those who know me would not credit it; and those who do not, would care nothing about it. With somewhat less aristocratic detachment, Lee wrote privately to E. S. Quirk of San Francisco about this slander … There is not a word of truth in it. … No servant, soldier, or citizen that was ever employed by me can with truth charge me with bad treatment.

That Lee personally beat Mary Norris seems extremely unlikely, and yet slavery was so violent that it cast all masters in the roles of potential brutes. Stories such as this had been popularized earlier in the 1850s by Harriet Beecher Stowe in Uncle Tom’s Cabin, and they stung even the most restrained of masters, who understood that kindness alone would have been too indulgent, and corporal punishment (for which Lee substituted the euphemism firmness) was an intrinsic and necessary part of slave discipline. Although it was supposed to be applied only in a calm and rational manner, overtly physical domination of slaves, unchecked by law, was always brutal and potentially savage.

— Michael Fellman (2000), The Making of Robert E. Lee. New York: Random House. 64–67

No servant, soldier, or citizen that was ever employed by Robert E. Lee could with truth charge him with bad treatment. Except for having enslaved them.

The letters to the Trib are online at Letter from A Citizen (dated June 21, 1859) and Some Facts That Should Come to Light (dated June 19, 1859). Wesley Norris told his own story in 1866 after the war; it was printed in the National Anti-Slavery Standard on April 14, 1866.

Although Lee acted as if the will provided for him to keep the slaves until the last day of 1862, what Custis’s will actually said was And upon the legacies to my four granddaughters being paid, and my estates that are required to pay the said legacies, being clear of debts, then I give freedom to my slaves, the said slaves to be emancipated by my executors in such manner as to my executors may seem most expedient and proper, the said emancipation to be accomplished in not exceeding five years from the time of my decease. (Meaning that at the very latest the slaves should have been manumitted by October 10, 1862, the fifth anniversary of Custis’s death.) Fellman also seems to have misread the primary sources, which state that three slaves tried to leave in 1859 — Wesley Norris, Mary Norris, and a cousin whose name I haven’t yet been able to find. Mary and Wesley were the children of Sally Norris. It’s possible that Fellman misread a reference to a George, on the one hand, and Wesley and Mary Norris, on the other; in which case the third might have been George Clarke or George Parks. I’ll let you know if I find out more later.

Further reading

Over My Shoulder #11: Andrea Dworkin, Preface to the 1995 edition of Intercourse

You know the rules. Here’s the quote. After last week’s entry I’m running the risk of seeming as if I intend to use this gimmick as an outlet for all the Andrea Dworkin quotes that I find particularly apropos at the end of the week. I already have a running feature for that, but the fact is that other than fiction and material that I’m already transcribing for the Fair Use Repository, Dworkin’s most of what I’ve been reading for the past two weeks — in part as a result of a sometimes rather combative editing process over at WikiPedia:Andrea Dworkin, and in part because the stuff is nearly impossible to put down for long once you start reading parts of it. So rather than break the rules by picking up some item just to read it at the last minute to pick out another quote in the name of avoiding repetition, here we have some bus reading from earlier this afternoon: a passage from the Preface to the 1995 edition of Intercourse (first edition 1987).

My colleagues, of course, had been right; but their advice offended me. I have never written for a cowardly or passive or stupid reader, the precise characteristics of most reviewers—overeducated but functionally illiterate, members of a gang, a pack, who do their drive-by shootings in print and experience what they call the street at cocktail parties. I heard it on the street, they say, meaning a penthouse closer to heaven. It is no accident that most of the books published in the last few years about the decline and fall of Anglo-European culture because of the polluting effect of women of all races and some men of color—and there are a slew of such books—have been written by white-boy journalists. Abandoning the J-school ethic of who, what, where, when, how and the discipline of Hemingway’s lean, masculine prose, they now try to answer why. That decline and fall, they say, is because talentless, uppity women infest literature; or because militant feminists are an obstacle to the prorape, prodominance art of talented living or dead men; or because the multicultural reader—likely to be female and/or not white—values Alice Walker and Toni Morrison above Aristotle and the Marquis de Sade. Hallelujah, I say.

Intercourse is a book that moves through the sexed world of dominance and submission. It moves in descending circles, not in a straight line, and as in a vortex each spiral goes down deeper. Its formal model is Dante’s Inferno; its lyrical debt is to Rimbaud; the equality it envisions is rooted in the dreams of women, silent generations, pioneer voices, lone rebels, and masses who agitated, demanded, cried out, broke laws, and even begged. The begging was a substitute for retaliatory violence: doing bodily harm back to those who use or injure you. I want women to be done with begging.

The public censure of women as if we are rabid because we speak without apology about the world in which we live is a strategy of threat that usually works. Men often react to women’s words—speaking and writing—as if they were acts of violence; sometimes men react to women’s words with violence. So we lower our voices. Women whisper. Women apologize. Women shut up. Women trivialize what we know. Women shrink. Women pull back. Most women have experienced enough dominance from men—control, violence, insult, contempt—that no threat seems empty.

Intercourse does not say, forgive me and love me. It does not say, I forgive you, I love you. For a woman writer to thrive (or, arguably, to survive) in these current hard times, forgiveness and love must be subtext. No. I say no.

Can a man read Intercourse? Can a man read a book written by a woman in which she uses language without its ever becoming decorative or pretty? Can a man read a book written by a woman in which she, the author, has a direct relationship to experience, ideas, literature, life, including fucking, without mediation—such that what she says and how she says it are not determined by boundaries men have set for her? Can a man read a woman’s work if it does not say what he already knows? Can a man let in a challenge not just to his dominance but to his cognition? And, specifically, am I saying that I know more than men about fucking? Yes, I am. Not just different: more and better, deeper and wider, the way anyone used knows the user.

Intercourse does not narrate my experience to measure it against Norman Mailer’s or D. H. Lawrence’s. The first-person is embedded in the way the book is built. I use Tolstoy, Kobo Abe, James Baldwin, Tennessee Williams, Isaac Bashevis Singer, Flaubert not as authorities but as examples. I use them; I cut and slice into them in order to exhibit them; but the authority behind the book—behind each and every choice—is mine. In formal terms, then, Intercourse is arrogant, cold, and remorseless. You, the reader, will not be looking at me, the girl; you will be looking at them. In Intercourse I created an intellectual and imaginative environment in which you can see them. The very fact that I usurp their place—make them my characters—lessens the unexamined authority that goes not with their art but with their gender. I love the literature these men created; but I will not live my life as if they are real and I am not. Nor will I tolerate the continuing assumption that they know more about women than we know about ourselves. And I do not believe that they know more about intercourse. Habits of deference can be broken, and it is up to writers to break them. Submission can be refused; and I refuse it.

Of course, men have read and do read Intercourse. Many like it and understand it. Some few have been thrilled by it—it suggests to them a new possibility of freedom, a new sexual ethic: and they do not want to be users. Some men respond to the radicalism of Intercourse: the ideas, the prose, the structure, the questions that both underlie and intentionally subvert meaning. But if one’s sexual experience has always and without exception been based on dominance—not only overt acts but also metaphysical and ontological assumptions—how can one read this book? The end of male dominance would mean—in the understanding of such a man—the end of sex. If one has eroticized a differential in power that allows for force as a natural and inevitable part of intercourse, how could one understand that this book does not say that all men are rapists or that all intercourse is rape? Equality in the realm of sex is an antisexual idea if sex requires dominance in order to register as sensation. As sad as I am to say it, the limits of the old Adam—and the material power he still has, especially in publishing and media—have set limits on the public discourse (by both men and women) about this book.

In general women get to say yea or nay to intercourse, which is taken to be a synonym for sex, echt sex. In this reductive brave new world, women like sex or we do not. We are loyal to sex or we are not. The range of emotions and ideas expressed by Tolstoy et al. is literally forbidden to contemporary women. Remorse, sadness, despair, alienation, obsession, fear, greed, hate—all of which men, especially male artists, express—are simple no votes for women. Compliance means yes; a simplistic rah-rah means yes; affirming the implicit right of men to get laid regardless of the consequences to women is a yes. Reacting against force or exploitation means no; affirming pornography and prostitution means yes. I like it is the standard for citizenship, and I want it pretty much exhausts the First Amendment’s meaning for women. Critical thought or deep feeling puts one into the Puritan camp, that hallucinated place of exile where women with complaints are dumped, after which we can be abandoned. Why—socially speaking—feed a woman you can’t fuck? Why fuck a woman who might ask questions let alone have a complex emotional life or a political idea? I refuse to tolerate this loyalty-oath approach to women and intercourse or women and sexuality or, more to the point, women and men. …

—Andrea Dworkin (1995), Preface to the 1995 edition of Intercourse, pp. vii-x.

This may help to shed some light, from a few different directions, on long-standing discussions on this site and elsewhere.

Over My Shoulder #10: Andrea Dworkin’s Preface to the British Edition of Right-wing Women

You know the rules. Here’s the quote. This is from Andrea Dworkin’s Preface to the British edition of Right-wing Women (1983). It’s reprinted for American readers in Letters from a War Zone, pp. 185-194. I re-read the essay (along with a great deal of Andrea Dworkin’s stuff) in the process of following citations and culling material for expansions to WikiPedia: Andrea Dworkin — partly on its own merits, and partly because I’ve had to spend some time on it dealing with crusading anti-Dworkin editor / vandals. This is unrelated to anything that was under discussion in the article, but it caught my eye as I was flipping through, so I slowed down to re-read it in full:

The political concepts of Right and Left could not have originated in England or the United States; they come out of the specificity of the French experience. They were born in the chaos of the first fully modern revolution, the French Revolution, in reaction to which all Europe subsequently redefined itself. As a direct result of the French Revolution, the political face of Europe changed and so did the political discourse of Europeans. One fundamental change was the formal division of values, parties, and programs into Right and Left—modern alliances and allegiances emerged, heralded by new, modern categories of organized political thought. What had started in France’s National Assembly as perhaps an expedient seating arrangement from right to left became a nearly metaphysical political construction that swept Western political consciousness and practice.

In part this astonishing development was accomplished through the extreme reaction against the French Revolution embodied especially in vitriolic denunciations of it by politicians in England and elsewhere committed to monarchy, the class system, and the values implicit in feudalism. Their arguments against the French Revolution and in behalf of monarchy form the basis for modern right-wing politics, or conservatism. The principles of organized conservatism, in social, economic, and moral values, were enunciated in a great body of reactionary polemic, most instrumentally in the English Whig Edmund Burke’s Reflections on the Revolution in France. Written in 1789 before the ascendancy of the Jacobins—and therefore not in response to the Terror or to Jacobin ideological absolutism—Burke’s Reflections is suffused with fury at the audacity of the Revolution itself because this revolution uniquely insisted that political freedom required some measure of civil, economic, and social equality. The linking of freedom with equality philosophically or programmatically remains anathema to conservatives today. Freedom, according to Burke, required hierarchy and order. That was his enduring theme.

I flatter myself, Burke wrote, that I love a manly, moral, regulated liberty. Manly liberty is bold, not effeminate or timorous (following a dictionary definition of the adjective manly). Manly liberty (following Burke) has a king. Manly liberty is authoritarian: the authority of the king—his sovereignty—presumably guarantees the liberty of everyone else by arcane analogy. Moral liberty is the worship of God and property, especially as they merge in the institutional church. Moral liberty means respect for the authority of God and king, especially as it manifests in feudal hierarchy. Regulated liberty is limited liberty: whatever is left over once the king is obeyed, God is worshipped, property is respected, hierarchy is honored, and the taxes or tributes that support all these institutions are paid. The liberty Burke loved particularly depended on the willingness of persons not just to accept but to love the social circumstances into which they were born: To be attached to the subdivision, to love the little platoon we belong to in society, is the first principle (the germ as it were) of public affections. It is the first link in the series by which we proceed towards a love to our country and mankind. The French rabble had noticeably violated this first principle of public affections.

To Burke, history showed that monarchy and the rights of Englishmen were completely intertwined so that the one required the other. Because certain rights had been exercised under monarchy, Burke held that monarchy was essential to the exercise of those rights. England had no proof, according to Burke, that rights could exist and be exercised without monarchy. Burke indicted political theorists who claimed that there were natural rights of men that superseded in importance the rights of existing governments. These theorists have wrought under-ground a mine that will blow up, at one grand explosion, all examples of antiquity, all precedents, charters, and acts of parliament. They have rights of men. Against these there can be no prescription… I have nothing to say to the clumsy subtility of their political metaphysicks. In Burke’s more agile metaphysics, hereditary rights were transmitted through a hereditary crown because they had been before and so would continue to be. Burke provided no basis for evaluating the quality or fairness of the rights of the little platoon we belong to in society as opposed to the rights of other little platoons: to admit such a necessity would not be loving our little platoon enough. The hereditary crown, Burke suggests, restrains dictatorship because it gives the king obeisance without making him fight for it. It also inhibits civil conflict over who the ruler will be. This is as close as Burke gets to a substantive explanation of why rights and monarchy are inextricably linked.

—Andrea Dworkin (1983), Preface to the British Edition of Right-wing Women, reprinted in Letters from a War Zone, 187—189.

For some similar points, partly influenced by Dworkin’s comments here and elsewhere in the preface, see GT 2005-02-03: By George, I think he’s got it!

Over My Shoulder #9: Arthur C. Danto’s Staring at the Sea

You know the rules. Here’s the quote. This is from Staring at the Sea, Arthur C. Danto’s review of an exhibition of Édouard Manet’s marine paintings at the Philadelphia Museum of Art, in one of my piled-up back issues of The Nation, from April 2004 (pp. 34—37). (I note in passing that The Nation is one of the few establishment leftist rags worth keeping around for nearly two years; mainly because most issues have one or two reviews like this one.)

Toward the end of January, I received an invitation to a press opening for Manet and the Sea, at the Philadelphia Museum of Art. It reproduced a painting of people on a beach, taking the sea air. The scene was as fresh as the air itself, bringing a virtual whiff of saltwater, a feeling of sunshine and physical happiness, and of the freedom and adventure the mere thought of the ocean awakens. In part because of the harsh cold we had all been enduring, in part because of the surge of pleasure French painting of that era always induces, I simply forgave the phrase in the press release (The artist and 8 contemporaries chart a new course toward pure painting) and resolved to fuir là-bas—flee down there, to cite Mallarmé’s great poem Sea Breeze—even if là-bas was Philadelphia in February rather than Boulogne-sur-Mer in August.

The chief problem of the press description is that it invites us to view the show as pointing the way to pure painting, whatever that is, instead of situating the works in the art world of their time. Manet’s 1868 Beach at Boulogne, with the lightness, the clarity, the sense of life at its best, conveyed by the loosely sketched disjunction of holidaymakers surrendering to simple summer enjoyments more than a century ago—promenading under parasols, peering at seashells, wading, gazing at the passing boats, riding a docile donkey, playing in the sand—is a wonderful work in itself. It is not a finished tableau but preserves the quality of a sketch, however intensely Manet may have worked on it; it is clear, just from looking at it, that he transcribed onto the canvas pictorial notations from his sketchbooks, drawn on the spot. It resembles a horizontal scroll, with the kind of spontaneously drawn figures the Japanese master Hokusai distributed across a sheet for one of his booklets. The figures have little to do with one another, without that implying, as a wall text suggests, a proposition regarding the loneliness of modern life. Who really cares what in the twentieth century it heralds? Who really cares about pure painting when one stands in front of it?

Writing of one of Manet’s masterpieces, Déjeuner sur l’herbe, a hostile critic once observed that his paintings had the quality of rebuses. A rebus is a kind of puzzle in which pictures are juxtaposed that have nothing obvious to do with one another. One solves a rebus by pronouncing the names of the objects the pictures show, producing a coherent message. Freud thought the images in a dream have the apparent dislogic of a rebus, and there is a sense in which The Beach at Boulogne has the quality of a dream, with the difference that there is no organizing interpretation to seek. The beach and the sea beyond it have an essential emptiness, with people dotted here and there on the one and boats dashed here and there on the other. It is not a Salon picture, like most of the paintings most of us know by Manet. It feels as if it were made for pleasure and to give pleasure, rather than for the heroic purpose of creating Modernism.

—Arthur C. Danto, Staring at the Sea, in The Nation, 19 April 2004, p. 34.

Over my shoulder #8: Susan Brownmiller’s In Our Time: Memoir of a Revolution

You know the rules. Here’s the quote. This is from Susan Brownmiller’s In Our Time: Memoir of a Revolution, which I’ve been re-reading in parts recently, as a source for WikiPedia contributions on Andrea Dworkin and a new entry on Women Against Pornography. I mention, off to one side, that things are often more complicated than they seem, and that this is relevant to one of the questions that Roderick and I most frequently get on our qualified defense of Andrea Dworkin and Catharine MacKinnon, and our passing comments about anti-pornography radical feminism, in our paper on libertarian feminism.

Brownmiller has been discussing the fights over municipal anti-pornography civil rights ordinances authored by Dworkin and MacKinnon in 1983-1984.

Andrea mailed me a copy of the ordinance on December 29, the day before it passed by one vote in the city council. I hadn’t even known that she and MacKinnon were in Minneapolis and working on legislation, but on reading the bill I quickly concluded that it was unworkable—full of overblown rhetoric, overly broad and vague intentions, tricky and convoluted legal locutions. Any court in the land, I believed, would find it unconstitutional, an observation I offered in my usual blunt manner when Andrea called a few days later to get my endorsement.

I assured her I would not go public with my negative opinion. I still cared tremendously about the issue, and for all its flaws, I figured the ordinance might be a valuable consciousness-raiser and organizing tool. In a bad lapse of political judgment, I failed to perceive how it would polarize an already divided feminist community by providing an even better organizing tool for the opposition. Not that what I thought mattered at that point. I had ceded leadership in antipornography work to those willing to carry it forward when I’d retreated to finish my book on femininity, just then reaching bookstores after a very long haul.

Few people noticed my absence from the national list of ordinance supporters. Gloria Steinem, Robin Morgan, Phyllis Chesler, and the new leadership of Women Against Pornography had already sent Dworkin and MacKinnon their glowing commendations. I thought it was fucking brilliant, Robin Morgan remembers, just brilliant the way they circumvented the criminal statutes and obscenity codes identified with the right wing, and took a new path through the concept of harm and civil rights discrimination. Robin, coiner of the slogan Pornography is the theory, rape is the practice, did not see any constitutional problem. If I had, she concedes, I doubt that it would have affected my position.

The ordinance was vetoed within days of its passage by Mayor Donald Fraser, who maintained that the city did not have the financial resources to defend the law’s constitutionality in court. Seven months later it came up before the council again, with minor modifications. This time around, pornography was defined only as a contributory factor, not central to the subordination of women. Dorchen Leidholdt flew to Minneapolis to help with a petition drive. Upon her return, she persuaded Women Against Pornography to contribute a few thousand dollars from its dwindling treasury to the effort.

The switch from a plucky, inventive campaign to educate the public about pornography’s dangers to the promotion of new legislation was a huge change in direction for WAP, although given the times, it was probably inevitable. Mehrhof and Alexander, the last of WAP’s original full-time organizers, had already resigned, needing a more reliable weekly paycheck than antipornography work could offer. Increasingly frustrated, the remaining activists had lost their faith in the powers of hand-cranked slide shows and hastily organized protest demonstrations to curb a phenomenal growth industry which was taking advantage of the latest technologies (pre-Internet) to create a multibillion dollar X-rated home video market, Dial-a-Porn, and public-access television channels.

Although WAP backed the ordinance, other antiporn groups were not so sanguine about it. In Washington, political scientist Janet Gornick recalls, the ordinance split her group, Feminists Against Pornography, right down the middle, and ultimately she resigned. We were black and white, lesbian and straight, and almost every one of us had been a victim of sexual violence, says Gornick, whose own activism had started six years earlier, after she was stabbed on the street, dragged twenty feet, and raped a block away from the Harvard campus in a crime that was never solved. FAP was doing very daring direct-action things in addition to the usual slide shows and Take Back the Nights, she relates. We were waging a small war against the Fourteenth Street porn strip north of the White House. But the minute I heard about Minneapolis, I knew it was a strategic catastrophe. It broke my heart. Before then we’d always maintained that we weren’t for new legislation, that we weren’t trying to ban anything. Some of our younger members just couldn’t comprehend that very committed feminists—our elders, our leaders, who were pulling us along by their rhetoric—could make such a big mistake that would lead the movement astray.

… The decision to ally herself with FACT and against the ordinance had come only after some tortured soul-searching by [Adrienne] Rich, whose previous expressions of faith in Andrea Dworkin had attributed to her leadership the greatest depth and grasp. In a special statement for off our backs, optimistically titled We Don’t Have to Come Apart over Pornography, the activist poet wrote, I am less sure than Dworkin and MacKinnon that this is a time when further powers of suppression should be turned over to the State. The lawyer and writer Wendy Kaminer, another early WAP member, went public with her opposition to the ordinance a year or so later.

—Susan Brownmiller, In Our Time: Memoir of a Revolution (1999). 319-322.

Over My Shoulder #7: Allan Bloom’s Giants and Dwarfs

You know the rules. Here’s the quote. This is from Allan Bloom’s Giants and Dwarfs: An Outline of Gulliver’s Travels, as reprinted in Giants and Dwarfs: Essays 1960—1990. I add only an emphatic reminder of Rule 4, Quoting a passage doesn’t entail endorsement of what’s said in it. Sometimes I agree and sometimes I don’t. Whether I do or not isn’t really the point of the exercise anyway.

… And we may further suppose that Gulliver has certain hidden thoughts and intentions which are only to be revealed by closely cross-examining him. He indicates this himself at the close of his travels when he swears to his veracity. He uses for this solemn occasion Sinon’s treacherous oath to the Trojans, by means of which that worthy managed to gain admission for the horse and its concealed burden of Greeks.

I should like to suggest that this book is also such a container, filled with Greeks who are, once introduced, destined to conquer a new Troy, or, translated into the little language, destined to conquer Lilliput. In other words, I wish to contend that Gulliver’s Travels is one of the last explicit statements in the famous Quarrel between the Ancients and the Moderns and perhaps the greatest intervention in that notorious argument. By means of the appeal of its myth, it keeps alive the classical vision in ages when even the importance of the quarrel is denied, not to speak of the importance of that classical viewpoint, which appears to have been swamped by history. The laughter evoked by Gulliver’s Travels is authorized by a standard drawn from Homer and Plato.

Prior to entering directly into the contents of the book, I should try to make this assertion somewhat more extrinsically plausible. The quarrel itself is today regarded as a petty thing, rather ridiculous on both sides, a conventional debate between old and new, reactionary and progressive, which later ages have resolved by way of synthesis. Both sides lacked perspective; intellectual history is but one long continuous development. Moreover, the quarrel is looked on largely as a purely literary dispute, originating in the comparison of Greek and Roman poetry with French. Now this understanding is quite different from that of the participants, who, if not always the best judges, must be the first witnesses in any hearing. They understood the debate over poetry to be a mere subdivision of an opposition between two comprehensive systems of radically opposed thought, one finding its source in ancient philosophy, the other in modern philosophy. The moderns believed that they had found the true principles of nature, and that, by means of their methods, new sources of power could be found in physical nature, politics, and the arts. These new principles represented a fundamental break with classical thought and were incompatible with it. The poetic debate was meant, on the part of the advocates of modernity, only to show the superiority of modern thought based on modern talents and modern freedom in the domain where the classics were most indisputably masters and models. The quarrel involved the highest principles about the first causes of all things and the best way of life. It marked a crossroad, one of the very few at which mankind has been asked to make a decisive change in direction. The choice once made, we have forgotten that this was not the only road, that there was another one before us, either because we are ignorant of a possible choice or because we are so sure that this is the only road to Larissa. 
It is only by return to our starting point that the gravity of the choice can be realized; and at that crossroad one finds the quarrel. It is not, I repeat, a quarrel among authors as such, but among principles.

In his own way, Swift presents and contrasts those principles. He characterizes ancient philosophy as a bee whose wings produce music and flight and who thus visits all the blossoms of the field and garden … and in collecting from them enriches himself without the least injury to their beauty, their smell, or their taste. This bee is opposed to a house-building spider, who thinks he produces his own world from himself and is hence independent, but who actually feeds on filth and produces excrement. As the bee says, So, in short, the question comes all to this; whether is the nobler being of the two, that by a lazy contemplation of four inches round, by an overweening pride, feeding and engendering on itself, turns all into excrement or venom, producing nothing at all, but flybane and a cobweb; or that which by a universal range, with long search, much study, true judgment, and distinction of things, brings honey and wax.

This description is drawn from one of Swift’s earliest writings, The Battle of the Books. Gulliver’s Travels was one of his latest. Throughout his life Swift saw the Quarrel between the Ancients and the Moderns as the issue in physics, poetry, and politics, and it is in the light of it that he directed his literary career and his practical life. The quarrel is the key to the diverse strands of this various man; his standards of judgment are all classical; his praise and blame are always in accord with that of Plato. He learned how to live within his own time in the perspective of an earlier one. Swift, the Tory and the High Churchman, was a republican and a nonbeliever.

Gulliver’s Travels is always said to be a satire, and there is no reason to quarrel with this designation. But it is not sufficient, for satire is concerned with a view to what is serious and ridiculous, good and bad. It is not enough to say that human folly is ridiculed; what was folly to Aristophanes would not have seemed so to Tertullian, and conversely. If the specific intention of the satire is not uncovered, the work is trivialized. Swift intended his book to instruct, and the character of that instruction is lost if we do not take seriously the issues he takes seriously. But we do not even recognize the real issues in the Quarrel, let alone try to decide which side had the greatest share of truth. In our time, only Leo Strauss has provided us with the scholarship and the philosophic insight necessary to a proper confrontation of ancients and moderns, and hence his works are the prolegomena to a recovery of Swift’s teaching. Swift’s rejection of modern physical and political science seems merely ill-tempered if not viewed in relation to a possible alternative, and it is Leo Strauss who has elaborated the plausibility, nay, the vital importance, of that alternative. Now we are able to turn to Swift, not only for amusement but for possible guidance as to how we should live. Furthermore, Swift’s art of writing explicitly follows the rhetorical rules for public expression developed by the ancients, of which we have been reminded by Professor Strauss. The rhetoric was a result of a comprehensive reflection about the relation between philosophy and politics, and it points to considerations neglected by the men of letters of the Enlightenment. Gulliver’s Travels is in both substance and form a model of the problems which we have been taught to recognize as our own by Leo Strauss.

—Allan Bloom, Giants and Dwarfs: An Outline of Gulliver’s Travels (1964), reprinted in Giants and Dwarfs: Essays 1960–1990 (1990). 35–38.

Over My Shoulder #6: Oliver Sacks’s Seeing Voices

You know the rules. Here’s the quote. This is from Oliver Sacks’s Seeing Voices: A Journey into the World of the Deaf (1989). I broke the rules a bit here: rather than a single passage of a few paragraphs, I have two, because the latter one reinforces one of the important points of the former, and also because it’s damn near impossible to pick out any one thing that is the most interesting from the chapter. So here goes:

The situation of the prelingually deaf, prior to 1750, was indeed a calamity: unable to acquire speech, hence dumb or mute; unable to enjoy free communication with even their parents and families; confined to a few rudimentary signs and gestures; cut off, except in large cities, even from the community of their own kind; deprived of literacy and education, all knowledge of the world; forced to do the most menial work; living alone, often close to destitution; treated by the law and society as little better than imbeciles—the lot of the deaf was manifestly dreadful.

But what was manifest was as nothing to the destitution inside—the destitution of knowledge and thought that prelingual deafness could bring, in the absence of any communication or remedial measures. The deplorable state of the deaf aroused both the curiosity and the compassion of the philosophes. Thus the Abbé Sicard asked:

Why is the uneducated deaf person isolated in nature and unable to communicate with other men? Why is he reduced to this state of imbecility? Does his biological constitution differ from ours? Does he not have everything he needs for having sensations, acquiring ideas, and combining them to do everything that we do? Does he not get sensory impressions from objects as we do? Are these not, as with us, the occasion of the mind’s sensations and its acquired ideas? Why then does the deaf person remain stupid while we become intelligent?

To ask this question—never really clearly asked before—is to grasp its answer, to see that the answer lies in the use of symbols. It is, Sicard continues, because the deaf person has no symbols for fixing and combining ideas … that there is a total communication-gap between him and other people. But what was all-important, and had been a source of fundamental confusion since Aristotle’s pronouncements on the matter, was the enduring misconception that symbols had to be speech. Perhaps indeed this passionate misperception, or prejudice, went back to biblical days: the subhuman status of mutes was part of the Mosaic code, and it was reinforced by the biblical exaltation of voice and ear as the one and true way in which man and God could speak (In the beginning was the Word). And yet, overborne by Mosaic and Aristotelian thunderings, some profound voices intimated that this need not be so. Thus Socrates’ remark in the Cratylus of Plato, which so impressed the youthful Abbé de l’Epée:

If we had neither voice nor tongue, and yet wished to manifest things to one another, should we not, like those which are at present mute, endeavour to signify our meaning by the hands, head, and other parts of the body?

Or the deep, yet obvious, insights of the philosopher-physician Cardan in the sixteenth century:

It is possible to place a deaf-mute in a position to hear by reading, and to speak by writing … for as different sounds are conventionally used to signify different things, so also may the various figures of objects and words …. Written characters and ideas may be connected without the intervention of actual sounds.

In the sixteenth century the notion that the understanding of ideas did not depend upon the hearing of words was revolutionary.

But it is not (usually) the ideas of philosophers that change reality; nor, conversely, is it the practice of ordinary people. What changes history, what kindles revolutions, is the meeting of the two. A lofty mind—that of the Abbé de l’Epée—had to meet a humble usage—the indigenous sign language of the poor deaf who roamed Paris—in order to make possible a momentous transformation. If we ask why this meeting had not occurred before, it has something to do with the vocation of Abbé, who could not bear to think of the souls of the deaf-mute living and dying unshriven, deprived of the Catechism, the Scriptures, the Word of God; and it is partly owing to his humility—that he listened to the deaf—and partly to a philosophical and linguistic idea then very much in the air—that of universal language, like the speceium of which Leibniz dreamed. Thus, de l’Epée approached sign language not with contempt but with awe.

The universal language that your scholars have sought for in vain and of which they have despaired, is here; it is right before your eyes, it is the mimicry of the impoverished deaf. Because you do not know it, you hold it in contempt, yet it alone will provide you with the key to all languages.

That this was a misapprehension—for sign language is not a universal language in this grand sense, and Leibniz’s noble dream was probably a chimera—did not matter, was even an advantage. For what mattered was that the Abbé paid minute attention to his pupils, acquired their language (which had scarcely ever been done by the hearing before). And then, by associating signs with pictures and written words, he taught them to read; and with this, in one swoop, he opened to them the world’s learning and culture. De l’Epée’s system of methodical signs—a combination of their own Sign with signed French grammar—enabled deaf students to write down what was said to them through a signing interpreter, a method so successful that, for the first time, it enabled ordinary deaf pupils to read and write French, and thus acquire an education. His school, founded in 1755, was the first to achieve public support. He trained a multitude of teachers for the deaf, who, by the time of his death in 1789, had established twenty-one schools for the deaf in France and Europe. The future of de l’Epée’s own school seemed uncertain during the turmoil of the revolution, but by 1791 it had become the National Institution for Deaf-Mutes in Paris, headed by the brilliant grammarian Sicard. De l’Epée’s own book, as revolutionary as Copernicus’ in its own way, was first published in 1776.

De l’Epée’s book, a classic, is available in many languages. But what have not been available, have been virtually unknown, are the equally important (and, in some ways, even more fascinating) original writings of the deaf—the first deaf-mutes ever able to write. Harlan Lane and Franklin Philip have done a great service in making these so readily available to us in The Deaf Experience. Especially moving and important are the 1779 Observations of Pierre Desloges—the first book to be published by a deaf person—now available in English for the first time. Desloges himself, deafened at an early age, and virtually without speech, provides us first with a frightening description of the world, or unworld, of the languageless.

At the beginning of my infirmity, and for as long as I was living apart from other deaf people … I was unaware of sign language. I used only scattered, isolated, and unconnected signs. I did not know the art of combining them to form distinct pictures with which one can represent various ideas, transmit them to one’s peers, and converse in logical discourse.

Thus Desloges, though obviously a highly gifted man, could scarcely entertain ideas, or engage in logical discourse, until he had acquired sign language (which, as is usual with the deaf, he learned from someone deaf, in his case from an illiterate deaf-mute).

—Oliver Sacks (1989), Seeing Voices: A Journey into the World of the Deaf, pp. 13—18.

And:

When Laurent Clerc (a pupil of Massieu, himself a pupil of Sicard) came to the United States in 1816, he had an immediate and extraordinary impact, for American teachers up to this point had never been exposed to, never even imagined, a deaf-mute of impressive intelligence and education, had never imagined the possibilities dormant in the deaf. With Thomas Gallaudet, Clerc set up the American Asylum for the Deaf, in Hartford, in 1817. As Paris—teachers, philosophes, and public-at-large—was moved, amazed, converted by de l’Epée in the 1770s, so America was to be converted fifty years later.

The atmosphere at the Hartford Asylum, and at other schools soon to be set up, was marked by the sort of enthusiasm and excitement only seen at the start of grand intellectual and humanitarian adventures. The prompt and spectacular success of the Hartford Asylum soon led to the opening of other schools wherever there was sufficient density of population, and thus of deaf students. Virtually all the teachers of the deaf (nearly all of whom were fluent signers and many of whom were deaf) went to Hartford. The French sign system imported by Clerc rapidly amalgamated with the indigenous sign languages here—the deaf generate sign languages wherever there are communities of deaf people; it is for them the easiest and most natural form of communication—to form a uniquely expressive and powerful hybrid, American Sign Language (ASL). A special indigenous strength—presented convincingly by Nora Ellen Groce in her book, Everyone Here Spoke Sign Language—was the contribution of Martha’s Vineyard deaf to the development of ASL. A substantial minority of the population there suffered from a hereditary deafness, and most of the island had adopted an easy and powerful sign language. Virtually all the deaf of the Vineyard were sent to the Hartford Asylum in its formative years, where they contributed to the developing national language the unique strength of their own.

One has, indeed, a strong sense of pollination, of people coming to and fro, bringing regional languages, with all their idiosyncrasies and strengths, to Hartford, and taking back an increasingly polished and generalized language. The rise of deaf literacy and deaf education was as spectacular in the United States as it had been in France, and soon spread to other parts of the world.

Lane estimates that by 1869 there were 550 teachers of the deaf worldwide and that 41 percent of the teachers of the deaf in the United States were themselves deaf. In 1864 Congress passed a law authorizing the Columbia Institution for the Deaf and the Blind in Washington to become a national deaf-mute college, the first institution of higher learning specifically for the deaf. Its first principal was Edward Gallaudet—the son of Thomas Gallaudet, who had brought Clerc to the United States in 1816. Gallaudet College, as it was later rechristened (it is now Gallaudet University), is still the only liberal arts college for deaf students in the world—though there are now several programs and institutes for the deaf associated with technical colleges. (The most famous of these is at the Rochester Institute of Technology, where there are more than 1,500 deaf students forming the National Technical Institute for the Deaf.)

The great impetus of deaf education and liberation, which had swept France between 1770 and 1820, thus continued its triumphant course in the United States until 1870 (Clerc, immensely active to the end and personally charismatic, died in 1869). And then—and this is the turning point in the entire story—the tide turned, turned against the use of Sign by and for the deaf, so that within twenty years the work of a century was undone.

Indeed, what was happening with the deaf and sign was part of a general (and if one wishes, political) movement of the time: a trend to Victorian oppressiveness and conformism, intolerance of minorities, and minority usages, of every kind—religious, linguistic, ethnic. Thus it was at this time that the little nations and little languages of the world (for example, Wales and Welsh) found themselves under pressure to assimilate and conform.

—Oliver Sacks (1989), Seeing Voices: A Journey into the World of the Deaf, pp. 21—24.