
LIBERAL BETRAYAL of AMERICA
and the
TEA PARTY FIRESTORM
How the Student Riots of the Sixties Generated a Civil War to Destroy A Great Nation
WILLIAM DAVIS EATON
Copyright © 2010 William Davis Eaton
All rights reserved.
To Richard and Arlene Heath and Renée G. Eaton
Preface

On Income Tax Day, April 15, 2009, multitudes of people across the United States from all races, creeds, and political convictions, from cities, towns, and countryside, threw a Tea Party. This remarkable and spontaneous cross section of America came to express anger at their betrayal, and fear around the kitchen table. They see their American Dream dissolving into a nightmare of terrifying uncertainty. These Tea Party Americans want to know how it is that an administration of proclaimed liberalism is systematically destroying fundamental American values and institutions. These loyal Americans have come to understand, reluctantly, and then angrily, that their own government is waging war against their liberty and everything else their country stands for and has shown to the world. The Tea Party firestorm is lit to shine the light of liberty on the truth of how deeply, how profoundly, the “liberals” now in power have turned against their country and their own ideals. It is to lay the foundation for their defeat.

Liberalism has both a political and an economic history. In both aspects liberalism has undergone a remarkable transformation in the last half-century or so. In his book The Liberal Imagination, published in 1950, American author and critic Lionel Trilling termed liberalism the only viable philosophical and literary tradition. Trilling, often cited as the preeminent cultural commentator of his time, saw liberalism as “a political position that affirmed the value of individual existence in all its variety, complexity, and difficulty.” Trilling called liberalism so understood “not only the dominant but even the sole intellectual tradition.”

The Scottish philosopher Adam Smith formulated the classic principles of economic liberalism in his book The Wealth of Nations, published in 1776. These principles include private property, the rule of law, limited government, and the free market economy. Curiously, it was in the same year, 1776, that the American Declaration of Independence proclaimed the God-given rights of all men to include the right to “Life, Liberty, and the pursuit of Happiness.”

Two centuries later the 1976 Nobel Prize economist Milton Friedman shows in his book Free to Choose (1980) that the energizing elements of both kinds of classical liberalism are in steep decline. An “ever bigger government,” he warns, threatens “to destroy both the prosperity that we owe to the free market, and the human freedom proclaimed so eloquently in the Declaration of Independence.” Finally liberalism, in both the political and economic sense, has behaved like a man performing a slow half-somersault who ends up standing on his head. Turned upside down, liberalism has steadily emptied its pockets of America’s founding principles of economic freedom and individual liberty that once defined it.

The rebellion of the new liberalism, which some now call progressivism, began with the 1960s riots against authority on hundreds of college campuses. As the original rebels of the sixties graduated into society, they and their progeny of the next generation began a radical ideological and political assault against the entire American tradition. They entered upon what is often termed their “long march” through American institutions. One of the more remarkable successes of this long march has been its gradual conquest of policy-making positions within the Democratic Party. From that stronghold “progressive” liberals have been able to radicalize the Party and to use it toward achieving their goal of power and domination.
Former Maryland Governor Robert Ehrlich sees the Democratic Party of the 21st century as a different animal from its former self: “It’s a hard-Left, AFSCME (public employee unions), trial-lawyer, teachers union party, and they play for keeps, unlike business.” Novelist Allen Drury perceives that liberalism itself has been transformed into a “rigid, ruthless, intolerant, and unyielding orthodoxy.” Author and Yale professor of computer science David Gelernter finds that the resulting confrontation between the rebels and American society “is turning into a full-fledged war.” And so it has.

America is once again “engaged in a great civil war, testing whether that nation, or any nation so conceived and so dedicated can long endure.” Fortunately this new Civil War is not marked (as yet) by such bloody battlefields. In the Civil War that ended in 1865 the Confederates of the South fought simply for independence from the Union. There was no intent to transform the culture or structure of the North, as the new Civil War intends to do for the entire nation.

In its earlier stages what was to become a Civil War was commonly called a culture war, and had no centrally organized command or purpose. Each contingent attacked on its own front in its own way. But the separate insurgencies all shared the inherent goal of civil war, which is to weaken or dissolve existing social values and institutions and transform society.

Gradually the liberal assault against America has heavily infiltrated or captured the public schools, most of the print and broadcast media, the arts, the universities, the environmental movement, organized labor, leading elements of science, and much of the judiciary and the federal bureaucracy. The insurrection has destroyed or corrupted supporting ideals and institutions affecting religion, sexual relations, the family, the rule of law, how words are used, and patriotic loyalty. Underpinning the Civil War, and adding strength to it, are persistent efforts in core scientific disciplines to denigrate and even mock the value of human life.

The rebels promote massive growth of government power at the expense of individual liberty. Knowing that its true intentions lack support, the rebellion is intolerant of opposing speech, writing, or broadcasting, which it seeks aggressively to suppress.

The culminating assault of the Civil War was carried out in the election of 2008, its true intent concealed behind a glittering facade of “Hope” and “Change.” When the political branches of the federal government were decisively captured in that election, the insurgents achieved a centrally controlled national base for their revolution. In a sophisticated “bait and switch” maneuver the agents of this “full-fledged war” against America moved rapidly from idealistic campaign rhetoric to the consolidation of raw power. Destruction of the America we have known—economically, politically, and morally—is well under way. It is the horrifying specter of America destroyed that ignites the firestorm response of the Tea Parties, and of the millions more true Americans who have been awakened to strike back in defense of liberty and country.

The engine that powers this assault against America was assembled and set in motion at American colleges and universities during the 1964-1965 academic year. A well-planned uprising led by a few thousand students was carried out on hundreds of campuses across the country. It was a rebellion calculated to challenge the legitimacy of authority on each campus attacked, and by extension the authority of society as a whole. The target selected for the first strike was the Berkeley campus of the University of California.

Liberalism today is anti-American revolutionaries on a tear, tearing down the structure of our democratic society and free enterprise economy. The true purpose of its leaders is their drive for power; the power to command and control the daily lives of the American people. The essence of the destruction being wrought, an understanding of how this has come about, a feeling for the horror of the intended results, and what might be done about it are the subject of this book.

I. A Declaration of War
1. Opening Shots
Revolution

In the summer of 1964 Clark Kerr, President of the multi-campus University of California system, received an alert warning him that radical student groups across the country were planning a concerted, nationwide uprising. There were to be protests and demonstrations on hundreds of campuses to challenge and disrupt campus authority during the upcoming 1964-1965 academic year. The Berkeley campus of the University of California had been designated as the leadoff target. The plan for Berkeley was to form a rebellion of overwhelming strength and support sufficient to force from office both the Chancellor of the Berkeley campus, Edward Strong, and the University President, Clark Kerr. President Kerr and Chancellor Strong were accustomed to juvenile eruptions on campus and took the alert they had received to predict nothing more than the same passionate, idealistic, unfocused student radicalism they had seen before. It would be noisy, senseless, mostly harmless, and something that would die out of its own accord. Their judgment could not have been further off the mark.

On the morning of December 3, 1964, the campus administrative hierarchy gathered in the Chancellor’s Suite in Sproul Hall, the Berkeley campus administration building, to consider how to deal with an escalating student rebellion.

“Hey! Hey! Ho! Ho! Western Civ Has Got to Go!!
Hey! Hey! Ho! Ho! Western Civ Has Got to Go!
No Justice No Peace!
No Justice No Peace!
Savio! Savio!! Savio!!!”

The roar of thousands in the plaza below penetrated the walls and windows of the Chancellor’s suite with passionate intensity.

The University authorities had determined, even though they thought the summer alert to be needlessly alarmist, to damp down any such protests before they could gain traction. To limit the areas of potential protest the Berkeley administration activated rules prohibiting on-campus solicitation of money or support for off-campus political purposes. The rules applied only on campus. On the public sidewalks or streets bordering the campus such protests would be free speech protected by the First Amendment to the United States Constitution. The campus radicals set up tables for distribution of literature and solicitation of money and membership on public sidewalks at the very borderline between the city of Berkeley and University property. To the surprise of no one, except perhaps campus officials, those tables and their associated activities gravitated onto University property. Campus authorities cited and disciplined students attending the tables for violation of the rules.

Prodded by the leaders of various protest groups, clusters of angry students began to form, accusing the University of violating their rights of free expression. As charges were repeatedly brought against student violators, leaders of the dissident groups began holding protest rallies in Sproul Plaza just outside the front doors of Sproul Hall. New accusations against alleged University abuses were shouted out almost daily. Growing larger by the week these rallies attracted increasing support for the protest movement.

“Hey! Hey! Ho! Ho! Western Civ Has Got to Go!!
Hey! Hey! Ho! Ho! Western Civ Has Got to Go!
No Justice No Peace!
No Justice No Peace!
Savio! Savio!! Savio!!!”
The roar of the crowd grew more ominous with each repetition.

The rebels adopted the bold and deceptive tactics typical of insurgent groups, including delay, exaggerated grievances, false claims of abuse, and impossible demands in order to inflame student reaction at the ever-larger noon rallies. What the rebels needed was a motif, a motto, a battle cry that would bring the various campus movements such as nudism, socialism, opposition to the Vietnam War, drugs, women’s rights, presidential politics, or free speech under one banner of unification. The growing rallies claimed ever more insistently that the students being disciplined were denied free speech. And free speech it was that became the catalytic issue around which to consolidate the various protest movements into one central organization. As the protests grew in number and emotional appeal, a coherent leadership emerged led by Berkeley philosophy major Mario Savio. The now legendary Free Speech Movement, the FSM, was born. The attack against the Berkeley administration intensified.

On December 3, 1964, where there once had been hundreds at the noon rallies on the plaza below the Chancellor’s suite, there were now thousands. As the noon hour struck, the crowd (those above called it a mob) was immense, radiating anger, yearning to be led, eager for action against what they saw as an unjust and repressive University administration. Few would have predicted how far the events of that day, December 3, 1964, and the following day, December 4, 1964, would advance the rebels’ attack on the University. Even fewer, if any, would have predicted the inferno that this demonstration, and others to follow across the country, would set ablaze.

In the Chancellor’s suite, in addition to the campus administrative hierarchy, there were the Chief of the Campus Police, representatives of the Alameda County Sheriff’s department, and officers of the California Highway Patrol. The University had already experienced an embarrassing mob incident involving a captive police car two months earlier. Those aware of the danger did not wish to see a repetition of that event.

On October 1, 1964 police drove a squad car onto Sproul Plaza to arrest a member of the FSM for violation of the speech rules. A crowd of several hundred gathered around as police put the suspect into the car. Someone shouted “Sit down!” The crowd sat, and the police car could not move. The arrested student and the officers inside the car were held there while the campus administration struggled to manage the situation. There were discussions, charges and counter-charges, proposals and counter-proposals between campus officials and FSM leaders.

Law enforcement personnel, as well as a few faculty members, urged Chancellor Strong to take decisive action to break up the crowd and release the car, using force if necessary. The law enforcement officials cautioned the Chancellor that anything less would allow the rebellion to gather strength. Far more aggressive, perhaps violent action would be the result. Chancellor Strong, a former professor of philosophy, responded that he wished to take the longer more considered view. A candid dialogue, an honest exchange of views, he felt, would allow the students to state clearly what they really wanted. Their grievances could, he was convinced, be resolved in mutual consultation, and peace would return to the campus. The University sought a compromise over the police car incident. The FSM demanded that the trapped student be released and freed of all charges.

Talks continued as lack of resolve by University officials drained the hours away. The captive student and his police captors sat in the car for 32 hours. Finally a “settlement agreement” was reached and the FSM leaders announced its terms to the crowd sitting around the car. Their message was also directed to the much larger number assembled to see the spectacle, and to an expectant media, by then always on hand at Berkeley.

The University had capitulated.

The student in the car was absolved of all charges and released to the victorious roar of hundreds of ecstatic rebels and increasingly sympathetic onlookers as the police car made its ignominious escape. Though the police car incident had occurred by chance, the FSM had been handed a perfect recruiting issue that was instantly and expertly exploited by the leadership. Throughout the months of October and November the rallies grew larger and more intense. Anger and outrage against the University was intensified by clever oratory, false charges, and increasing passion. Mario Savio solidified his leadership as a glib and charismatic orator. The rebels worked to gain off campus support as well. That proved not difficult since the noon rallies always appeared on the evening news, more frequently than not with a slant favoring the protesting students. Fortified by widespread support, not only from the students, but also from many on the faculty as well, and assured of sympathetic media coverage, the FSM prepared for a major confrontation.

“Hey! Hey! Ho! Ho! Western Civ Has Got to Go!!
Hey! Hey! Ho! Ho! Western Civ Has Got to Go!
No Justice No Peace!
No Justice No Peace!
Savio! Savio!! Savio!!!”

Chanted over and over by a chorus of thousands the litany had about it a hypnotic effect that insinuated itself even into the atmosphere of the Chancellor’s conference.

The officials gathered there had urged the Chancellor to take decisive action before the thousands packed into the plaza got out of control. The campus Police Chief advised the Chancellor that units of the County Sheriff’s riot control squad and officers of the Highway Patrol had been put on standby and were awaiting his orders.

The Chancellor smiled and re-lit his pipe. No orders were given.

As the crowd multiplied and tension tightened, the Chancellor explained the situation to those around him so eager to use force against the students. Despite the noise and bluster, the uprising was nothing more than the usual generational rebellion against authority, as President Kerr had repeatedly assured them. The Chancellor was reminded that those in the Plaza were the same people he had tried to reason with in the police car incident back in October. Pressed about his defeat in that confrontation, the Chancellor retained his philosophic approach. Nor did it disturb him when it was pointed out that the hundreds of the October incident had become thousands, angrier by the day.

“No Justice, No Peace!
No Justice, No Peace!
Free speech!
Free Speech!
Free speech!
Savio!! Savio!! Savio!!”

A speaker on the steps of the building, exciting the crowd to ever-higher ecstasy, extended his arms as though to embrace the thousands waiting in explosive anticipation. A dozen campus police guarded the doors to Sproul Hall behind the speaker.

“And now, comrades in Justice, brothers in peace, the time has come, the message you’re waiting for, our leader, our… ”

“Savio! Savio! Savio!
Justice! Justice!
Free speech!
Free speech!
Free speech!
Savio!! Savio!! Savio!!!”
The crowd clapped, roared, stomped their feet, and raised clenched fists toward the upper story of the building.

The Chancellor was advised once again that riot police were at standby stations awaiting his orders. He persisted in his belief that by patient dialogue with the students he could ascertain what it was they really wanted, and that a resolution based on honest exchange of views could be reached. He proposed that a delegation be appointed to consult with the FSM leadership.

No orders were issued. On the plaza below, Mario Savio walked slowly through the crowd as it parted like the Red Sea for Moses to allow him to pass. He walked with studied confidence up the few steps to the podium. His back to the crowd, Savio stared at the upper windows of Sproul Hall for several moments. He slowly extended his right arm, closing his fist in calm defiance. The crowd roared. As Savio lowered his arm and turned to face his audience, the assembled thousands became eerily silent. Their leader looked over a sea of glistening eyes, as though to catch each one eye-to-eye, to assure his command, and to prepare them for what he was about to say. The crowd, its deafening ecstasy transposed into reverential adulation and awe, stood in utter silence. As Savio began his speech he waved his left arm back toward the administration building behind him.

“We have an autocracy which runs this university, that manages it like a business firm.” He let the odious image sink in for a few moments. “The Board of Regents is its board of directors, President Kerr is its Chief Executive, and the Chancellors of the nine campuses are his managers. Now I’ll tell you something. The faculty are a bunch of employees and we’re the raw material; we’re here to be turned into whatever products the corporations and their rotten system say they want. But we are a bunch of raw materials that don’t mean to be made into any product, and we don’t mean to end up being bought by some clients of the University, be they the government, be they industry, be they organized labor, be they anyone! Because we are human beings!!”

Wild applause, shouts, and slogans. After a few moments one hand, palm toward the crowd, quiets them.

“There is a time when the operation of the machine becomes so odious, makes you so sick at heart, that you can’t take part; you can’t even passively take part, and you’ve got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you’ve got to make it stop. And you’ve got to indicate to the people who run it, to the people who own it, that unless you’re free, the machine will be prevented from working at all!”

Prolonged shouts and growls of approval. Savio holds up both hands for silence. The crowd, intoxicated though it is, obeys.

“We have a plan. It is to be massive, and it is to be peaceful. Remember that. No violence. Don’t allow yourselves to be provoked. That would play right into their hands. Now, no more talking. We’re going to march in singing ‘We Shall Overcome.’ Slowly; there are a lot of us. That way. Into the building and up the stairs.”

Singing “We Shall Overcome” and Bob Dylan’s “The Times They Are a-Changin’,” some eight hundred to a thousand protesters marched up the steps of Sproul Hall, gently forcing the few guards aside, and began to fill the building.

Those in the Chancellor’s conference room, advised that the mob was breaking in and coming up the main staircase, were ushered down a back way and taken by police vehicles out of harm’s reach. The University informed the few employees who remained that the building was closed and they should go home. The FSM leaders designated various areas for specific activities: one for movies; another for a Spanish class; one for quiet study; and an area for square dancing.

State law enforcement officers advised California Governor Pat Brown, by law a member of the University Board of Regents, of the situation. The Governor authorized police action to clear the building. But the action was not to begin until after dark to minimize public or campus reaction.

At around midnight, as law enforcement contingents moved toward the campus, the Chancellor appeared with a bullhorn outside Sproul Hall. He looked up at the bright lights in his suite, and raised the bullhorn to speak. He urged the students to evacuate the captured building, to be reasonable, to speak frankly with him about their grievances. He waited. It was a last, desolate attempt to breast the tide of his rapidly vanishing authority. There was no response from inside the building. He spoke again of honest dialogue and peace on the campus. Again there was no response. The Chancellor looked up once more at the bright lights of the building. This man a quirk of fate had named “Strong” dropped the bullhorn to his side and turned back toward his residence.

Chancellor Strong did not understand that what he faced was a revolution; and that revolutions do not play by the rules. Their very purpose is to break the rules and impose new ones.

At approximately 2:00 a.m. some six hundred California Highway Patrolmen and Alameda County Sheriff’s Deputies cordoned off the building and began to arrest the protesters. The FSM leaders advised their followers to resist by going limp so as to make the arrests more difficult, and more likely to lead to minor injuries that could be exploited later. Each protester was identified, booked for trespassing, and taken away in a patrol wagon. Those who went limp were charged with resisting arrest as well. The resistance was designed principally to draw out the process until the next day when students and faculty arriving for class, as well as the ever-present media, would see what was happening.

They did see. Students and faculty alike were shocked and angered at the spectacle of massed law enforcement officers in riot gear arresting hundreds of students, dragging off by their heels some who went limp to lengthen the proceedings. It was a tedious process not completed until mid-afternoon on Friday December 4, 1964. A total of approximately 800 were charged and incarcerated, the largest mass arrest ever in the State of California.

The events of December 3 and December 4, 1964, achieved the immediate goal of the FSM, the plan about which the alert to President Kerr had warned. Chancellor Strong was forced to take indefinite “sick leave” early in the new year of 1965. Not long thereafter the Board of Regents terminated the services of President Clark Kerr as well, who mused, “I leave as I arrived, fired with enthusiasm.”

Having seen the FSM uprising as a standard juvenile protest against adult authority, the by then ex-President Kerr, upon reflection, later termed it a “protest and outrage” that was “fresh and meaningful.” He confessed that its intensity “took us completely by surprise.” The same can be said of college and university administrators across the land, as blind to the power and import of subsequent revolts on some 300 other campuses as those at Berkeley had been. They did not believe that the events unfolding before their eyes and pounding into their ears were of great significance, and had no effective response. They had no tools, no concepts, and no resolve to deal with what was happening. A few observers did see that the nationwide uprising had within it the seeds as well as the words of a true revolution. Their warning was futile against the inertia of bewildered and benumbed authorities.

The speech of Mario Savio is a composite of several versions as recorded at the time by members of the Free Speech Movement and news organizations. The essence of the speech, though taken by some as a demand for redress of legitimate grievances, is a radical manifesto. Savio denies that a great public university has any obligation, or even any right, to prepare students to participate in the democratic structure and the free economy of their society. To do so would be selling themselves as “products” of a corrupt educational system to “consumers” of a corrupt economic system. So the solution is to “put your bodies” on the gears and wheels and levers of the “apparatus” and “make it stop.”

Savio’s speech, full of hubris and puffed-up importance, transcending reason and common sense, even silly, seemed to be much as President Kerr had expected. But Savio’s speech and hundreds like it across the land were Marxist-based exhortations to destroy the politics, culture, and values of American society, beginning in the colleges and universities. Savio’s speech was a declaration of war against the entire American nation until the whole “apparatus” is made to “stop.” The radicals of Berkeley and the rest of academia, with similar results in the years that followed, believed what the speech said, and went forth into the nation to make it so.

How did the FSM capture the thousands who seemed to follow Savio’s radical purpose? Did they understand, in the rapturous flow of his soaring rhetoric, what he was saying? Did they agree? Who were the masses that supported the rebels, and what did they “really” want?

One morning on her way to campus during the height of the uprising a visiting professor of psychology gave a ride to a hitchhiking female student. The girl carried a placard on a stick that read, “Strike for What You Believe.” The professor, puzzled by the intensity of the movement and curious as to its motivation, asked the student what the uprising was all about. What did she believe that she was striking for? The student talked with animation about the FSM, oppression, free speech, and the alleged brutality of the “pigs” (their term for police) who were called from time to time to preserve order on campus.

Arriving at the place where the student wanted to be let off, the professor stopped the car and turned to look directly into the student’s eyes. She asked what her deepest motivation was, why such intensity, why the boiling hatred in the speeches, what did she truly want? The girl’s eyes blazed with an even brighter passion and she replied without hesitation: “Freedom!” The professor thought for a moment and, as the student opened the car door to get out, asked if it had occurred to her that she already had so much freedom she didn’t know what to do with it. A face that had been flushed and exuberant turned to chalk. As she got out of the car the student managed a barely muttered “Thank you,” picked up her sign and backpack and fled. The visiting psychologist paused to observe the young woman as she ran off to her rendezvous with reassurance.

The Berkeley riots were surely in some part a generational rebellion against adult authority, as President Kerr had analyzed it. But it was a rebellion far deeper than that; the rebels themselves perhaps only half conscious of its reach and destiny. It was a rebellion not only against parental authority, but also at its base a repudiation of the civilization for which their parents had fought World War II against fascist tyranny. The heroes of that “Greatest Generation” had given their children lives that were safe, free, and comfortable. But for the children, so dull and boring. They were privileged, idealistic, and restless. Their bountiful lives, together with their youthful innocence, had released them from reality. They were free to fantasize, to find fault with the workaday world, to yearn for a life more meaningful than being “sold” as “products” of the university to the highest “corporate bidder.”

In the daily rallies following their success in deposing the top authority, both at the Berkeley campus and University wide, the FSM grasped for some coherent idea of how to keep the movement alive and to define its mission. The noon tirades in Sproul Plaza continued, laced with obscenities to spice up a fading cause, and broadcast over loudspeakers that shook half of Berkeley. Free Speech was the only slogan that had seemed to stick. But since there was no longer any responsive censorship, limitation, or suppression of speech by benumbed University authorities the substance of that cry wore thin. There continued to radiate among these masses of juveniles far more passion than articulation.

By the spring of 1965 the evangelists of the FSM were as desperate for fuel to fire the energy necessary to their continued success as the girl hitchhiker had been. The prophets of revolution had to find for their apostles something with more hot blood in it, a core belief to match the passion of their motivation. Their search enticed them in all directions: opposition to the Vietnam War, feminism, drugs, the idealism of the Peace Corps, nudism, socialism, sex, women’s rights, and a growing adoration of almost any society other than their own. Everything they had tried and rejected the previous fall. They needed to find a battle cry that was permanent, spontaneous, and visceral. Ultimately, as much by instinct as by calculation, they did. They hit the hot button of an enticement whose magic they had been seeking, the chord of an impulse that touched to the core of their passionate longing for freedom.

Revelation

What at last twitched the nerve of visceral motivation needed to keep their rebellion hot and contagious turned out to be freedom after all. Sexual freedom. Not that there had been any shortage of sex passed around on the campuses of the rebellious sixties. And much of the nation, softened by rock “n” roll and the drug culture, was already drifting idly toward careless sexual indulgence. But the rebels struck deeper than that. Casual sex and indolent promiscuity were only the opening wedges of their revolutionary instinct about sex. Sexual freedom was to be of any and every variety, totally uninhibited. But the key was not merely in the practice of sex.

Sexual abandon must be recognized, accepted, and enforced as the new sexual normality. Sex, raw and flagrant, was to be flaunted and thrown into the face of a dying culture. It just took a while for indulgence to coalesce into cause. That cause, when finally articulated, went straight to the libido. It required no explanation, no oratory, no persuasion.

Wow! Everything the young rebels had always wanted and had always been told was wrong. Right and wrong—how tedious such concepts were to those unsettled young. How boring. How repressive! And what a great recruiting tool for the incipient revolutionaries.

In settling on free sex as the hot button and driving force of their rebellion the rebels may have struck even more deeply into the culture they wished to annihilate than they realized. The act of family formation and procreation is the essential link that forms the chain of civilization.

If that ceremony of faith could be contaminated, its vows profaned and rejected by careless debauchery, perhaps that indulgence would metastasize into a general infection of the entire culture. It would affect religion and the churches, morality, how people value each other, children and their education, love, and much else. The Free Speech Movement turned out to have been the Free Sex Movement in embryo.

The consummation of the sexual revolution, its American Bastille, came soon enough. Woodstock. Ah, Woodstock. 1969. A celebration of “peace, love, and music” as advertised at the time, and as many still believe? Not quite. It was rain, mud, rock music, drugs, and a mass orgy of indiscriminate sex to rival the most licentious sexual celebrations of ancient legend, even to the point of gang rape. It was also an orgy of destruction of property, of trampling neighboring farmers’ fields, and of storming the ticket booths without paying to get in. Celebration of the 30th anniversary of the event, also advertised as a festival of “peace, love, and music,” was reportedly more juvenile and irresponsible in every way than the original had been. More “adult” in the corrupted sense the word has been given to cover pornographic entertainment.

The ancient orgies in the name of Dionysus, god of passion, were confined to festivals of no more than a few days a year. The sexual revolution of the sixties, restraint abandoned to indulgence, wild and primitive, repudiated law, morals, and civility. Its effects were destined to strike through an entire society, to contaminate the core of its nurturing institutions. The sexual revolution spawned in the outbreaks of the sixties validated Mario Savio’s seemingly hubristic and silly Declaration of Civil War. The battles in which we remain deeply engaged began in earnest.

2. The Battle Plan
Question Authority

That the rebellious upstarts of this incipient Civil War must “Question Authority” with the intent to destroy authority has been bumper-sticker dogma from the beginning. Following the Berkeley uprising some 300 American college and university campuses across the country experienced similar rebellions. At Cornell University in 1969 armed “student” thugs demanded an independent black studies program, and the University administration gave in to them. The rebels’ tactics there made it clear that the new “studies” would be much more radical activist than anything that could be called academic. Cornell was a harbinger of the new “authority” to come. At Cornell, as at Berkeley, authority when questioned vanished. There was no longer any authority to question. The destruction of authority on campus after campus excited, in addition to their new sexual exploits, a lust for yet more destruction that drove these revolutionary students passionately onward.

Destruction of campus authority became a model for attacks against greater institutions of authority across the social and cultural landscape as the rebels swarmed out of the universities into the general society.

The civilizing concepts upon which democratic society depends must be eliminated. The sustaining faith, practices, and institutions of the existing society, effective in improving conditions of life, must be mocked as ridiculous, ridiculed as outmoded, damned as repressive, and hated. Horrors supposedly perpetrated by the offending culture must be magnified, or if need be invented, to degrade loyalty to the social structure. America must be condemned as a privileged nation oppressing those less well off. Under the guise of such mantras as a new world order and multiculturalism, praising other cultures of the world became for the rebels a useful adjunct to damning their own.

But these ideas are hardly new. Writing in the early nineteenth century, that remarkably perceptive French visitor to America, Alexis de Tocqueville, expresses apprehension regarding the American vision of equality. He fears that holding each individual to be of equal dignity and worth leads to a rejection of traditional morals and ethics. Americans, he says, are prone to rely on “their own judgment as the most apparent and accessible test of truth.” If all are equal who is to say what is right or wrong, moral or immoral? A population brought to that basis of morality is more easily manipulated than are people whose values are held to be eternal codes of human conduct based on transcendent authority.

Adam B. Seligman in his book Modernity’s Wager echoes these insightful, almost “post-modern” remarks of Tocqueville. The Boston University professor of religion examines a widespread disbelief in traditional religious faith, and in particular in its behavioral morality. He finds this to be consistent with the American belief in individual equality based on independent judgment that Tocqueville detects. Political scientist, editor, and author Damon Linker observes that many Americans have concluded they will be better off by treating belief in transcendent authority as a “useless superstition.” The conditions these observers reveal help to explain how easy it has been for the missionaries of passionate revolution to do their work. Moral and social values that had still managed to hold despite Tocqueville’s prophetic observations came to be widely discredited as rebellious waves of the sixties surged through large segments of the American population. Once such degeneration begins the institutions that support those values, and are supported by them, are at risk as well.

The sixteenth-century French thinker Michel de Montaigne recognizes that “a man needs at least some degree of intelligence to be able to notice that he does not know.” The revolutionaries of the 1960s did not want to know they did not know where they were headed. Their orgies of sex, drugs, and destruction were conceived as a quest for freedom. The sixties revolution became so free that it pulled a mist of denial over the need to consider its destiny. It indulged a passion that would question and destroy authority without knowing what was to replace it.

The rebels often aligned themselves with genuine reform movements such as civil rights, equal rights for women, or anti-discrimination movements, but only as useful tools if they could be manipulated or taken over. The motivating animus of the nascent Civil War formed early and remained clear and direct. That was to question and subvert the religious beliefs, the history, the founding principles, and the entire culture of the United States of America until the authority of those ideas and institutions could be co-opted or destroyed.

Internal Combustion

The Free Speech Movement, reincarnated as the Free Sex Movement, demonstrated soon enough that there is more fire and durability in the uninhibited sex of animal lust than there is in free speech or any of the other causes of that era. Free sex, once it became a cause as well as an indulgence, remained an enduring motivation of the Civil War. Adopted on a national scale in the rutting mud of Woodstock, free sex led the charge against the culture and structure of American society. Unlike the coerced revolutions in Nazi Germany or Communist Russia, the motor power of the sexual revolution of Berkeley and Woodstock had no need for coercion. Neither the brutal tactics of a KGB nor the jackboots of a Gestapo was required to gather disciples and guarantee their support. Converts to the new sexual morality at the base of the revolution came fervidly, their devotion often laced with LSD or pot.

To lead the proliferating forces of the Civil War there arose an elite corps of highly educated Civil Warriors. This cadre was formed from the mass-produced intellectual classes being ejected from American colleges and universities by the late sixties. These classes were driven by a passionate disbelief in the moral and intellectual benchmarks of the traditional culture. And they were prepared to apply the enormous energy of their uprising on a scale to match its sensual appeal. This intellectual rebel corps steadily became entrenched in tenured faculty positions in the colleges and universities, in television, print news, entertainment, and similar fields. They established in their newly fortified positions of power the motive, and now the institutional implements, to launch a nationwide campaign of disbelief in America. In his book The Long March Roger Kimball, managing editor of The New Criterion, considers, as the subtitle promises, “How the Cultural Revolution of the 1960s Changed America.”

Kimball compares the sexual revolution to sexual orgies in the name of the ancient Greek god Dionysus. He finds that the sixties “prophets of Dionysian excess” tapped the universal clamor of youth against authority to a depth that changed human behavior. The leaders of this revolution of youth perceived that many not so young were also eager to cap their moral growth at the juvenile level. They, too, welcomed the promised release from thought and discipline. Even Hillary Clinton, once a presidential hopeful who had to settle for Secretary of State, has proposed a million-dollar memorial to Woodstock as her contribution to that transformational event.

What better revolutionary device to hurl across millions of TV screens than the primal energies of sexual appetite, loosed and uninhibited? It became apparent that not only sexual barriers, but the entire structure of the American heritage was also susceptible to infiltration and subversion. As the revolution bore in, its lust of sexual energy was compounded by lust for the power to prevail and to rule. Authority was not only being questioned, but was cracking and crumbling in giant chunks. But as Seligman points out in his book Modernity’s Wager, authority and the need for authority “are irrevocable aspects of the self and the human condition.” The need is for belief, purpose, and a guiding star to shine toward the future, even if the beliefs adopted are not called religion.

Seek and Ye Shall Find

Guilt is a convenient incubator of the hate essential to maintain an assault against the core values of a society under attack. Guilt for every misdeed, real or imagined, ever committed in the name of America and the West must be piled high to fuel the flames of hatred and justify wanton destruction. If slavery was evil, paint every white man and woman today with the brush of their ancestors’ guilt. If minorities believe they have not been offered their free shot at the American “pursuit of happiness,” set them aside in multicultural holding pens and nurse their grievances to keep the pot boiling. Guilt for the country’s horrors and misdeeds, real or fabricated, and hatred of the self for having been part of that illegitimate enterprise, must burn bright and hot in the heart of rebels on the rise. Accepting guilt for a guilty past adds a patina of self-righteousness to the drive to destroy the authority and institutions that represent the guilty past.

To assure ascendancy over a guilt-laden past a new creed is needed to replace the old. A new normality must be forged and made to appear legitimate and acceptable. Guilt and hate must first be leveraged to induce a vacuum of faith, a gnawing emptiness craving to be filled with new belief. Fear is also useful to feed the flames of hate. A gnawing fear that the defeated culture might rise once again to reclaim its inheritance justifies any measure taken toward its extinction. As one commentator has said, whatever it is these people disbelieve, they disbelieve in it passionately. So devout is their hatred and disbelief in existing authority, so poignant their need, and so ardent their devotion to revolutionary upheaval that masses of revolutionaries are not aware that their disbelief has gradually become institutionalized. Seeking to destroy authority, they have given birth to new authority. Their rejection of authority, deep, profound, and abiding, has gradually coalesced into new forms of faith and belief.

Two Commandments

The new order, though it claims no anointed priests, no philosopher kings, no red-hatted Cardinals, and no reliance on celestial guidance, has nevertheless called forth new authority to backfill its chasm of revolutionary rejection. To soften the truth of what is happening the new authority preaches toleration and compassion to present as its public countenance. Since the new order has abolished morality based on a transcendent order of God or natural law, who then can claim the right to admonish others, to condemn, to approve, or to judge? So it is that from some cloistered enclave of command and authority, with no identifiable point of origin, seal of authenticity, or index of official sanction, but imperative all the same, the First Commandment of a new order is delivered unto us: Be Not Judgmental.

Under the gentle indulgence of the First Commandment acts of one’s fellow creatures are not to be judged. To speak of good or evil, of right or wrong, would be “judgmental.” Yet imperatives have appeared. Some speech is allowed on university campuses, and some is not. Some groups are preferred over others. Designated opinions, organizations, and practices are approved, while others are not. Canons of behavior emerge in profusion, and are ignored at the peril—political, social, and sometimes physical—of those who transgress. None of this is based on a judgment of right or wrong in the old sense, of course. But even so guidance is required to replace the defeated moral strictures. Fortunately, guidance has come forth. Emanating from the same shrouded and ineluctable source as the First Commandment, Be Not Judgmental, there has been revealed a Second Commandment: Be Politically Correct.

That’s it. Ten Commandments boiled down to two.

With no modern-day Moses descended from spectral heights to deliver them, it is difficult to say exactly how and when the Two Commandments became authoritative. It would appear that as the basis of a new secular religion, they were formed by accretion rather than by any singular act of revelation. To adhere to this new authority became known as being politically correct. The adoption of that term of enforcement for the Two Commandments is equally shrouded in mystery. There is, however, scattered evidence concerning the origin of the term itself, whether the form be politically correct, political correctness, or political correctitude. Some trace the phrase back to the founder of Chinese communism, Mao Tse-tung. The term seems also to have been popular in most Marxist societies in one form or another to enforce correct speech and thought. Use of the term has rarely, if ever, been associated with an open democratic society.

Thus the wording and usage of the two Commandments and that of their enforcing mechanism, political correctness, simply grew spontaneously out of the lifestyle, ideals, and turmoil of the sixties. The phrases gathered authority incrementally as the Civil War grew more intense and the need to enforce compliance to its edicts became more pressing. Finally, one day the existence and application of the Two Commandments and their enforcement mechanism seemed always to have been part of the new culture.

The “protest and outrage” of the FSM at Berkeley that President Clark Kerr perceived after the fact to have been “fresh and meaningful” was even then taking shape and defining its meaning. The declaration of Civil War uttered by Mario Savio might have been laughed off as silly juvenile blather, which to a large extent it was. But the time had been right. Seeds of a new secular religion had been planted. Its guiding Commandments had been conceived in the tumult of revolution, and nurtured in a deepening disconnect between the old morality and the new. Their battle plan was ready. The masters and planners in the vanguard, who urged their disciples to “question authority,” knew that once they got their hands on the levers of power and became authority, the question period was over. That would come later. In the meantime there were many battles to be fought on many fronts.

II. War of Attrition
3. Silencing the Self
The Self

From the injunction of the Greek philosopher Socrates to “Know Thyself” to the Christian principle that all souls are equally precious the worth of each individual person, each self, has been central to the Western structure. Under the American constitutional system each individual is endowed with the right to make the best of his talents and capacities. And each must recognize that right in others, to be exercised by all with as little interference from government as possible. In return the individual is to accept certain obligations toward the state that supports his individual selfhood. That is the kind of individual self, and the resulting type of society composed of such selves, that the rebels of the Civil War must destroy if they wish to prevail.

The Civil War rebels aim for quite a different plan of selfhood in the population of the nation they wish to master. Their pattern for the American people is more that of Narcissus of Greek mythology, who fell in love with the image of a beautiful youth he saw in a mountain pool. Unable to possess the idyllic vision, and not realizing that it was his own reflection, Narcissus pined away and was turned into a flower. The self dissolved into itself is the image the rebels wish for their enemies, the American people. The hero turned anti-hero, immersed within himself, docile and easy to manipulate, is the vision of the present American revolution and its mutation into Civil War. From the image of self-love there is seldom reflected the obligation of social responsibility. The thinking, self-sufficient self must be destroyed. The dissenter, the draft dodger, the anarchist, the wasting young female “celebrity,” the failure, and the helpless “victim” become the new icons, the new models of attention and concern.

The requirement that the free individual find for himself the point of balance between Narcissus, dying over love of his own reflection, and his obligation toward the state must be repudiated and destroyed. It is that balance that makes a coherent society possible. The free individual has the right not only to search for his own true self but also to chart his own destiny. He is also bound by the constraint not to transcend that point of balance. A developing self resists both the temptation of thoughtless submersion into a false inner self and the surrender of the self into some greater mass or cause as a substitute. It is the acceptance of that balance that makes the free yet participating individual the indispensable creation of Western and American culture. That sort of individual is an entity that does not automatically regenerate itself. Its nurture requires careful attention to achieve the necessary measure of both freedom and social integration.

There are developments in American culture that underpin the aims of the Civil War, even when their advocates do not necessarily conceive that as their purpose. Abraham Maslow, an American psychologist of the early to mid-twentieth century, was a prolific writer, a professor of psychology, and chairman of his department at Brandeis University. Maslow advertises a “humanistic psychology” designed to induce “self-actualization.” Each individual is to be encouraged to get in touch with his or her inner feelings, and to express those feelings openly, even aggressively. Some suggest that this is something akin to training a puppy to relieve himself. In either case training is quite unnecessary. Such children as Maslow would set free to “be themselves” used to be referred to as uncivilized. They are not to develop, only to prattle about the child they find inside. They will not be taught how to live responsibly in a democratic society.

Carl Rogers was also a prolific American writer along these lines at about the same time as Maslow, and a professor of psychology at Ohio State University and the Universities of Chicago and Wisconsin. Rogers advocates openness surging from a free-flowing contact with the emotions as does Maslow. Rogers asserts that morality, and even the nature of reality, is entirely the choice of the individual. That the individual might make bad choices does not concern those who promote such ideas, since they consider the inner individual to be genuine and authentic. Hence there is no call in these theories to consider whether either the inner self or the result of this psychology is good or bad, either for the individual or for society. “You are what you think you are.” That was the caption of an advertising poster of a cat looking at a mirror. The mirror image was a lion. What predators the “lion” might meet in the real world is another matter.

Other psychologists take a different view. Dr. Ronald W. Dworkin is a practicing physician and a senior fellow at the Hudson Institute. He doubts that an individual expanding on his or her personal thoughts and emotions can reach much of an understanding of the universe. Dworkin finds that by the end of the twentieth century the “bold and expansive ideas” of such as Rogers and Maslow, and the effort expended on searching for the higher self as an inner self, have resulted only in a “kind of drowsy selfishness.” For some, as Dworkin sees it, the result has been “a total loss of how to situate oneself in the world.”

The self, wrapped up in its own self-indulgence, is not the kind of selfhood that is contemplated in the structure of American society. The American self is not a selfish self, but a responsible self with a sense of moral obligation both to himself and to his fellow citizens. That sort of self, necessary to underpin a free society, is being enticed to lose itself in self-indulgence. Peter Augustine Lawler, Dana Professor of Government at Berry College, has examined moral discourse among ordinary Americans today. He finds that “to an amazing and unprecedented extent” morality amounts to not much more than concern about “one’s own emotional wellbeing.” Popular concepts such as “self-actualization” and “free flowing emotion” leave human creatures missing that part of the self that was once the core of being human. They have lost connection to an integrated society and to a metaphysical moral world. They are creatures in human form emptied of human content. Boston College political science professor Alan Wolfe has surveyed cultural attitudes and beliefs in various American cities and regions. He finds that ordinary Americans are willing to adopt “a version of moral laissez faire” that is in reality “an excuse for not taking others seriously.”

Authenticity and Self-Esteem

Christina Hoff Sommers, resident scholar at the American Enterprise Institute, and Dr. Sally Satel, a practicing psychiatrist and Yale University School of Medicine lecturer, examine these matters in their book One Nation Under Therapy: How the Helping Culture Is Eroding Self-Reliance. Daniel N. Robinson, distinguished professor emeritus at Georgetown University, finds the Sommers-Satel book revealing in its examination of the decline of the stoic, self-reliant individual who characterized America from around 1600 to sometime after World War II. People in a personal crisis today are “helped” so much by swarms of counselors and other well-wishers that the individual is often afforded little chance of learning to stand on his own. He fails to form the strength to confront what the American essayist and poet Ralph Waldo Emerson calls “the rugged battle of fate.”

In therapy sessions individuals in trouble are encouraged to reveal their true selves as a means for coming to terms with their difficulties. But Satel and Sommers find that if the individual is pressed to expose and deal with his real self, the real self is not what will be revealed. What will be asserted as the self is only a manufactured performance for public display that the troubled self hopes to pass off as its real self. Robinson summarizes the Sommers-Satel book as showing that what we must convey to these sufferers is not how to improve their performance but “the cost of living life as if it were a performance.”
Evaluation of the “performance” that is presented as the real self is often centered on the “authenticity” of what is being expressed. The theory is that if the performance presented is “authentically” that of the presenter, that is all that is required. The reward is something called “self-esteem.” The director of an Oakland, California, singing contest reports on the effect when a contestant relies on “authenticity” to support his or her performance, given to achieve recognition and self-esteem. Many contestants were so bad they could not carry a tune. Their lyrics were fractured and senseless. As they sat waiting their turn to audition it did not occur to these contestants to compare what they could do to the best they were hearing. There is no “best” (or worst) when authenticity and self-esteem are the controlling values.

These contestants, unprepared and ungifted, were infuriated when told of their elimination, and truly did not understand why that should be. Their singing and composing were, they insisted, “authentically” theirs. That, they believed, was enough, and they should be recognized. Such people have become disconnected from reality in their cocoon of self-esteem where merit, comparative ability, and excellence have been suffocated for lack of outside air. They have been comforted by the efforts of therapists who caress their tiny egos with compassion and empathy. Having been taught to submerge deeply into themselves, they feel shock and disbelief when reality slaps them in the face. The pride of true achievement has never been shown to them. They know only self-esteem, an alluring inhibition to achievement.

Jeffrey Hart, a senior editor of National Review, examines the search for authenticity. Indulging in what he calls “some Freud-speak,” Hart asserts that authenticity is a child of the id, the unconscious part of the human psyche, or mental process, that acts instinctively to generate psychic energy from long buried sources. The ego in Freudian psychology manages conscious thought, and acts as a moderator between the inner person and outside reality. The super-ego internalizes the morals and values of society, creating a conscience that may induce a sense of guilt, leading to a desire for more adequate connection with outside reality. The super-ego shapes a self by exposure to the best works of great men, that is, to the products of civilization. The id generates an indiscriminate appetite for scintillation, a drive to “go with the flow,” a craving to feed passions generated deep below the surface of civil discourse.

A person following his id according to his need to generate self-esteem may be praised for the purity of his innocence. This provides a liberating rejection of the imperatives of the super ego imposed by a supposedly repressive society. Jacques Barzun, a French born American scholar, teacher, and prolific writer on cultural matters, comments that, “We praise innocence because we want the license to behave like an infant.” Unfortunately, as Boston University professor A. D. Aeschliman notes, such unreal views of human nature tend to harden into “intellectual attitudes and institutional forms” that come to dominate the culture and the schools. Which, he might have added, tends to generate a nation of juveniles. Such schools of thought might be compared to the “performance” of the juvenile Oakland contestants who invent their often weird personal performances to compensate for lack of a substance they do not understand.

Freedom fully and squarely faced can be frightening, as the girl hitchhiker on the way to an FSM rally in Berkeley some four and a half decades ago found in a brief knot of visceral reality. Freedom is a burden that requires discipline if it is to be maintained. But that lesson is easy to forget when the call to indulgence, rebellion, self-actualization, life without limits, freedom and more freedom, all accompanied by a ground bass of carnal stimulation, reverberates from the mountaintops. The nascent individual who might seek to discover true individuality has fewer points of reference for guidance within his own embattled society.

An inflated self-esteem is about how you feel. If you feel good, and what you do is “authentically” yours, that is all that matters. Little room is left for the formation of substantive goals or for the pride and self-confidence of true achievement. The concept of reward based on merit is lost. Soon the very words denoting substance in achievement begin to disappear, as the word “character” has nearly disappeared already, along with “morals,” “judgment,” and “integrity.” These and similar words once helped to define conduct that is civilized, honorable, and necessary if a democratic society is to endure. Jacques Barzun cautions that even self-respect, if not carefully guarded, can “without warning” engender “vanity and self-righteousness.”

It is unlikely that an individual can navigate life satisfactorily if cut off from reference points developed over the ages that define the boundaries of the journey. That is to require of this lonely and minuscule entity that it manufacture for itself a personal substitute for the complexities of what was once taught and revered as the history and culture of civilization. Columnist and author Florence King notes that those who advocate that we get in touch with our feelings seem to assume that there are only good feelings with which to get acquainted. If “feelings” are to govern it takes only a glance at the daily news to realize that feelings also inspire rape, torture, murder, and the rest of the criminal and totalitarian catechism.

A population of such lost selves is a gold mine of recruits to the ascending Civil War against America.

Personal Responsibility

If individuals are not held responsible for their actions there can be no codes of conduct, no rule of law, and no society. Civilization is on the road to disintegration. Stripping the individual of his or her unique qualities and melting the remnants into a pool of similar nonentities is an elementary goal of the refugees from civilization who populate the cadres of the Civil War. To induce the individual to disbelieve in herself or himself is enormously useful to those who would achieve the goal of coercion and control. To turn crime into some kind of psychological disturbance achieves the same end.

After her husband left for work one morning Andrea Pia Yates drowned their five children in a bathtub. Five. One after another she laid each corpse out to dry as she went to fetch the next for a dip in the tub. British-born American columnist and author John Derbyshire asks, “Is the lady sick, or just very wicked?” The reaction of women’s rights activists left no doubt. She was just sick. Feminist Anna Quindlen informs us that Yates was sick as a result of “the insidious cult of motherhood.” That would be the “insidious cult” by which the race regenerates itself, and families regenerate civilization. TV personality Rosie O’Donnell felt “overwhelming empathy” for Yates. Katie Couric, TV talk show host and news anchor, asked her viewers to send money for the Andrea Yates legal defense fund.

An Andrea Yates Support Coalition was formed including the ACLU, anti-death-penalty groups, and others who seemed to see Mrs. Yates, rather than the five dead children, as the victim. In the opinion of Derbyshire the real motive of such people is “to establish ‘awareness’ (as they would say) of another Victim Sickness.” Brought on, as the more radical feminists might say, by “the oppressive white male heterosexual hierarchy” that has imposed the “insidious cult of motherhood.”

Derbyshire perceives a drift toward the “medicalization” of all of life’s hazards and tribulations. This would occur by attributing the cause of almost any anti-social event to some “infectious agent” or “organic malfunction.” As for Mrs. Yates, said Derbyshire, she did “a monstrously wicked thing.” For which, he hoped, she would be tried and executed. Derbyshire got the guilty verdict he wanted, but the jury sentenced her to life in prison rather than imposing the death sentence. The defense of Andrea Yates and her like is based on the slippery, if not openly asserted, idea that no one is really responsible for anything. Society did it. All that need be pleaded in defense of the most heinous crime is the culpability of an unjust social order. For these social subversives the standard of personal responsibility is just another symptom of male hegemony and repression.

Andrea Pia Yates was tried for only three of the five murders. After she was sentenced to life in prison Yates and her supporters persuaded the Texas First Court of Appeals in Houston to overturn her conviction. A new trial was ordered and Yates was released on bail. At her new trial Yates pleaded not guilty by reason of insanity, and after several days of deliberation the jury bought it. She was committed to an appropriate mental institution, to remain there until sufficiently recovered to sense that something went wrong. Didn’t there used to be four or five kids around the house who aren’t there anymore?

Character

The founders of the United States believed, says James Davison Hunter, that elevating individualism to a primary status would result in “orderly, temperate, moderate, careful, and self-controlled citizens.” They would be citizens of good character, much as the founders saw themselves and their contemporaries. In his book The Death of Character: Moral Education in an Age Without Good or Evil, Hunter, professor of sociology and religious studies at the University of Virginia, traces quite a different development. His blunt assessment is that “Character is dead.” He sees little possibility that it can be revived: “Its time has passed.”
American Enterprise Institute scholar Charles Murray, in his book Losing Ground, was an early critic of the welfare system. He sees welfare policies as having a great deal to do with the general adoption of underclass behavior. He observes that the code of the gentleman and the code of the lady have disappeared. Murray recalls that the American gentleman’s code was an elite-based formulation of character that included being “brave, loyal, and true.” If in the wrong, a man owned up to what he had done and took his punishment “like a man.” A man did not take advantage of women. One was expected to be gracious in victory and a good sport in defeat. A handshake was as binding as any legal document. When the ship went down, the women and children were put into the lifeboats and the men “waved good bye with a smile” (as most of the men actually did when the Titanic went down).

Murray finds all that has collapsed and left a vacuum that is being filled by the “thug code” of the underclass. That code allows you to take whatever you want, to respond violently if antagonized, to gloat when you win, to despise courtesy as weakness, to treat women as faceless objects of pleasure, and to take pride in cheating or deceiving. “The world of hip-hop,” Murray perceives, “is where the code is embraced.”

At the same time there has been adopted what Murray calls a new code of “ecumenical niceness.” People have become unwilling, or even afraid, to criticize behavior they know to be wrong, or to make cultural judgments. They have been bludgeoned to Be Not Judgmental. The standards of acceptable behavior in America have been ghettoized. Nor is this degeneration of behavior and intellectual standards unique to America. Murray sees elites throughout the Western world “twisting in apology” for failings of the West, real or fictional, “disavowing what is best in their cultures, and imitating what is worst.” Murray cites the Clinton presidency, in the conduct of both the President and others in his administration, and in the public reaction to it. That, he argues, “was a paradigmatic example of elites that have been infected by ‘the sickness of proletarianization.’”

The Clinton-Monica Lewinsky scandal during the late nineties, says writer and speaker Os Guinness in his book Time for Truth: Living Free in a World of Lies, Hype, and Spin, represents “the postmodern crisis of truth in presidential form.” Guinness expresses the fear that the American public is willing to accept fictional self-images as the truth, much as the Oakland contestants did. If the public comes to accept manufactured perceptions of truth, there is no truth, and those with the slickest tongues or most powerful armies prevail. It is instructive to review, in view of the hype and spin that Guinness describes, the simplest elements of the Monica Lewinsky matter during the Clinton administration.

When the news of the affair broke, Clinton’s advisers told him that if it were true the American people would demand his resignation. Clinton’s reply was, “Then we just have to win.” Clinton went on television, looked the American people in the eye, and told them he had had no sexual relations with “that woman.” Clinton was betting his job on the lie of an innocent “inner self” invented for the purpose, and he won. Later it was learned that Clinton had made his TV statement only after intense rehearsals with his advisers to make his denial appear credible. Eventually Clinton had to admit that he had lied, and that he had had sexual relations with “that woman.” By then the intervening spin, the incessant hype, and a general weariness had done their work. The thing had become a “private affair.” It was time to “get this behind us” and “move on.” It was that last phrase that inspired the radical advocacy group MoveOn.org.

Guinness insists that truth is not simply a reactionary dogma as some would have it, but “one of the simplest, most precious gifts” to humankind. Truth is a gift without which “we would not be able to handle reality or negotiate life.” How long can the “domestic Tranquility” of which the Constitution speaks last in a Clintonian world? How long can the institutions of science, law, or government endure if truth is secondary to reputation, fame, political expediency, and self-images reinvented for the purpose of the day?

The popular belief today among intellectuals, academics, schoolteachers, editors, top government bureaucrats, and many others is that the customs and morals of the past are of no relevance. The past is over and done with, and need not be revisited. Writer and philosopher Roger Scruton reminds us this may not be so. The founders of this nation and the philosophers, authors, and statesmen from whom they drew inspiration for the Constitution knew better. They knew, Scruton says, that the acceptance of custom was necessary to the preservation of freedom, and that to disregard moral norms would leave the state free to do as it likes. They knew that neither nations nor individuals lacking character can long survive, because in the resulting condition of anarchy life becomes “nasty, brutish, and short,” as the seventeenth-century English philosopher Thomas Hobbes describes. Then only the Leviathan, the omnipotent state, as Scruton echoes Hobbes, “can manage the ensuing disaster.”

Peter Augustine Lawler identifies those who demand rights without responsibility in the title of his book Aliens in America. As noted earlier, Lawler finds that the supremacy of one’s personal and emotional well-being now sets the parameters of moral discourse, “to an amazing and unprecedented extent,” among a population of otherwise ordinary Americans. Neither in the irrational nor the intensely personal is there solid ground upon which to resist, or even to identify, falsehood and evil.

And there are far more “aliens” among ordinary Americans than it is comfortable to contemplate. The American “aliens” to whom Lawler refers are the liberals waging Civil War against America: those who do not share American values or revere American history.

These aliens are the children of the children of the sixties, images of the rebels of that generation. They stand together in solidarity, as they believe, on the chest of a dying culture, unwittingly flaunting the destruction of selves they may never know. For the women among them there is a special kind of additional coercion as well.

4. Gender Wars
Feminists and Genderists

The Feminist Movement, or the Women’s Liberation Movement, is split between two types of feminists: the equality feminists and the gender feminists. Those who demand equal rights, equal pay, and similar overdue notice and reward are the equality feminists, or simply feminists. They strive for the equality their identity signifies. They envision a society that is free, open to the same opportunities for all, humane, and based on the realities of human nature. Their view recognizes that men have a right to exist, have a function in the world, and might be pleasant company at times. Many even like having babies. To these feminists the world is democratic and free of arbitrary constraints and imposed behavior. Their drive is for genuine equality of treatment and opportunity.

The others are the NOW type of feminists, those of the National Organization for Women. These are the gender feminists, or genderists. This radical element attempts to remodel women into something other than what nature has determined women to be. Founded by Betty Friedan in 1966, the National Organization for Women is often termed the grandmother of contemporary feminist organizations. Seeing that the equal rights mission of equality feminism is largely successful, the gender feminist avant-garde is panicked. The thrill of their chase for power and authority cannot be allowed to dissipate in success. NOW and similar groups must now present the women’s liberation movement in a militant and adversarial mode. They hold that men and women are absolutely equal in every way. The alleged differences between men and women are the result of oppression by dominating males who can be dealt with only through belligerent confrontation, a type of class warfare, a power struggle for domination. These genderist apostles at the cutting edge react like the novice drug user after the first few delirious fixes. They adopt the habit. The genderists address particular vitriol toward non-conforming women still “enslaved” to men, like those who get married and have babies, whom they label traitors to the cause.

Christina Hoff Sommers, author of The War Against Boys, observes that the genderists are not satisfied with equality of opportunity, but insist on equality of outcome as well, and this would apply in every aspect of life. It is a form of equality that would leave no one behind. Taken to its genderist extreme, that would allow no one to be out in front either, leaving nothing but gray, spiritless clones of each other waiting to die. Genderist tactics range from intimidating women who prefer to stay home and raise their children, to ridiculing men to the point of extinction, to insisting on women in infantry combat units. They go so far as to claim that even gender differences—their term for sex differences—are culturally induced and arbitrarily imposed by a chauvinistic male order of oppression. Men and women, boys and girls, they insist, are the same and appear to be different only through cultural manipulation.
The chief theorist of the genderists is the twentieth-century author and activist Simone de Beauvoir. Her genderist view is based on a claim of unlimited freedom for women in all respects, and above all in sexual freedom. Charles Kesler, a senior fellow of the Claremont Institute, points out that the radical claim of unlimited freedom necessarily rejects the socializing idea of freedom within limits. That observation de Beauvoir readily accepts. Kesler identifies as the “telltale political marker” of the genderists’ theory of freedom its attack on the family. For de Beauvoir the commitment of a man and a woman to each other in marriage not only establishes unacceptable limits to sexual freedom, but also forecloses unlimited freedom in all other respects as well. Kesler predicts that a society that defines freedom to include, or even require, the absence of committed human relationships, and therefore abandons defense of the family, will not remain free.

To enforce their idealistic and unnatural goals these disbelievers in femininity necessarily gravitate toward coercive, authoritarian, and anti-human behavior. They know that harsh political enforcement is the only hope of instituting their barren program. Sommers counters genderist claims of oppression with the assertion that American women are among the freest in the world. This, she reports, elicits from the genderists the claim that American women undergo “psychological foot-binding.” And for her apostasy, Sommers says, the genderists “wish to excommunicate me from my sex.”

The genderists tend to position themselves among the far left politically, to reject all advances of women as inadequate, and to hate America. As their program unfolds it invites the question whether the genderists are feminists at all. Or even women, as indeed some claim not to be. The genderists are welcomed warmly into the ranks of the Civil War, offering as they do an avid new counterculture faith for the faithless. Their new faith is something to grasp and hold tightly against their disbelief in their country, their culture, and ultimately in themselves.

The Emancipated Co-ed

The Emancipated Co-ed on the campuses of America is something to behold—if not necessarily to be held (except perhaps by other women). Immersed in the mindset of the genderists, who disbelieve in all established sexual morality, the fresh co-ed is taught that fulfillment is to be found in unlimited personal and sexual liberty. David Pryce-Jones, a National Review senior editor, notes that the result has been a new vulnerability of women. They are using themselves as men have always wanted to use them, but were inhibited from doing so by the woman’s accepted code of more virtuous behavior. The new promiscuity places the woman in a position that Pryce-Jones maintains clashes with the natural desire of both men and women for the “exclusive sexual possession of another.” This, he says, has produced an “explosion of violence between the sexes.”

Graphic descriptions of college sex are detailed in Tom Wolfe’s novel I Am Charlotte Simmons. There are no rules in the co-ed dorms, or if there are rules “for the record” there is no supervision to enforce them. Women are “sexiled” from their rooms to make way for the libidinous behavior of their roommates. They have to go down to the lounge and sleep on a couch to accommodate their roommate’s desire to indulge in what once went under the derogatory nomenclature of “shacking up.” On party nights fraternity row operates much like an inner city red light district.

Any college lass who doesn’t want to participate is ridiculed or coerced into altering her pristine predilections. Should she approach a counselor or a physician in the health clinic for advice on how to avoid these activities, she is assured that it is her freedom that is being protected for which she should be grateful. In other words, “Have at it, and have a good night.” A spontaneous orgy might develop in almost any suggestive location. The goal of the emancipated co-ed is to “score” the Big Men on Campus, who return the compliment by referring to her and her fellow caterers as (sorry) “cum dumpsters.” Wolfe sums up the results of campus sex practices in an extended passage: “Rut, rut, rut, rut, rut, rut, rut, rut, rut, rut, rut, rut, rut… ” Nor do Wolfe’s descriptions appear to be at all fictional.

At Wesleyan University the president pressures all fraternities to become co-ed and denounces single-sex dorm rooms. Yes, even the rooms! The criticism that this amounts to setting up a system that to some seems not much different from subsidized brothels is dismissed as ignorant moralizing. The Wesleyan president, clucking reassurance, explains that to her “gender [sex] and biological sex are separate and distinct concepts.” Apparently roommates are expected to gender together without biologizing each other. But, should temptation so near become overpowering, well…

Wolfe’s view of campus sexual behavior is supported by the occasional confessions of college deans or counselors who are brave enough to say they are sickened by what they have to deal with. They have dared not speak of it for fear of retaliation from a campus administration dedicated to the freedom of rutting. Animal lust, summoned from the depths at Berkeley and Woodstock those decades ago, remains addictive, impersonal, and promiscuous. In a book titled Unprotected, a psychiatrist working in a university women’s health clinic writes of her experience with young women she has counseled.

Heather, who reports to the clinic in a state of severe depression, has a “friend with benefits.” This means he has the right to have sex whenever he so desires, but he refuses any activities with her other than sexual. Anything more would be a “relationship,” which he doesn’t want. Heather is confused and depressed because she doesn’t get the “friend” part of the arrangement while he gets the “benefits.” Campus policy does not allow the author of the book to suggest that Heather’s depression is the result of this unhealthy arrangement, or that she should end it. That would be contrary to the “emancipation” the genderists have gained for her, an academic dogma carved in stone. The author finds that many other “Heathers” also become physically and emotionally ill after repeated sex with no personal commitment.

The book Unprotected was published anonymously out of the doctor’s fear that she would be punished personally and professionally were she to tell these young women that they are being used and abused. “Radical politics,” she writes, “pervades my profession and common sense has vanished.” A year or so after the book came out and was a success, the author revealed herself on Dr. Laura Schlessinger’s radio show. She is Dr. Miriam Grossman, a psychiatrist at the University of California at Los Angeles. The taxpayers of the State of California pay for the debilitating abuse of young women she describes.

Freelance writer and columnist John Bambenek cites a report in the Journal of Sex Research, published by The Society for the Scientific Study of Sexuality, on the effects of uncommitted sex on the women involved. The report finds that most campuses host an annual celebration of “uncommitted sex.” The University of Illinois titles theirs “Sex Out Loud.” The report relates widespread depression among college women who are lured, or even sometimes coerced by peer pressure, into uncommitted sex practices. This depression results, the report indicates, from the fact that uncommitted sex “flies in the face of our internal nature.” The findings of the Journal would appear to validate on a wider scale the experience of Dr. Grossman at UCLA.

College dating in the years before Liberation was a search for a life partner, and therefore a serious matter. Young men and women got to know each other, “went steady,” or “got pinned,” and formed a basis for possible marriage. Dating in that sense has largely disappeared from the campus of many if not most of the prestigious institutions. As at UCLA, dating has been replaced by “relationships,” understood to have no permanency, or “hookups” for the day or night. Or arrangements like Heather’s. The liberated woman who leaves college alone may find mating in office cubicles or singles bars less attractive than it might have been on campus. For the men she meets, Women’s Lib is often welcomed as furnishing one more of an endless series of nameless orifices, upturned for a moment of empty indulgence, then abandoned.

How liberated does the woman of the world come to feel, staring at the ceiling on all those mornings after? The warmth of professional success in the world after college was what she had worked for, expected, and achieved. Her dreams seem to have been fulfilled. Until the alarm of her biological clock begins to sound in the middle of the night. She consults her mirror to see if she is still attractive, and cannot deny the harder lines around the mouth and eyes, even though she can afford the best of beauticians and cosmetic surgeons. Perhaps another little tuck? Botox might do. Or a new hairdo? College to her, bright and ambitious, had all seemed so glorious and free. To her, dedicated to a professional life, all those different men, the fun of choosing and changing, the groupies, the weird techniques had seemed right on. Fun in college and a brilliant career to come. She had been in command. Hadn’t she?

Jennifer Roback Morse, a Ph.D. in economics, is a prolific writer on sex, free love, marriage, and related topics. In an article for American Enterprise, a publication of the American Enterprise Institute, she writes of group sex, lesbian sex, casual sex, and about any other kind of sex imaginable. She reveals that in her college years and thereabout she had tried most of what she writes about and so has firsthand experience of what works and what doesn’t. Those experiences she describes as “not a jolly time.” Young people today, she finds, do not know how to express what they feel about unsatisfactory sexual relations. It is no longer acceptable to feel “cheap,” or “used,” or that a relationship is “wrong.” Such words have been expunged from “the stunted moral vocabulary” of the modern woman constantly assured that “anything goes.” So the best a young woman staring at the ceiling on those mornings after can say to herself, says Morse, is that it just feels “all icky.” A stunted moral vocabulary indeed.

The Neutered Male

It is no longer the “emancipated co-ed” who needs protection and spiritual sustenance at leading American colleges and universities. It is the descendants of the “dead white European males” who created Western society who are now under attack. For genderists on the loose the vision of the male remnant eliminated is their eternal aphrodisiac. They lust after the power to destroy the “slave society” founded by those hateful creatures. They anticipate ecstatic joy in annihilating both the male remnants and the society they have created. They have no idea what they are wishing for in place of either.

Strange as it may seem, after the collapse of Communism nearly everywhere else in the world, the snake oil of Marxism is still a weapon of choice on the academic front of the Civil War. Author and editor John Zmirak writes in The American Spectator that teachers and administrators of advanced “feminism” see women as a “domestic proletariat” engaged in class warfare. Zmirak cites European cultural theorist Monique Wittig’s discourse against a society founded on heterosexual principles that does not recognize the rights of lesbians, women, and homosexual men. The term “woman,” she insists, has meaning only in terms of heterosexual concepts. Wittig bravely asserts, therefore, “Lesbians are not women.” With that observation not a few of the red-blooded chaps who still refuse to be cowed by the feminists might heartily agree.

In a country becoming progressively more feminized, then genderized, there is a question as to how many “red-blooded” citizens (men or women) might remain to serve and defend the nation. Consider the following from Duke University law professor Marilyn Morris, who served as an adviser to the Secretary of Defense in the Clinton era. Melanie Kirkpatrick, a Wall Street Journal editorial writer, reports that Prof. Morris advised the Secretary to eliminate from the military prevailing attitudes such as “dominance, assertiveness, aggressiveness, independence, self-sufficiency, and a willingness to take risks.” Think about that. Read it again. This is serious advice to the Secretary of Defense on how to run the armed forces. Can embroidered combat helmets and lipstick be far behind?

Conditioning for a genderist society starts in grade school and continues through middle and high schools, most of which have a predominantly female teaching staff. To lay the groundwork for hating men it is only natural to begin at the beginning by hating boys. Gloria Steinem asserts that boys should be raised the same way girls are. One consultant funded by the Department of Education described Little League baseball as a place “where parents and friends sit on the sidelines and encourage aggressive, violent behavior” (It used to be called an active sport). In some San Francisco schools the boys are forced to do quilting. A Caucasian boy is required to give a presentation as though he were an African-American woman.

If none of this works, there are always drugs. Hoover Institution fellow Mary Eberstadt in her book Home-Alone America reports that prescriptions for drugs used to treat anxiety and depression in preschoolers recently increased tenfold in four years. Drugs may be excoriated in public policy statements, but a truce in the War on Drugs is declared at the schoolhouse door. There overly energetic boys are administered a dose of Ritalin to cool down their incipient masculine ardor, and cure that dread disease called “attention deficit syndrome.” Eberstadt reports that Ritalin production increased tenfold between 1990 and 2000.

Women are now admitted to college in greater numbers than men. In many fields women receive more degrees than men, and overall 135 bachelor’s degrees are granted to women in this country for every 100 granted to men. The National Center for Education Statistics projects that by 2017 the ratio will be 150 to 100. The equality feminists have done their job, and then some. That leaves the field to the genderists to wreak what havoc they can. A number of colleges provide “Women’s Centers” frequently run by lesbians or other far left genderists, while discouraging or even destroying the fraternity system for men. At Wesleyan, as noted above, the president pressures all fraternities to become co-ed. At Colgate, with a similarly elevated purpose, a newly appointed feminist president decrees that fraternities must disband and sell their buildings to the college for use as “diversity housing.”

Zmirak writes that, while he persevered and finished his Ph.D. in English, he realizes the futility of thinking of an academic job in the humanities. What of other men, he asks, who enter college with a love for literature, art, or history? Will they attempt to explore and transmit the inherited traditions of the West, which he sees as “essential to its survival”? Not likely, Zmirak thinks, in the caustic atmosphere of utter disbelief in those traditions that pervades contemporary humanities departments. Rather, he predicts, most other men of similar mind and experience in academia will do as he did: take their B+, see that the humanities “are only for women and ‘Queers,’” and move on.

Harvey Mansfield, Harvard professor of government, says in his book Manliness that “the entire project of modernity can be understood as a project to keep manliness unemployed.” And if you want the latest in that strategy look again to the always-inventive European Union. In Germany there is a burgeoning movement to require men to sit down while urinating. Whether there is to be surgical follow-up of this new mandate is not yet clear. But at least one commentator has already concluded that we no longer live in a golden age, or even a gilded age, but in a “gelded age.” “Higher education” is busy honing the necessary intellectual instruments.

Love and Liberation

In her book An Old Wife’s Tale Midge Decter views feminism from her perspective as a woman who was a suburban housewife before becoming entangled in the debate over the feminist movement. She writes of the rewards as well as the travails of staying home and raising children. She enjoyed talking with other young mothers over morning coffee, loved watching her children grow, and found fulfillment in participating in the activities of her community. Observing the effects of Women’s Lib and the rest of the Civil War, Decter gradually evolved from a New Deal liberal into a neo-conservative. Experience convinced her that neither the ideas of today’s new movements nor those of the old New Deal programs work in practice.

There is a gap between the world of intellectual fantasy, where ideas have no consequences, and the world of reality where they do. The consequences of that gap, says Decter, have turned out to be her main preoccupation as a writer. She seeks to account for “the distances between… the experience of something, and the way that experience has come to be talked about,” in both public and private life. The “liberation” of women, she believes, has enslaved them to an unnatural and wrenching divorce from the reality of their innate makeup, their human nature. This has been a cardinal aim of the genderist faction of feminism since the 1970s, as author and commentator Jeffrey Bell explains. The core of their preaching is that children and childbearing are “the central instrumentality of men’s subjugation of women.” From that stance flows the imperative that women shall be “free” to indulge in any and all kinds of sex. Or, as Dr. Laura Schlessinger once commented to a caller on her radio show, to make of themselves “unpaid whores.”

“Lust as an independent value,” Decter believes, divorces itself from both institutional and personal relations, and travels with indifference from “creature contact to creature contact.” Decter holds that Americans reject these results of women’s “liberation,” at least “in the pits of our stomachs.” Rather than recognizing that intuition, however, the popular reaction is to find excuses for indulgence. The effect on children of adult indulgence, practiced without regard to them, is considered in a following chapter.

For those who can’t quite complete the divorce of sex from love, the practice is fumigated with words. Instead of calling it “free sex” some call it “free love.” The hope is that the more tender connotation will serve as a gentle shield against the reality of what is going on. Words needed in order to be fully human are slipping away. What is it that is “free” to the liberated woman in the life of “free love” that the ravaging male lives every night in the fast lane? What does that which she furnishes for him, and which he forgets as he quietly slips across the horizon of a new day to new prey, add to her “freedom”?

In the TV series Oliver’s Travels Oliver, on a quest for the author of crossword puzzles who calls himself Aristotle, meets Diane, a suspended police officer, who agrees to join him in the search. They find they like each other, become friends, and presently Oliver suggests they might share their nightly accommodations. Diane responds with a quizzical smile, “I’ll tell you when.” Oliver, driving the car, replies with mock resignation, “When is better than if.” That exchange represents two people getting to know each other in a way that suggests their relationship has meaning and might last.

The free sex foundation of the new Civil War decorates rutting with the illusion of modernity, of avant-garde thought, of post-modern chic, always with compassion, of course. The free sex foundation does modestly insist that its license to love is limited in one respect. The pleasure of “having it” whenever and wherever it pleases must not hurt another person. Human beings, especially women, are taught that they may, even should, use each other as objects of momentary pleasure rather than treat one another as unique and special individuals. This is the sort of conditioning that prepares a population for inclusion in the herd. When individuality and personality are taken away, the animal is prepared for the totalitarian feedlot where all animals are equal, similar, and equally dispensable. Sex that was once thought of as love for a person becomes merely the use of an object, mostly for the man. Is no one hurt in that?

Choice

In 1973 the United States Supreme Court reached deeply into the marriage relationship, the family, the love of men and women for each other, and the miracle of life itself. In Roe v. Wade the Court touched the core of what it is to be alive as a human being. That decision invented a “constitutional right” for a woman to have an abortion, and in so doing made abortion one of the most difficult issues facing the emancipated woman. A child born out of wedlock was, until recently, referred to as illegitimate—in former times as a bastard. Now the unwanted result of dalliance needs no label, since that result can be summarily disintegrated as a “right” of the otherwise mother. Such organizations as NARAL Pro-Choice America (it is never called pro-death) claim that abortion is a right standing alone, exercisable by the woman herself for any reason, or no reason.

The NARAL Pro-Choice feminist (the acronym comes from the organization’s original title, the National Association for the Repeal of Abortion Laws) flaunts her rugged independence with such bumper-sticker proclamations as “U.S. Out of My Uterus” or “Keep Your Laws Off My Body.” This is a form of individualism busy with itself, with little time or inclination to consider whether its credo is likely to have social or cultural consequences, or even unwanted personal repercussions. Total freedom is what matters—for the woman. The ideology of I is triumphant, and no one has the right, the genderists and even many feminists insist, to judge the woman’s choice.

The Roe decision also aborted laws regulating abortion in one way or another in all 50 States. Until that decision it had been understood that society has an interest in reproduction of the race, and that abortion was a proper subject of public policy as reflected in democratically elected legislatures. Public policy so created supported a climate of opinion that provided, among other requirements, for parental and religious guidance of children at a susceptible age. In that atmosphere teenage girls at high risk had somewhere to turn for help. In an environment becoming increasingly licentious in its worship of Berkeley and Woodstock, Roe changed that.

Whether abortion be approved or opposed, it is beyond dispute that, at least following a short initial period after conception (that keeps getting shorter with new research), the “procedure” kills a living human being. Those who choose to engage in the actual practice tend to be fastidious about its description. A National Review article recounts some of the euphemisms employed in the trade. In a typical mid- to late-term abortion the fetus is pulled apart piece by piece in the womb and the pieces are extracted one at a time. The practitioner lays the parts on a table and shapes and counts them to make sure he or she got everything out. Doctors may describe this procedure as “disarticulation,” but avoid the term “dismemberment.” One doctor said the purpose of the procedure is to “safely and efficiently empty the uterine cavity, rendering the woman unpregnant.” So much more tasteful than looking at the table of parts with something like, “Well, I guess we got all of that kid.”

Partial-birth abortion occurs when the baby is already partly exposed, having emerged feet first. The procedure then involves crushing the baby’s skull and removing its brain with a suction pump to make it easier to get out what remains of its head. Abortion providers tend to describe this as “reducing the fetal calvarium” to allow “completion of delivery.” The procedure is so repugnant that Congress was persuaded to pass legislation prohibiting partial-birth abortion. In April of 2007 the Supreme Court, in a 5-4 decision, upheld the constitutionality of that law.
In his majority opinion in the case Justice Anthony Kennedy quoted from the trial testimony of a nurse who was required to witness the procedure. “The baby’s little fingers were clasping and unclasping, and his little feet were kicking. Then the doctor stuck the scissors in the back of his head, and the little baby’s arms jerked out, like a startle reaction, like a flinch, like a baby does when he thinks he is going to fall.” The baby did not scream out in the pain he felt, having as yet no air in his lungs.

The Non-Woman

Melanie Phillips, a columnist for the Times of London, comments that Midge Decter has understood from the start that modern feminism is not about the emancipation of women, but about “the emancipation from women.” As Phillips perceives, “It is a revolt against motherhood and womanhood itself.” It is a fantasy of turning the woman into a non-woman. It is not really feminism at all, but something that she says might more aptly be called “genderism,” the genderists in their screechiest mode.

The “mother” of this extremity is the famous French writer Simone de Beauvoir. For her the liberty that is being given women has one sharp limitation: Women should not be allowed to stay home and raise their children. Society should deny women that choice because too many women would take it. Few totalitarians have been more explicit in their ideology, which has a solid lineage in the French history of revolution that de Beauvoir shares.

Author and commentator Jeffrey Bell points out that high on the enemies list of the French Revolution “were organized religion and the family.” These institutions, if not extinguished, are capable of preserving and passing on moral values outside the scope of government control. As Bell says, it is the “anti-institutional, relativistic moral crusade” against religion and the family that has always driven the left. Christina Hoff Sommers reports that a study of the texts used in women’s studies courses reveals that every text disparages traditional marriage, stay-at-home mothers, and the culture of romance. In Sommers’ opinion that is to deny to women the sphere of life that is most appealing to most women. And this is done “in the name of liberation.”

Decter believes that women are desperate to get out of this trap, but fear being ostracized as politically incorrect Victorian prudes if they complain. Phillips suggests that it is only “former liberal idealists” like Decter who can understand “the depth of the liberal betrayal” on these issues. Decter identifies this betrayal as inducing disbelief by women in their own selves and being. The Journal of Sex Research reports that this “flies in the face” of their natural constitution and causes depression among women.

Author Wendy Shalit, in her book Girls Gone Mild, remarks that the war against sexual repression “always seems to require another sort of repression, of feeling and caring.” Shalit and the young women she studies who have “gone mild,” one critic has observed, call for women to find a “rediscovering [of] our capacity for innocence, for wonder, and for being touched profoundly by others.” In his book Taking Sex Differences Seriously, Steven E. Rhoads, a professor of politics at the University of Virginia, examines the results of feminist theories. Rhoads finds that since the 1970s, when the rage for feminist theories took hold, “women have been more depressed and unhappy than they used to be.” To Harvard professor Harvey Mansfield, “This seems the price of going against nature.” Mansfield comments, somewhat colorfully, that even feminists’ psychological studies have found that “it is still considered better to be a stud, like the actor, than the slut, like the women he sleeps with.”

What these non-women pursue, and would condition America to accept, has at its core a revolutionary and authoritarian purpose. That purpose is for NARAL-infected idealists to drive personhood and individuality toward their goal of authoritarian manipulation, as so vividly illustrated in the writing of Simone de Beauvoir. Depersonalized robots are so much easier to handle than real people.

And what better forum than the United Nations in which to give it a try?
CEDAW

The Convention on the Elimination of All Forms of Discrimination Against Women was hatched in that incubator of curious nonsense located at Turtle Bay on the East River in New York City: the United Nations. To identify and eliminate discrimination against women sounds reasonable, at least outside the Muslim world. Then come those satanic details. If discrimination is to be prohibited, discrimination must be defined.

Article I of the CEDAW instrument says discrimination is “any distinction… on the basis of sex” in “any… field.” You don’t know exactly what that means? There’s more. It must then be determined how the treaty is to be implemented and enforced. Article V lays that out. All governments signatory to the treaty are required to “modify the social and cultural patterns of conduct for men and women with a view to achieving the elimination of… all… practices which are based on… stereotyped roles for men and women.” So. What, exactly, are the obligations CEDAW imposes?

Start with discrimination. Does “any distinction” on the basis of sex in “any field” include, say, the playing field of the marriage bed? Do we abolish the words “girl” and “boy?” Does the continued existence of concavity and convexity in the two biological types of human anatomy imply discrimination? Do we move beyond Ritalin to surgical modification of little boys to make them more like little girls? Assuming that, as usual in feminist matters, it is the wielders of convexity who need to be “modified.” The fact is that “any distinction” in “any field” can mean almost anything any agency enforcing the treaty might want it to mean.

Christina Hoff Sommers has investigated how this peculiar document works. The treaty, which some nations have actually ratified, provides for an enforcement committee. In 2001 the enforcement committee reprimanded Armenia for its “traditional stereotyping of women in ‘the noble role of mother.’” Does that mean no more mothers, or just that motherhood can’t be seen as noble anymore? Belarus was called to account for reinforcing “sex-role stereotypes” by reintroducing Mother’s Day. Slovenia, the committee found, was at fault for having only 30 percent of its children in state-sponsored child care. The committee informed Colombia that it was bound by the treaty to allow abortion—even though the CEDAW treaty itself condemns the practice.

CEDAW is yet another scheme to reformulate human nature. It is a valuable revolutionary tool because such a project is undefined and endless. In the CEDAW enforcement committee the powers of bureaucracy are unlimited. Governments that have ratified the treaty are bound to “modify the social and cultural patterns of conduct for men and women” to comply with treaty requirements. That requirement sets forth unmistakably what the genderist liberation brigades of the Civil War aim for. The totalitarian intent of those forces, including destruction of the individual self, is seldom so bluntly laid open as it is in the CEDAW treaty.

5. Divide and Conquer
Compassionology

When traditional moral and social foundations are under attack the guidance and comfort of the individual inner self can become shaken and doubtful. Those who hate America seize the opportunity and seek to entice the disaffected into new tribal-like entities along sub-national lines. The aim is not to help the floundering soul of the individual to grow toward a new and more secure self. It is quite the opposite. It is to aggravate and exploit feelings of self-pity, anger, and victimhood. The new tribal disciples are told to speak incessantly of their “rights,” and of the “wrongs” done to them by society. Shaken individualism is melded into the collective comfort of tribal identity. The new recruits are transformed into submissive and subversive battalions of the Civil War, angry and aggressive, eating away at the roots of national loyalty and cohesion. The process begins with compassion.

The traditional American approach of affording each individual full opportunity to learn and grow is ridiculed as cruel and uncaring. The concepts of individual merit and excellence are dismissed as weapons of oppression. The tribal chiefs of the compassion corps drill their new members to believe such an approach is mean spirited, lacks compassion, and amounts to discrimination against minorities. To tout excellence, it is said, is to denigrate the less gifted and less able. All are treated alike to avoid the appearance of discrimination.

At a typical middle school in Little Rock, Arkansas, every member of the graduating class receives the same trophy of success. A Japanese boy gets the trophy for math. A girl gets the trophy for hairdressing, another girl for being an excellent hairdressing model. The next girl gets the trophy for applying fingernail polish; the girl whose nails were polished gets the trophy for being an excellent fingernail polishing model. All members of the class are recognized equally so as not to damage the self-esteem of those who would otherwise be seen as inferior or as failures. The award for math, a genuine achievement, sinks into the banality of hairdressing and fingernail polishing. The best is pulled down to the level of the least competent. Absolute equality cannot work the other way. The achiever can be dragged to the bottom, but the incompetent or the unwilling cannot be pulled to the top. Parents and family cheer as the trophies are handed out.

The compassion corps would not think that cheering the acceptance of an unearned award demeans not only the recipient, but also those complicit in granting the award. Or that this would degrade the integrity of the school, game, or contest as well. Columnist George Will sees that sort of compassion as a “moral theory in vogue” whereby that single virtue “trumps all competing considerations.” Will describes that sort of compassion as a feeling that generates within its purveyor a duty “to do whatever is necessary to ameliorate distress.” This becomes a professionalized and uncaring kind of compassion that grants to the “caregiver” an authoritarian license to act as he sees fit. It is compassion so understood that spawns the doctrine and practice of compassionology, and an army of compassionologists to enforce the “moral theory in vogue” of which Will speaks.

The compassionology corps provides quotas, work set-asides, and similar favorable discrimination (while denying that discrimination is what they are practicing) on the basis of membership in subgroups marked for assistance. These include blacks, Hispanics, women, and other newly tribalized minorities. Membership in a tribe, and obedience to its leadership, is the key to entitlement. Writer and philosopher Roger Scruton summarizes how such a program has affected the black family in our inner cities. As late as the 1950s black families held together—husband, wife, and children—in inner city neighborhoods and often set a model of good behavior. Inner city black people formed successful and proud communities.

Then the cry of compassion broke out that some in the inner cities were “living in poverty.” In a spirit of guilty compassion in 1964 President Lyndon Johnson requested and Congress enacted legislation to commence a War on Poverty. Now, more than four decades after that declaration of war, the situation, as Scruton observes, is “horrifyingly different.” Seventy percent of black children are born out of wedlock, crime rates escalate, and there is a steep decline in school performance.

Myron Magnet, editor of City Journal, identifies the assumption upon which the War on Poverty is based. That assumption, he points out in a Wall Street Journal article, was that the free enterprise system automatically produces an underclass that cannot take care of itself and so must be ministered to by its betters. In the resulting welfare system, Magnet says, the welfare worker persuades the recipient that his plight results, not from any failing of his own, but from discrimination, an unjust economy, and other “vast impersonal forces, of which he is the victim.” His loyalty to the tribe is cemented and he cannot escape.

What such an approach fails to understand, Magnet protests, is that “an inner transformation” is what such a person needs. That would require the welfare giver to respect clients as individuals with the potential of making their own way. It would mean helping them to find and develop the best of their abilities, to explore the richest capacities within themselves. But that is contrary to the purpose of tribalization. To build a successful tribe the caretaker must think of those in his charge as his dependents. He must see to it that they are dependent, think of themselves as dependent, and remain so in order to maintain or increase the caretaker’s “fiefdom.” That is his real work, done behind the façade of compassion to relieve poverty.

Those who receive the favors of compassionology never know what capacity lies within them. Those who seek, or have thrust upon them, unearned benefits and privileges do not know that there is a price for every benefit, collateral required for every loan of artificial compensation. The collateral that must be given up, the price that must be paid, is the dignity, honor, and self-respect of the recipient. He or she will never be exposed to the satisfaction of true achievement, and so never become a whole person.

Super-Equality

The Declaration of Independence asserts that “all men are created equal,” and are endowed by their Creator with the unalienable right to “life, liberty, and the pursuit of happiness.” Jefferson’s formulation is a fusion of liberty and equality that recognizes it is impossible for all individuals to be equal in any absolute sense or in every way, since by nature they are not. It declares, instead of the impossible ideal of absolute equality, that everyone has an unalienable right to his best shot at whatever there is. The right to “life, liberty, and the pursuit of happiness” is the right to equality of opportunity.

Slavery made a mockery of that right, and nearly tore the nation apart in the Civil War of the 1860s. After that war the Fourteenth Amendment to the Constitution was adopted in 1868 to guard equality of opportunity for the newly freed slaves, among other purposes. That Amendment provides that no State shall deprive any person of “the equal protection of the laws.” But discrimination against minorities, especially blacks, continued despite the equal protection requirement of the Fourteenth Amendment.

It was not until a century later that the nation heard Martin Luther King’s inspiring “I have a dream” speech as the climax of his massive March on Washington in August 1963. That march and that exhilarating speech were the catalyst, supported by the justice of the cause, that led to congressional enactment of the Civil Rights Act of 1964. The CRA prohibits the federal government from imposing or allowing unequal treatment “under any program or activity receiving Federal financial assistance.” It took much longer than it should have to provide legislative enforcement of the principle of equal protection. But the question soon became: what kind of enforcement?

Equality of opportunity, the “colorblind society” of Rev. King’s dream, allows freedom to do its work, and each to succeed according to his own effort and ability. But it was soon argued that past inequality requires an “affirmative” remedy beyond equality of opportunity. It was President Lyndon Johnson, a Southerner, who signed the Civil Rights Act of 1964, promising equality for all Americans in order to heal the ugly sore of racial discrimination. It was also Lyndon Johnson who, only a year later, in 1965, issued an executive order that laid the foundation for the “super-equality” of “affirmative” action. That is, preferential treatment, rather than equality, for those wronged in the past. For good measure a quota system was added to assure preference for those designated for preferred treatment.

To corrupt the concept of equality into super-equality is to reestablish the inequality the Fourteenth Amendment prohibits. This affects everyone—both the supposed beneficiaries and those newly discriminated against. Super “equality” awarded as part of a group quota affects the way minorities who have succeeded, and even excelled, are frequently perceived. A highly regarded black brain surgeon relates that he is often taken as the “black quota doctor” and given the impression that the patient would just as soon have someone else do the cutting. The doctor reports that he has experienced that reaction even from other blacks! Black writers and political analysts report the same “black quota” reaction to their achievements.

Martin Luther King’s color-blind dream is a dream of worth and dignity for all. When that dream is acclaimed today not a few black “leaders” sneer, and spit some invective such as “white man’s chains” to express their contempt for that gentle goal. These leaders have become accustomed to the perquisites of super-equality. They prefer to profit from their role as Chiefs of their newly created tribes. The gold of equality for which Rev. King stood has been exchanged for the glitter of super-equality ornamented as affirmative action.

Diversity

In 1978 the U.S. Supreme Court, in the case of Regents of the University of California v. Bakke, ruled that a quota system based on race used for admission to the University of California School of Medicine was in violation of the equal protection clause of the Fourteenth Amendment. That seemed to have settled the matter. After a decade or so of affirmative action, recognition according to merit was reestablished as the basis for qualification or advancement. But there was, as they say, a snake in the woodpile.

In a separate opinion Justice Lewis Powell speculated that the goal of attaining a “diverse student body” might be a “constitutionally permissible” consideration in admitting students to colleges and universities. No other Justice commented on Justice Powell’s “diverse student body” remark. Nor did anyone else at the time. Then, years later, lawyers and civil rights activists happened to notice Powell’s opinion lying there in the Supreme Court’s wastebasket, as it were. Someone picked it up and began to administer artificial respiration. Shouts of “Eureka!” reverberated in law offices across the country as the trial lawyers perceived they had found gold. The idea of a “diverse student body” might be used as the wedge by which to force quotas back into college admission, and in private businesses as well!

School administrators, and others who wanted to admit more minorities, even if they could not qualify under normal admission standards, proclaimed diversity as their new goal and the practice swept the country. Boston University Professor Peter Wood reports that this “artificial diversity” has permeated American campuses and workplaces. In June of 2003, in the case of Grutter v. Bollinger, the United States Supreme Court upheld consideration of race to help achieve a diverse student body in the admissions policy of the University of Michigan Law School. At last the Court took notice of Justice Lewis Powell’s lonely and long abandoned orphan of diversity, and the little fellow became officially legitimate. The cycle from quotas, to no quotas, to quotas called diversity was thus completed.

To be in favor of such “diversity,” Wood observes, is to claim “a kind of righteousness tinged with modesty.” Hoover Institution fellow Thomas Sowell suggests that the supposed beneficiaries of a quota system, by whatever name, are more like “mascots” or “trophies.” These prizes are more useful “to advertise the virtue of their owners,” such as university admission offices or businesses, than they are to the supposed beneficiaries.

Wall Street Journal editorial board member Jason Riley speculates that somewhere along the way between the Civil Rights Act of 1964 and acceptance of the quota system called diversity something happened. Riley says blacks allowed themselves no longer to be “judged as individuals.” This, he laments, strips blacks of their “individuality, their pride, their humanity.” Those who push for special consideration, whether it is called quotas, preferences, or diversity, Riley asserts, aren’t interested in the effect this has on its supposed beneficiaries. That effect is the indignity of feeling that fellow students or fellow workers are always suspicious that blacks are there because the standards were lowered for them. Riley bases his opinion both on observation and on his personal experience of suffering that reaction. Quotas and preferences, whether in the guise of affirmative action or diversity, Riley asserts, carry the inference of “genetically predisposed black inferiority.”

Diversity has now become a wedge issue that diminishes love of America and transfers loyalty to tribal units and their chiefs instead. Next comes victimology.

 

Victimology

Victimology is the parasite that grows on the host of diversity. Victimology is based on the same art and practice of inducing those bound in the tribes of diversity to believe that any misfortune or failure that befalls them is no fault of their own. Anyone subject to any disparity of treatment, personal or statistical, is a victim of discrimination, economic deprivation, or some other sort of injustice. He or she is therefore entitled to compensation. The resulting claims become the business of what Shelby Steele, a research fellow at the Hoover Institution, calls the “fierce grievance industry.” Victimology, firmly embedded in the grievance industry, provides income and leadership status to captains of the Civil War who run the industry. These leaders, Steele asserts, are far more interested in funding themselves than inspiring their subject “victims” to higher achievement.

Columnist, blogger, and editor Andrew Breitbart writes that the general public does not accept the idea that past injustice justifies present injustice to rectify the wrong done. This idea, he says, is unacceptable to the “race industry” of victimology. In parallel with Steele’s comments Breitbart perceives that the power of those managing the race industry depends on maintaining “a latent rage” in those designated as victims. It is a carefully nurtured and guarded rage that can be adjusted “at the will of the nation’s elites.” Manipulation of that rage is a type of victimology to which the poorest of black Americans are particularly vulnerable.

John McWhorter is an author, a former U.C. Berkeley linguistics professor, and a senior fellow at the Manhattan Institute. According to McWhorter, Black America is now in an “ideological holding pattern” of resentment and victimology. These frozen resentments, McWhorter says, are greater barriers to black advancement and well-being than any white racism that may still exist. McWhorter contends that for blacks this constitutes “a continuous, self-sustaining act of self-sabotage.” In his book Losing the Race McWhorter concludes that claims of victimhood used to excuse black failure are contrary to the reality of conditions today. He finds opportunities for work and advancement abounding for those willing to pursue what is available.

Michael Barone, a senior writer for U.S. News and World Report, points out that segments of other ethnic or lifestyle groups such as Hispanics, Native Americans, or homosexuals also use their versions of tribal victimology to claim special benefits or privileges, with much the same results. You are nothing without your group, your clan, your tribe. You are a victim for whom only the tribe can seek redress. And if you are nothing without your group you are also nothing within your group; just another cipher to be placated when the spoils of victimology are divided up. Most significantly you are a certified dependent, subversive of the social fabric of your country, for which you care less and less.

The concepts and practices of compassionology, super-equality, diversity, and victimology all wrap up in multiculturalism, one of the most potent weapons of the Civil War against America.

 

Multiculturalism

The motto E Pluribus Unum, by which the “many” of this country become “one” with America while retaining diverse cultural and other differences, expresses a genuine multicultural America. Preserving roots of ethnicity, cuisine, religion and so on while becoming loyal Americans is the true multicultural reality practiced instinctively in this country for centuries. Multiculturalism is a different matter, and turns the American assumption on its head. “Multiculturalism,” writes Harvard Professor Samuel P. Huntington, “is basically an anti-Western ideology.” Multiculturalism is designed to destroy the unity, and eventually the very fabric, of the United States of America. On this battlefield, as the term implies, the multiculturalists are legion and attack on multiple fronts.

British philosopher Roger Scruton relates that multiculturalism establishes a cultural apartheid in which each separated culture develops independently of the others and of the nation. Each sees law, group identity, and loyalty as issuing from a “religious or a tribal source,” not from a national tradition. Tribal groups such as these, Scruton says, are incapable of full participation in “Western political culture.” They will not recognize their obligation to the state or feel the love of country that has been an essential ingredient of citizenship in America and other Western nations.

Instead of “Americans,” plain and unqualified, there are now hyphenated Americans of all sorts: African-Americans, Asian-Americans, Native-Americans, Hispanic-Americans, Korean-Americans, Chinese-Americans, and so on. Multiculturalism furnishes an ideal vehicle by which the considerable number of unassimilated malcontents who despise America as the source of the world’s ills can activate and proselytize their hatred. British psychiatrist Anthony Daniels has worked in a prison and a hospital in Birmingham, England. Writing as Theodore Dalrymple he records the fragmentation of British loyalty into multicultural subgroups as he sees it in those he works with every day. Dalrymple calls multiculturalism “the death of the citizen; it is the retribalization of society.” Theodore Roosevelt issued a similar warning in a speech to the Knights of Columbus in 1915. He found “no room in this country for hyphenated Americans.” To Roosevelt, the “absolutely certain” way to ruin this nation would be to turn it into “a tangle of squabbling nationalities.”

Indoctrination into multicultural separateness flourishes in the schools. Classroom teachers responsible for the development of children’s pliable minds attempt to implant in those young minds an anti-American animosity before they can think clearly for themselves. In one class students were asked to name their favorite ethnic food. A Korean boy wrote down “hamburger.” He was roughly admonished that what was required was a food from his native country. The child said that America is his native country; he was born here. The teacher persisted until a suitable Korean dish was concocted to multiculturalize the boy’s insensitivity and mend his insubordination.

In his book Who Are We? The Challenge to America’s National Identity Professor Samuel P. Huntington asserts that elites on the right as well as on the left are culpable in spreading the contagion of multiculturalism. Together they have successfully suppressed both the full facts and open discussion concerning the effects of recent immigration, especially from Mexico and Muslim countries. This influx, Huntington points out, is different from any past immigration, not only in its huge numbers, but particularly in the attitudes of the immigrants. In one study children who had been born in America of Mexican immigrants were asked whether they considered themselves to be primarily Mexican or primarily American. Only a single-digit percentage chose American. Radical Mexican-American groups such as La Raza (The Race) agitate to reinforce resistance to American loyalty. So do such policies as foreign-language ballots, ethnic studies programs, bilingual education, and dual citizenship.

Muslim immigration, though considerably smaller than Mexican, presents an even greater challenge to assimilation and patriotic allegiance. Under sharia law it is permissible for Muslim husbands to beat their wives, apparently a favorite domestic pastime in the Muslim world. Under American law physically abusing another person is assault and battery subject to criminal prosecution. Divorce under sharia law provides that all the Islamic husband has to do to get rid of a discordant or boring wife is to intone three times, “I divorce you,” “I divorce you,” “I divorce you.” The job is done, and out the door she goes. Muslims in America wish sharia law to apply to such matters here, not American law.

Attempts to criticize, or even discuss, assimilation or non-assimilation in relation to such issues as these are typically answered by resort to a politically correct “buzzword” attack. Those raising such issues, even though their resolution is essential to the preservation of this nation, are accused of “racism,” “xenophobia,” “ethnic profiling,” and the like. What those who raise such questions are really guilty of is thinking about the future of their country. Columnist, author, and commentator Mark Steyn reports that to exhibit an interest in these effects of immigration is to risk being called “if not a ‘racist,’ at least a ‘nativist.’” And, Steyn observes, there is no public forum available in which to discuss the harmful effects of immigration.

True multiculturalism is an inherent American achievement. Americans don’t need sanctimonious advice, judicial coercion, or the intimidation of PC buzzwords to know how to reconcile immigrant cultures with American loyalty. It’s in our blood. So long as it’s not driven from our minds. Multiculturalism has become a device of the Civil War by which to incite the elements of diversity, victimology, and super-equality into the disorder of a nation degenerating into contentious factions of bickering tribes. The purpose is to destroy common values and pave the way to power for those who command the victorious rebels and intend to rule the defeated remnants.

The Morality of Guilt

To prevent public understanding of the destructive effect of multiculturalism it is necessary to keep the American people soaked in guilt. We must be made to feel personally responsible for every blemish, real or imagined, in America. Each of us, the fierce grievance industry insists, must feel guilt for the suffering “victims” of America’s misdeeds. And guilt must be assuaged, atoned for. British psychiatrist Anthony Daniels observes that to accomplish this we are socially conditioned to continually seek victims to receive “our virtuous, which is to say conspicuous, compassion.”

British author and critic C.S. Lewis perceives that those “who torment us for our own good” will never cease, for they do so “with the approval of their own conscience.” Social workers, when faced with cuts in staff or service, have been known to wail plaintively on behalf of their charges, “But they depend on us.” Unfortunately, that is all too true. The helper moves ever closer toward a role of keeper or warden, and those helped toward the status of wards or inmates. The warm milk of compassion turns to sour condescension toward those in need.

A few American writers, preeminent among them Lionel Trilling, a professor at Columbia University often termed the leading literary critic of his time, recognized this tendency early on. Writing at Trilling’s centenary, the American professor, writer, and scholar Gertrude Himmelfarb notes his concern, as he put it, about “the dangers of the moral life itself.” Trilling warns of what often follows once we have made our fellow men objects of our “enlightened interest.” The tendency is then to make them “the objects of our pity, then of our wisdom, ultimately of our coercion.”

Juan Williams, a senior correspondent for National Public Radio, and a commentator on Fox News, has written a remarkable book Enough: The Phony Leaders, Dead-End Movements, and Culture of Failure That Are Undermining Black America—and What We Can Do About It. (How’s that for a title?). The book is about Bill Cosby, the actor and comedian, who has taken to “telling it like it is” to his compatriots in black America and to their white benefactors as well. But Cosby (“The Cos” as some call him) had not written about his perceptions of what is wrong with black society and its relationship to white liberals, though he speaks of his insights most eloquently. Williams’ book is the written story of what The Cos is saying about the “Phony Leaders, Dead-End Movements, and Culture of Failure” that he refers to in his sub-title.

In 2004 Cosby was invited to speak to a black-tie crowd celebrating the fiftieth anniversary of Brown v. Board of Education, the Supreme Court case that declared separate schools for whites and blacks unconstitutional. The Cos astonished his audience of chattering glitterati. He spoke of the lack of education, drug dependency, drug dealing, illegitimacy, the high black population of prisons, and similar problems of the black ghetto. The audience glowed in anticipation. But, Cosby quietly informed them, those conditions cannot be blamed on white people. The audience was shocked. Those conditions are, Cosby continued, the result of an “underclass culture” that is promoted by their own black leadership in collaboration with guilty white liberals. He emphasized this truth by rejecting the “righteous guilt” of whites and insisting that blacks must shoulder responsibility for themselves.

Cosby noted that many in his jewel-bedecked audience of benefactors were proud of being “not racist.” This they sought to prove, he said, by supporting programs designed to ameliorate the conditions of the “underclass culture,” and so relieve their “white guilt” in causing such conditions. The Cos revealed why black civil rights leaders maintain the same “tired rant” about the guilt of white people and the power whites wield over blacks. Blacks, Cosby told them, must be portrayed as “hapless victims” who need white assistance built on white guilt so as to maintain the status of the black leadership.

Writing in National Review, another black commentator, the former Berkeley professor John McWhorter, digs at the roots of the welfare system from which these events have grown. President Johnson’s “Great Society,” he finds, had the unanticipated result of helping to destroy the black family. The integrity of a proud and vigorous black community that had existed even under discrimination was extinguished. The system of welfare implemented in the sixties as the Great Society, McWhorter says, “deep-sixed” these “struggling but stable” black urban communities. Welfare turned those communities into “lawless black inner cities” plagued with drug-related teenage killings, where hardly anyone has a real father. The black people of America’s inner cities have suffered the most severely. They have been herded into a new plantation of the mind, a new servitude of the soul.

For those supposedly benefited by these liberal programs the true self melts into the sweet anonymity of the tribe, freed from individual responsibility. The struggle for selfhood is over when the substitute self of collective identity is handed out free. The aching spirit no longer suffers the need to form its own values and purpose when it can live on tribal excuses. When allegiance to God and country has been debased or erased, the catechisms of tribal identity can seem welcome and attractive.

The Beast

After Hurricane Katrina struck New Orleans in 2005, large numbers of people were discovered stranded in the worst hit parts of the city, unable to get out, and unable to help themselves in any effective manner. Their condition was widely attributed to poverty. Charles Murray, political scientist, author, and commentator, writes that what was discovered in the dregs of Katrina was not poverty, but an underclass. Murray’s book Losing Ground, published in 1984, was an early dissection of the welfare state and the damage it does to those purportedly helped.

Murray finds that those existing at the end of the trail that starts with compassion often fall into a permanent underclass. What was discovered in New Orleans after Katrina, Murray affirms, is not poverty, but the underclass itself. It was the same conditioning not to work or be productive, and becoming dependent and subservient, that Bill Cosby, John McWhorter, and others have also described. The lethal ingredient in the formation of this underclass is the fatherless children of broken families who want no part of society. They have no concept of what society is. The high unemployment rate of underclass youth, Murray finds, is not caused by discrimination, by lack of jobs, or by any similar marker of deficiency. It is caused by the absence of a social structure that would require members of this underclass to work, to keep a job, and to support their families.

Murray, an American Enterprise Institute associate, finds that the commanders of welfare in America screen the existence of the underclass from their consciousness by calling it poverty.

When the welfare contagion becomes too odious to ignore, these commanders of compassionology gather on the ramparts of denial. They announce an outwardly composed and seemingly rational search for the “root cause” of what went wrong: poverty, illiteracy, greed, oppression, discrimination, and bigotry. These and other “roots” the welfare commanders dig out of the acid land they have despoiled to displace blame from themselves for the lives their policies have ruined. The disease is diagnosed as poverty. The prescription for a cure is always the same: more money, more counselors, more staff, and more generals for a renewed “War on Poverty.” The thought that the war itself might be the root cause is an apostasy that cannot be tolerated.

Murray asserts that few poor people who are not part of the underclass need help to get out of poverty. To the contrary, poor people who enter the work force and keep a job do not remain poor. The underclass is not constituted to furnish to its fatherless young the incentive to begin the process: to plan, to get to work on time, and to become responsible citizens. Claudia Anderson, managing editor of The Weekly Standard, notes the near disappearance of marriage in the underclass. When marriage disappears the planning, sacrifice, and self-control of a committed relationship, which she terms “the lifeline out of poverty,” are lost. The result is that, “The underclass is hardening into a hereditary caste.”

It is almost as though the masters of anti-American animus and revolution had engineered this hopeless undercaste, breeding crime and brutality, to prove their own disbelief in humanity. Just look at them, the young people in the inner cities, throwaways at the end of the trail of compassion cycled through to multicultural tribalism. The worst of them roam in gangs seeking stores to rob, women to rape, old people to beat up or kill, drugs to enhance the thrill. These are the underdogs of the hardening undercaste. They run in packs, ice eyes set in frozen faces, capable of anything.

6. The Family Under Siege
Love and Marriage

“Love and marriage, love and marriage / Go together like a horse and carriage.” The popular song of bygone days expresses what was once the universally expected fruition of the sex drive. That was back when sexual intercourse was commonly called “making love.” Now it is more likely to be referred to as “having sex.” Something like having a hot dog and a Diet Coke. Making love is a tender and personal event. It leads to commitment, forms a family, and generates children. Having sex is an impersonal and passing indulgence. How people treat each other in the one form or the other says which of the two is happening, an animalistic urge or a human desire.

English philosopher and author Roger Scruton defines desire as desire for a person. It is not lust for “an object in the physical world.” Desire for a person is a self-conscious union where the partners meet “eye to eye and I to I.” To fulfill that desire, Scruton observes, requires reciprocity and commitment. It takes away a significant degree of freedom and imposes responsibility. The responsibility is to a structure of law and society centered on the traditional family, headed by two married parents who conceive and raise children. That is the conduit of civilization. That function cannot be effectively carried out when promiscuity and pleasure replace the core of family discipline and loving care.

The traditional sacrament of marriage with its oath of fidelity “until death do us part” takes place before family, friends, and God. The “ties that bind” are tight and secure. Disciples of the sixties, converted to the pagan rituals of Dionysus, see the traditional vows of marriage as irrational and outdated moral restraints imposed upon their new ideal of sexual freedom. The “horse and carriage” phrase of the old song underscores what has happened. The indulgence of “free sex” has effected a fundamental change in the sexual manners and practice of the country. Marriage, like the horse and carriage, many now believe, is antiquated if not already obsolete. Marriage reduced to a prop of romantic musicals has nothing to do with daily life. The heirs of the sixties demote the obligations of marriage to an option to be considered only if convenient.

As the traditional grip of the church on the institution of marriage weakens the state takes over. Grounds for divorce become progressively more liberal until finally reduced to the nub of its logic, that divorce should be “no fault.” Just get me out of here so I can find my natural talents and capacities and be myself. About half of American marriages now end in divorce, with many of the divorced indulging in what some refer to as the “serial polygamy” of successive marriages and divorces. The resulting “serial wives” have been known to compete with each other in the splendor of their successive weddings.

The deterioration of the American family first became too severe to ignore in the inner cities. That was when the “Great Society” of Lyndon Johnson began to interfere with family development by offering financial assistance to unmarried mothers. The unanticipated consequence of that good intention was to subsidize the natural inclination of men to be free of family responsibilities. Unmarried mothers abandoned to raise their children alone formed the nucleus of a new underclass, with the government acting as surrogate father to keep them from utter poverty. The blighted life that has resulted for the children, and most everyone else living the ghetto existence, is depicted in the preceding chapter. For a long time it was tempting to think of this as just a ghetto problem, a poverty problem, or even a racial problem. But surely it didn’t affect any of “us.”

Then the movement of women into the workforce by the millions left a huge additional cohort of children orphans for a day, five days a week. Piled on top of that is the “genderist effect,” reviling womanhood and motherhood as but one more aspect of male tyranny, a form of slavery. The sociopathic results of these forces have been severe throughout society, though less so elsewhere than in the inner cities. It is, after all, the suburban kids from “good homes” who are out there soaking up hip-hop and “screwing their heads off.” And, as one distraught mother lamented, “there’s nothing I can do about it.” Maybe the kids are simply enjoying the “safe sex” their schoolteachers advise them about in explicit terms in sex education classes.

It’s Only Sex

There had been communes of free sex and orgiastic practices in the past, such as the hippie movement in San Francisco and elsewhere, with its nocturnal cry, “Let’s get naked.” Most of these soon disintegrated into jealousy and disunity, some into violence and even death. An early indication that the free sex of the sixties might be something more profound was the reaction to its consecration in the mud-splattered orgy of Woodstock. Rather than being dismissed as an unfortunate juvenile event, Woodstock was, and still is, recalled with ecstasy and praised for emulation by such organs as Mother Jones and the New York Times. The elite, joining this nucleus of the new Civil War then taking form, were elated to follow that soggy precedent. Hillary Clinton, something of an expert in these matters, suggests a Woodstock memorial as a permanent celebration of that salacious event.

Advocates of self-liberation proclaim that all “irrational” barriers to self-expression, including marriage, should be swept away as unjustifiable “repression,” “prejudice,” or “taboo.” As the self-indulgence of free sex is injected into the larger population the obligation that a man and a woman feel toward each other, if they do elect to get married, diminishes. Leisure, egotism, self-worship, and ever-expanding realms of pleasure rise to compete with the obligations of the marriage vow and the needs of a family.

Roger Scruton in his book Sexual Desire: A Moral Philosophy of the Erotic laments that in the end “repression was identified as the only true sexual sin.” Once that credo takes hold no sexual impulse can justifiably be denied or repressed. This ties into the official policy of the Netherlands: “If it can be done it must not be prohibited.” The responsibilities of children and family melt easily into a puddle of euphoric, if often illusory, self-fulfillment. The entire range of reactions between men and women changes.

How does a young woman indoctrinated, if not forced, into the contemporary college sex orgy of hookups, relationships, multiple sex, and similar “liberated” practices ever escape it? How does she work her way out of feeling what she has been through was just “all icky” to an articulate formulation of a meaningful life? How can she switch from having sex to making love? How does this graduate of officially imposed sluttiness discover how to become a loving wife and a caring mother? How, if she achieves a family, does she instruct her own young daughters? When does she realize, as Jennifer Roback Morse puts it from her own experience, that such a “liberated” life as that is “not a jolly time?”

English journalist and author G. K. Chesterton feared the results of sexual liberation as early as 1926, when he foresaw an “erotic religion” that both “exalts lust and forbids fertility.” He predicted that the next “great heresy” would be “an attack on morality, and especially on sexual morality.” Chesterton’s prophecy, even in the face of the gathering menace of Soviet Communism at the time he wrote, was that the “madness of tomorrow” would be not in Moscow but in Manhattan. He did not live to see his prophecy come true in the campus riots of the sixties and all that followed.

Then the gay rights ingredient was added to the sexual stew.

Gay Marriage

The gay rights movement began by stating that its goal was to achieve toleration. It was argued that adult persons should be free to engage in sexual activities of their preference in privacy without social opprobrium or government interference. A call for individual rights, for toleration, for granting people a broad license to order their own lives appeals to the American spirit. The call by gays for toleration was widely perceived as reasonable. Subsequent experience ranging from homosexual priests in the Catholic Church, to persecution of the Boy Scouts, to grade school courses redefining the family raises broader issues. The evidence grows that something more than toleration is the true aim of the gay and lesbian movement. Insistence on enforced conformity to their lifestyle seems to be the more likely course of development.

In San Francisco homosexual men formed numerous establishments, euphemistically called “bathhouses,” where they could gather for promiscuous and impersonal homosexual sex. As these bathhouses were becoming popular the operation of one of them became involved in a lawsuit asking that it be closed down. The judge, innocent fellow that he was, ruled that the bathhouse could remain in business if it ceased allowing promiscuous sex on the premises. The San Francisco Chronicle, a proudly avant-garde publication, wailed editorially that, “Promiscuous sex is the whole point of the thing.” In San Francisco there is a homosexual cult that calls itself the Sisters of Perpetual Indulgence, which pretty well capsulizes what a bathhouse culture would aim for. An unintended, but predictable, consequence has been to greatly facilitate the spread of the AIDS epidemic.

The Boy Scouts of America are a paradigm of what to expect from an ascendant homosexual movement. The Scouts encountered the essence of gay “toleration” when gay men claimed the right to become Scoutmasters, even though the Boy Scouts believe that homosexuality is wrong. After many years of strenuous attacks and harassing lawsuits against the Boy Scouts by the gay movement one of the cases reached the United States Supreme Court. In Boy Scouts of America v. Dale the Court held that the Boy Scouts could dismiss a Scoutmaster who had openly declared himself a homosexual and had become a gay rights activist. The Court based its ruling on the constitutional right to freedom of association. The Scouts, the Court said, have the right to choose leaders whose example would reinforce the beliefs of their organization.

The concern of the Boy Scouts is how adult leaders—authority figures in the current parlance—influence the young boys in their charge. The apprehension is that homosexual Scoutmasters will abuse their authority by setting the wrong example, and will attempt to abuse the boys as well. This is not a new anxiety on the part of Scouting. There are periodic cases in which a homosexual Scoutmaster, not known to be so, has been dismissed after he misjudged his target and was reported. The behavior of covert homosexual Scoutmasters was an accurate portent of how homosexual priests would act in the Catholic Church.

Stanley Kurtz, a fellow at the Hoover Institution and a senior fellow at the Ethics and Public Policy Center, has done extensive studies on the effect homosexual Catholic priests have had on the Catholic Church. Following the upheavals of the 1960s and 1970s numerous homosexual priests were ordained in the Catholic Church, and in time came to dominate some seminaries. Kurtz cites an account of Jason Berry, a liberal Catholic, who finds that by the 1970s and 1980s gay priests would visit such seminaries “on the make” and would also frequent gay bars. Kurtz reports that such acts have been part of a “deliberate subversion” of the basic moral teachings of the Church. This has also led to the abuse of young boys by gay priests.

The facts relating to gay activity against the Boy Scouts and in the Catholic Church are not in dispute. We know what has happened to subvert the teaching of the Church (further related in Chapter 15). It has happened to other institutions as well. The gay-lesbian movement began as a “live and let live” plea and most Americans honored that appeal. From that modest base homosexual movements have gradually transposed their goals into a militant determination to make fundamental changes in society to replicate their own image. Similar acts motivated by similar intent attend the gay marriage issue.

Even with the experience of the Catholic Church and the Boy Scouts before them gay marriage advocates will contend that only by legalizing gay marriage can the predatory practices of gays be “tamed.” In his investigations Stanley Kurtz finds no support for that justification, nor any sincerity in the intent expressed. To the contrary, advocates of gay marriage frankly strive to effect “‘subversion’ of the idea of monogamy.” Kurtz cites the goals expressed by gay marriage advocate Andrew Sullivan. Sullivan begins with an attack on the Church rule requiring celibacy of priests. He expands that attack into a broader assault on the Church’s teaching regarding non-marital sexuality in general.

The experience of the Church, Kurtz finds, has shown that gays joining a traditional institution consciously attempt to subvert its sexual mores. Kurtz fears that if civil unions and gay marriage are adopted in this country their effects will “percolate for years” before the damage to traditional institutions becomes apparent. Then it will be too late. To elevate gay marriage to the same level as traditional marriage does not merely create an equal legal status for pairs of homosexuals. It destroys marriage. That is the cutting edge and purpose of the homosexual drive for same sex “marriage.” This drive has been greatly assisted by the Supreme Courts of several States.

The Massachusetts Supreme Judicial Court, in a 4-3 decision rendered in the fall of 2003, labeled marriage an “evolving paradigm.” Speaking for the majority, Chief Justice Margaret Marshall held that there is no “constitutionally adequate reason for denying civil marriage to same-sex couples.” A Wall Street Journal editorial following that ruling terms the Massachusetts decision neither an argument nor a debate: “It is instead a unilateral declaration that the assumptions and values that have defined one of civilization’s oldest and most vital institutions—marriage—should be tossed out the window.”

The court’s decision was reached through the influence of politically powerful gay rights activists who knew they could not get such a revolutionary measure past the rigors of public debate and legislative consideration. They hope to use the courts to bypass democracy on “gay rights” just as the abortion rights activists did a generation before. The subterfuge in the 1973 abortion case was to persuade the United States Supreme Court to declare that abortion is a woman’s “constitutional” right. “Can anyone doubt,” the Journal asks, “that the Massachusetts High Court has started another Thirty Years War?”

In Massachusetts, and subsequently in California and Connecticut as well, twelve judges, in three 4-3 decisions, assumed the power to make their own policy contrary to established law and practice. The “Thirty Years War” predicted by the Wall Street Journal is on in earnest. The Supreme Courts of other States have since joined the parade.

Maggie Gallagher is co-author of The Case for Marriage and President of the Institute for Marriage and Public Policy. She describes these judicial decisions as “a huge nuclear bomb dropped into the culture wars.” Gallagher fears that these decisions point toward a judicial determination that those who oppose the “right” of gays to marry will be put in the same category as those who oppose equal rights for blacks, Hispanics, or other nonwhite races. Christians, Jews, or any other religion that believes marriage is designed to be a union of a man and a woman will then be placed in the category of “racists.”

That is where matters stood on gay marriage until the election of November 4, 2008. There was on the ballot in California a constitutional amendment that reads: “Only marriage between a man and a woman is valid or recognized in California.” At last the people themselves had a chance to speak. They spoke; the amendment passed and is now part of the Constitution of the State of California. The people of California voted an unambiguous repudiation of the California Supreme Court’s effort to force gay marriage on them by judicial fiat. At this writing the matter is once again before the California Supreme Court, this time to judge whether the people of the State of California have the constitutional right to change their constitution.

Maggie Gallagher calls gay marriage a revolutionary social upheaval that would “gut marriage of its central presumptions about family” in order to accommodate a “handful of people.” Gallagher contends that same sex marriage would be legal recognition that “the desire of adults for families of choice outweighs the need of children for mothers and fathers.” Gallagher observes that civilized cultures over the millennia have uniformly directed the erotic desires of men and women into that “relatively narrow but indispensably fruitful channel of marriage and family.”

The “handful of people” that Gallagher sees as benefiting from gay marriage, estimated at two to three percent of the population at most, are supported by a substantial gay, as well as non-gay, elite of revolutionists that greatly enhances gay power. As part of the Civil War directed toward general destruction of American institutions the gay marriage issue is an excellent wrecking ball for that purpose. Should gay marriage proponents succeed, Harvard law professor Mary Ann Glendon foresees mounting discrimination and intolerance. Not against gay rights, but intolerance by the gay and lesbian movement against any who dissent from their agenda in any manner.

Glendon predicts that any person or institution, including traditional churches and their congregations, “will be hit with lawsuits” if they refuse to compromise their faith and acquiesce to gay and lesbian demands. According to Maggie Gallagher, Marc Stern, general counsel for the American Jewish Congress, fears the homosexual attack against religion is on track for a “train wreck.” With both sides “looking for Armageddon” he fears that it will be “a very dangerous train wreck.”

Professor Glendon cautions that to conceal their true goals partisans of gay marriage will continue to use “the language of openness, tolerance and diversity” while practicing quite the opposite. Amherst College Professor Hadley Arkes warns that when marriage loses its integrity as a concept, it “will also lose its special standing as something to be esteemed and sought.” Commentator Anne Morse observes in The Weekly Standard that when Scandinavians made same sex marriage de facto law “the rate of heterosexual marriage plummeted.” As a result, she reports, when Scandinavian children “watch uncommitted adults wander into and out of their homes” they pay “a heavy emotional price.”

Gay marriage is not about gay rights. It is about marriage.

And Then?

Opponents predict that gay marriage will not only destroy monogamous marriage. It will also lead to polygamy, polyamory, triple parenting, communal promiscuity, and virtually any other sexual arrangement or practice that can be imagined. Stanley Kurtz cites such writers as self-described “gay leftist” Richard Goldstein in an article in Village Voice, and libertarian Jacob Sullivan in the Washington Times. Each writer sees polygamy as a logical and probable next phase in a further transformation of the sex drive following the establishment of gay marriage. Western culture, Kurtz points out, has historically treated polygamy “as an offense against society itself,” a repudiation of the concept of fidelity in monogamous marriage.

The term polyamory is not found in older dictionaries. It had to be newly coined to keep up with the times. Polyamory might be described as a sort of hyper-polygamy that consists of “a bewildering variety of sexual combinations,” as Kurtz describes it, inspired by the gay marriage movement. Kurtz has identified “triads of one woman and two men; heterosexual group marriages; groups in which some or all members are bisexual; lesbian groups; and so forth.” These groups do the sorts of things the original hippies did when they had the urge to “get naked” and let the games begin. It is claimed that the aim of polyamorous groups is to form stable and loving relationships. In fact the membership of such groups, not surprisingly, tends to be fluid rather than stable, promiscuous rather than committed, and jealous as often as amicable.

It is clear that in a world accepting gay marriage it would become increasingly difficult to deny recognition and equal standing to almost any organized sexual arrangement claiming it.

Florence King, author and National Review columnist, foresees that the aberrant practices of polygamy, polyamory, and even incest will eventually be accepted through debates about such practices on TV talk shows: to talk about it is to predict its coming. The most sordid acts can be made to seem commonplace if “debated” long enough. All that will be necessary is to repeat the mantra that “the vast majority” of polygamists live quietly, contribute to their community, and practice family values. That “hypnotic phrase,” King predicts, will make us feel secure in approving the practice. Next will come the incest lobby, “debated” endlessly with current and historical examples. Finally incest will be covered by its own “hypnotic phrase,” which in this case will be, “It’s already happening.”

The Dutch may have gone farther toward total sexual liberation than anyone else. There, Kurtz reports, all parties to the controversy over gay marriage—gays as well as the political left, right, and center—take gay marriage “to signify the replacement of marriage by a flexible and morally neutral range of relationship options.” Kurtz refers to the Dutch lesbian intellectual, Xandra Schutte, who prophesies that gays will be the trendsetters in severing the connection between marriage and parenthood.

In the United States avant-garde professors of “family law” have joined a movement toward the abolition of legal marriage. University of Utah law professor Martha Ertman proposes the substitution of a corporate-like system of limited liability contractual relationships. Ertman suggests that the increased openness of homosexual partnerships is already “slowly collapsing the taboo against polygamy and polyamory.” But it was National Gay and Lesbian Task Force policy director and University of Michigan law professor Paula Ettelbrick who laid it on the line. She did so, ironically, by opposing gay marriage. Why? Because, she says, allowing gays to marry would force their “assimilation” to American norms. “Being queer,” as Ettelbrick puts it, leads to a transformation of sex and family, and, she hopes, to a transformation of “the very fabric of society.”

Some see the marriage issue as a battle for the word. Whoever captures the word wins the war. If marriage is transformed into a unisex institution it will then be publications such as People magazine, Mother Jones, the New York Times, and newsstand pulps that will speak the public language. A mother and father attempting to raise a family will be on the fringe, pleading for recognition and support. “To lose the word ‘marriage,’” Maggie Gallagher warns, “is to lose the core idea any civilization needs to perpetuate itself and to protect its children.” Civil War militants who advocate gay marriage and all that follows blatantly proclaim that destruction of that core idea is exactly what is intended.

The evidence is that the sex cults of the Civil War will “push the envelope” until there are no sexual prohibitions of any kind left for depravity to push against. As soon as one novel sexual deviancy is recognized, another will demand “equal treatment,” “social justice,” and so on. What these people are saying is that the sexual revolution will be complete only when the entire society—men, women, and children, fathers and daughters, mothers and sons, communes and polyandrous “households,” any combination you can think of—all treat each other as objects to be exploited for instantaneous pleasure. The nation will be one gigantic “bathhouse.” The concept of a “loved one” to be treated as a person, to be cherished, loved, and respected above all others—to be seen “eye to eye and I to I”—will be relegated to a museum of antiquated moral concepts, exhibited for a good laugh on a rainy afternoon.

Children

In the mélange of sexual relationships likely to develop should gay marriage come to be widely practiced, have children been taken into account? How does a growing child manage these circumstances? Where is a healthy role model to be found? Where is loving care to be had? If there is no more mommy and daddy for the youngest, and no mom and dad as they grow older, no husband and wife, and no home, what sort of values, what kind of behavior, what level of aspiration might the child develop? How the new sexual rights of adults affect children is considered in Freedom’s Orphans: Contemporary Liberalism and the Fate of American Children, a book by David L. Tubbs. A professor at King’s College in Manhattan, Tubbs finds that the extensive indulgence of personal liberties, as sanctioned by the courts and advocated in the universities, centers on adults and ignores the effects on children.

Judicial decisions prohibiting the regulation of what is delicately termed “adult” literature, films, or TV—that is, pornography—have been “carved into the law of the land.” Other adult preferences, such as licentious sexual conduct and easy divorce are sanctioned with no consideration of their effect on family life. Welfare assistance encourages fathers to leave their families while they enjoy their licentious freedom elsewhere. One result is the alarming increase of single mothers. Tubbs argues that “a benign and omnicompetent welfare state” cannot fill the role of absent parents. Additional adult “rights” threaten even more damage to family and children. As it often is, California is once again in the lead.

Legislation has been enacted in the Golden State designed to ban in public schools the use of terms that might seem offensive to gays, lesbians, cross-dressers, transgenderists, and so forth. Among the terms that frighten or anger these classes of people, and are now to be obliterated, are “mom and dad,” and “husband and wife.” California’s Republican Governor, Arnold (the Terminator) Schwarzenegger, signed the bill into law, thereby terminating the State’s interest in maintaining healthy families as a basis for the nation’s future. Karen England, executive director of the Capitol Resource Institute, sees that with this decision the Governor “has told parents that their values are irrelevant.” Randy Thomasson, president of Campaign for Children and Families, says this legislation means that children as young as five years old “will be mentally molested in school classrooms.” In his view the Governor and the California Legislature have made every California school into “a homosexual-bisexual-transsexual indoctrination center.” There is no requirement that protection or toleration for Christian values held by a majority of Americans also be recognized in the schools.

Midge Decter notes that this legislation eliminates “the distinction between heterosexuality and homosexuality” just as court decisions have done by judicial fiat. After that there is nothing to prevent the courts from according the same legal status to adults and children. Why not consider pedophiles, Decter suggests, as “human beings with human feelings?” And if children might enjoy their attention are not the children entitled to exercise their sexual preferences as much as anyone else? “And after pedophilia,” Decter asks, where can there be found a “truly telling argument against incest?” Why not father and daughter, a young child (of either sex) and an old man, two brothers or two sisters? Dr. Jennifer Roback Morse reports that in Israel in December 2005 a British woman and her pet dolphin were united in matrimony. And you thought Leda and the Swan was an odd match?

Among the specimens of advanced thinking offered in California legislation are provisions inviting school children to choose the role most fitting their own felt proclivities. Pupils of all ages may now choose to use either men’s or women’s lavatories or locker rooms, depending on the “gender” with which they wish to identify. Each child is left to find out which sexual orientation or variation is “right” for him or her. This is to be a matter of personal choice. But the “choice” is to be exercised only after intensive indoctrination in the new vocabulary of the politically correct classroom. The role models for school children are now to be drifting images of cardboard people playing strange and unnatural parts of every sort imaginable on an ever-changing stage. Mothers and fathers have been legislated out of existence in this brave new world.

The fatherless children of the inner cities more than likely foretell what kind of society is to be expected from this condition of institutionalized child neglect. It will produce as its victims growing numbers of neglected children, frustrated and angry. Washington Times columnist Cheryl Wetzstein points out that by 2007 the number of out of wedlock births had reached 38.5 percent and could reach 50 percent in a decade or so. Wetzstein quotes Charles Murray on single mother families. They are, he says, “a net drain on the community’s resources,” and in large numbers they would “destroy the community’s capacity to sustain itself.” Wetzstein asks what the nation will look like if half its babies are born “without a legal bond to their father.” The pattern is already laid down.

The children of this regime are caricatures, empty of dignity or purpose, thrown helplessly into a land of perpetual indulgence. These innocent creatures, loveless and alone, are destined to become angry and lost. They will be left to strike out blindly at a society that leaves them in abject neglect while the “adults” in their lives enjoy their nameless, faceless rutting. They become the underdogs of the undercaste, cloned spontaneously in every inner city of America, the same ice eyes in frozen faces, capable of anything except the warmth and security of a loving relationship.

The American family is disintegrating like a painting overexposed to the sun. Slowly cracking and peeling, finally there is nothing of the original design left. Neither love nor children nor society itself can thrive in a brothel.

III. Mother Earth’s Angry Armies
7. Nature’s Uses
Faces of Nature

The worship of nature to the detriment of humanity is a central dogma, and a powerful weapon, in the Civil War to bring America down. How nature is perceived has been a singular dividing line between the developed world of cities, science, and technology, and the primitive world of animalism and totemism. Nature, even as it is recognized to be massive and basically unchangeable, has been treated in the West as manageable to ameliorate its worst ravages. Western culture has assumed that nature is a logical and ordered structure that can be explored and understood through the application of rational thought and technology. And nature’s resources have been gratefully accepted and utilized for the benefit of humankind.

In your imagination leave the city, suburb, or farm where you live and move into one of the caves of Lascaux in southwestern France, or Altamira in Spain, to capture a feeling for the lives of the people who once lived there. Think about the food you woul have to eat, sometimes raw, often half spoiled, or even rotten if you are hungry enough. The clothes you would wear are rough skins rarely if ever cleaned. Try to imagine the smell of a cave home from raw sewage and lack of bathing or clean clothes, a stench intensified by oppressive heat, perhaps moderated somewhat in the cold.

The seventeenth century English philosopher Thomas Hobbes describes life in nature unadorned by civilizing instruments and institutions. It would be an existence, he said, in which there are, “No arts; no letters; no society; and which is worst of all, continual fear and danger of violent death; and the life of man, solitary, poor, nasty, brutish and short.” Primitive peoples, lacking the ability to mange nature, tried to pacify nature. They invented ideas and worshipped icons to mitigate the wrath of wild beasts, or the forces of rivers, storms, floods, and drought. They sought to propitiate the spirits of animals, or the gods of sun, wind, forest, or rain. Or these ancient people prayed to the gods of clubs and stones for victory in the never-ending wars of primitive life.

Harvard archeologist Steven A. LeBlanc writes that there is no archeological evidence of a sylvan idyll that was perverted by “warlike, modern imperialists” as Romantic idealists like to fantasize. Archeological excavations confirm that prehistoric warfare was “common and deadly.” This seems to apply to all time spans and geographical regions in an era when the average person died before the age of 40. In his book Constant Battles LeBlanc estimates from examination of ancient gravesites that as many as a fourth of the male population died in battle, a proportion to match or even exceed the carnage of the Twentieth Century wars. The cause of this small scale but continuous warfare is not clear. LeBlanc speculates that it might have been a search for female mates, or perhaps a “genetic selection for more generalized aggressive behavior,” that fueled those incessant conflicts. Yet the vision of an ideal state of nature, sylvan, peaceful, abundant, and unspoiled by human creatures endures in the Romantic imagination.

In the context of the present Civil War nature is viewed by one side as the source of raw material to be used bountifully and judiciously for the betterment of humankind. On the other side the vision of an idealistic nature, exploited and ravaged by man, is used as a bludgeon against those defending the American economic and cultural system.

Conquer or Submit

Even primitive peoples struggle for something more than mere capitulation to the fear and danger of nature in the raw, or to the caprice of the gods. Covering the cave walls and ceilings at Lascaux in South Western France there are marvelous painted images of bison, horses, and deer done some 30,000 years ago. Similar paintings are found in other caves in the same region, and at Altamira in Spain. The cave paintings may be religious totems done by people who believed that events in nature occur through the intervention of gods of the rain, rocks, trees, or other manifestations of nature. Some think the paintings were inspired to quiet echoes in the caves that the dwellers there believed to be spirits of the beasts they had eaten in order to survive, or killed in self-defense.

Others believe the paintings were done simply as art, as creative works of beauty to counter the harsh life of the cave-dwelling hunter society. Those extraordinary animals at Lascaux and Altamira are so alive they seem ready to rush past as you watch, or jump down and attack you. Perhaps they were done in defiance of nature’s destruction by creating beauty inspired by nature, yet beyond nature. Perhaps the paintings assert a stretching of the spirit toward a better life to come, even in a creature as hard pressed as Hobbes describes, yet never quite willing to give up. The cave paintings can be viewed as a creative protest against a wretched existence, even a reprimand to the harsh nature the cave dwellers faced outside the shelter of their caves. These beautiful works are one step up the ladder toward civilization. But they hardly depict the idyllic existence the Romantics imagine.
To Romantics from the Eighteenth Century French philosopher Jean Jacques Rousseau to the visionary purveyors of today’s illusions of the grandeur of nature in its pristine state, nature is gentle and beneficent. It is only society that corrupts this goodness and causes people to do evil things. Is Rousseau himself an example? Even amongst the lower primates care is taken to assure survival of the young. Rousseau abandoned all five of his illegitimate children at a foundling home, which in those days amounted to a death sentence. However that may be, the Romantic legacy of Rousseau and others is to enshrine Mother Earth, Gaia as she is popularly called, as an object of worship. Gaia is a gossamer ideal of a lost existence and of superior forms of behavior to be recaptured through dissidence, protest, and revolution. Romanticism justifies destruction of what is in favor of what is to be. Romantic idealism is a rejection of the real for faith in submission to the unreal.

British born novelist and scholar Anita Brookner recognizes in her book Romanticism and Its Discontents Romanticism’s great and inherent deficiency. That is its zest for “breaking the old rules, but only incidentally establishing new ones.” The literary Romantic is the perpetual Don Quixote flourishing gloriously armored words, his lance of riposte at the ready, eager to conquer the windmills of his imagination. The militant Romantic actually believes the fanciful myths and goes out to destroy the windmills, even when he has no idea what he wants to build in their place, or how to compensate for their absent function.

Indeed, there is evidence that the adoration of nature among the more militant Green legions of the Civil War leads to destruction for its own sake in the name of Mother Earth turned Green. For the greenest of the Green this would require a purified earth devoid even of the tendentious and raucous noise of humanity. Driven by their belief in the purity of nature and the corruption of humankind, the choice of the modern Romantic, in the name of the perennial Green Gaia, is not to conquer, but to submit. There is in this a reversion to the pagan worship of objects of nature as though they were sentient creatures more worthy of respect and awe than anything human could ever be.

Conquest

On the opposite side of the battle lines defenders of the faith in America and the West, besieged though they may be, hold an opposite view. The predominate response of humankind operates from a base of curiosity, energy, and inventiveness. The wheel was invented, animals were domesticated, learning agriculture assured a more dependable food supply. Language, writing, and mathematics were formed. This and much else coalesced into the beginning of Western civilization in ancient Greece, though even the Greeks still recognized a hierarchy of pagan gods.

But if the Greeks had not invented implements or institutions capable of taking more than the roughest edges off nature in the raw, they did better with human nature. From the minds of their philosophers, historians, and poets there erupted an explosion of creativity that set the stage for the modern world. The Greeks, most significantly in Athens, developed democracy, cherished freedom, and produced philosophers without peer. They sought to tame the worst in human nature, not only through philosophy and reason, but also by their artistic creativity.

At Athens the great tragedies of Aeschylus, Sophocles, and Euripides, and the comedies of Aristophanes, were performed in a semi-circular stone theater below the Acropolis that still exists. These plays set before the men of Athens a morality by which to domesticate and control the savage elements in human nature. The message of the Greek tragedies is recognition of the boundaries of human action, of limits to violence, and of the need to rectify injustice and to assure retribution against those who violate those principles. That quest, never entirely successful, remains at the core of civilized experience. It is a philosophy, like that of Hobbes, which recognizes the evil inherent in humankind along with the good, and prepares ideas and institutions to deal with both good and evil.

The pure idealist who does not recognize evil, or the reality of nature in the raw, including human nature, is ill prepared to deal with strokes of evil when they fall, as they surely do. Rather than propitiate the spirits of inanimate forms in nature, the Western tradition from Biblical times to the present has been to go forth and conquer nature for the benefit of mankind. Every artifact of civilized existence is a result of that pursuit. That technological humanity can live in houses and cities rather than in caves or grass huts in small villages; that it can move about in comfortable mechanical devices rather than on foot or atop animals; that it has scientifically developed treatments for disease rather than having to rely on the ministrations of shamans and witch doctors; that scientists and engineers can create the enormous labor-saving capacity of the tiny computer chip; that astronomers can pursue an understanding of spiral galaxies, black holes, and the creation of the Universe—all of this and the rest of civilization is a result of patiently learning how to understand, to mitigate, and to exploit the potential of a raw and ferocious nature.

The venture of using nature’s resources for the betterment of mankind is opposed, not by every environmentalist by any means, but by those at the radical core of the environmental, animal rights, pagan religious, and similar movements. These assert that what science and industry have done is a blasphemy against nature and Mother Earth, an affront to Gaia that must be stopped. Why is it that the fruits of human endeavor are so rabidly opposed at the Green extreme? And what of it? Aren’t such views too extreme to be credible? Perhaps. Yet not infrequently it is persistent application of energy and dedication to a cause at the extremes that ultimately determines a movement’s course and purpose. A look at acts and statements of the greenest of the Greens should furnish insight into what is at stake in their Civil War engagements against human use of nature’s resources.

8. The Green Machine
Pious Genocide

A persistent characteristic on many fronts of the Civil War is the drive to destroy with no model for rebuilding. In that effort the Green Machine leads the way. The first proud achievement of the environmental movement was to achieve a ban on the use of DDT as a spray to kill malaria-bearing mosquitoes. That spared the mosquitoes but resulted in the deaths of millions of people from malaria.

That proud achievement still marks the onward march of environmental passion. Dichloro-diphenyltrichloroethane was first synthesized in 1874. It was not until 1939 that DDT was discovered to be toxic against mosquitoes, while harmless to human life. In the Pacific during World War II DDT saved the lives of an estimated 400,000 American servicemen who would have died of malaria. One of them might have been your father or grandfather. Use of DDT against malaria-carrying mosquitoes spread rapidly after the war and was effective in virtually eradicating that disease.

Author and Wall Street Journal commentator William Tucker describes its use in Sardinia. There an annual death rate of some 50,000 was reduced to seven in three years. In Sri Lanka an even higher death rate was reduced to 17. More remarkable benefits followed. In Third World countries before DDT was introduced there were an estimated three million deaths from malaria every year, with millions more sickened by the disease. By 1970 a research committee of the National Academy of Sciences reported that in just over two decades the use of DDT had prevented 500 million deaths due to malaria. Worldwide the death toll from malaria had been reduced to near zero. The Nobel Prize was awarded to Swiss chemist Paul Muller in 1948 for pioneering work he did to help bring about these astonishing results.

The subsequent banning of DDT that began in the 1970s was inspired by a willful misinterpretation of Rachael Carson’s book Silent Spring published in 1962. To some Carson’s book is the genesis of a beneficial environmental movement, and in some ways it has been that. Yet not only has spring turned out to be as cheerful and noisy as ever, and not silent at all, but use of the book has been taken far beyond anything advocated in it. What all too often lies beneath the soft green exterior of the Green Machine is demonstrated by its genocidal misuse of Carson’s work. Silent Spring investigates the effect on nature of a number of pesticides. Ten Years after its original publication DDT was plucked from its pages and made a villain by environmental activists seeking a cause.

Rachael Carson’s concern about DDT was simply its indiscriminate use as an herbicide by the U. S. Department of Agriculture. She did not advocate banning DDT, but only that reasonable regulations be applied to its use. Then it was discovered in America that in some mothers milk there were detectable “parts per million” of DDT. The green extreme seized the moment. A public furor was raised, and in 1972 the newly formed Environmental Protection Agency was asked to ban DDT. The EPA investigated the matter, evaluated the evidence, and its scientists could find no indication that DDT was harmful to humans. But facts do not interest the passionate activist. By the time the EPA’s scientific report could be issued the tocsin had been sounded, the brigades of misinformation had been marshaled, and environmentalists on the warpath had embellished the data with alarming “facts” of their own.

It was said that, in addition to polluting mothers milk, runoff from fields or ponds treated with DDT was causing eggshells of some birds to be too thin to last until the bird could hatch. The media were convinced, the presses rolled, the TV anchors rose up in indignation, the birds were sanctified, and the anti-DDT onslaught swept all before it until the political pressure became too great to be resisted. The EPA, against its own scientific evidence, issued orders banning both the use and manufacture of DDT in the United States. The effect here was not large, since other substances were available as alternatives.

Passionate environmentalists mounted a similar offensive against DDT in the Third World. Lobbying such international organizations as the U.S. Agency for International Development and relevant United Nations agencies, the anti-DDT movement persuaded or forced most Third World countries also to ban DDT. As though patiently awaiting the call, their high-pitched whining buzz signaled clouds of mosquitoes swarming back to the attack. “Predictably,” notes William Tucker, malaria made “a ferocious comeback.” Recorded malarial deaths in Africa are now the highest in history.

But wasn’t the globe overpopulated to begin with? What’s so bad about millions of fewer mouths to feed with limited resources? And doesn’t it affect mainly the poor who are a drain on society anyway? If these questions seem caustic or cynical, consider Paul Ehrlich’s answer. Ehrlich, author of The Population Bomb (that never went off), chastised those who would eradicate disease in poor countries as “death controllers” interfering with the “natural restraints on population growth.” Such policies, Ehrlich contended, would allow population increase so great as to destroy the earth’s ecosystem. Tucker recalls two types of demagoguery employed throughout history. One tells the poor the rich have too much money. The other tells the rich there are too many poor people.

So Ehrlich’s plea to ignore the “death controllers” and let mother nature get on with her population cleansing was accepted. The results have been effective. The World Health Organization estimates that from 30 to 60 million people have died of malaria since DDT was banned in most countries. That some 30 to 60 million, and counting, already dead from malaria since DDT was banned may exceed the record of Hitler and Stalin combined. The anti-DDT drumbeat became a death march to the rhythm of which humans, rather than mosquitoes, would be ushered to extermination. But, as Joseph Stalin once said, one death is a tragedy, a million deaths is a statistic.

On the thirtieth anniversary of the original DDT ban in the United States, Vermont Senator Jim Jeffords introduced a bill in the United States Senate calling for an international treaty that would further restrict the use of DDT. For those such as Senator Jeffords and other environmental greens who proclaim their sensitivity, compassion, and good conscience to assist those in need, there is a question to be considered. The effects of the DDT ban are undisputed, and the results are known. The anti-DDT movement is a death sentence to millions more still living, and who may wish to stay that way for a while. And for millions yet unborn. The German Nazis and the Soviet Communists issued their orders of extermination knowing what the results would be. Is what the anti-DDT movement has already done, and is still doing to those who die of malaria, different practically or morally from the work of the Great Dictators? If so, how is it different?

The image of mother’s milk—more sacred even than ice cream or apple pie—allegedly polluted by an insecticide had done its work. Environmental fundamentalists revel in their victory (far from the stench of rotting bodies), an achievement to inspire them to grasp for ever greener ambitions. It is simply unfortunate if there are, from time to time, incidental side effects.

PETA, ELF, and ALF

The full name of PETA, People for the Ethical Treatment of Animals, barely hints at what the goals of that organization actually are. Such groups as the Society for the Prevention of Cruelty to Animals, the SPCA, have long pressed for their humane treatment. It is kind and admirable to advocate that animals not be stoned, tortured, beaten, starved, or otherwise subject to pain and bodily harm. But “ethical” treatment? For animals? Ethics is a branch of philosophy or religion dealing with moral duty and obligation of human beings toward one another. Here it is the animals that are said to have a “right” to ethical treatment. Rights are normally identified and claimed by sentient creatures able to articulate and understand what rights are, and what the reciprocal obligations might be.

Ethics involves interaction of human beings with each other. What is ethical is defined by a society in which those affected participate. Only humans “have the capacity for free choice and the responsibility to act ethically,” says Tibor R. Machan, a Hoover Institution fellow, in his book Putting Humans First. Human rights include the reciprocal obligation to respect the rights of other human beings. Whatever “rights” may be attributed to animals are invented by human beings. Animals don’t know anything about the rights of others, or whether either they or the others have rights or not. But PETA people believe that animals are essentially the same as human beings. PETA stands ready as a diligent enforcer of the “ethics” and “rights” of their furry friends, with emphasis on enforcement. And PETA shows little compunction about the techniques it considers acceptable to compel compliance.

In a PETA ad spokesman Bruce Friedrich is credited with the following wish list: “It would be great if all the fast-food outlets, slaughterhouses, these laboratories and the banks that fund them exploded tomorrow.” To make such dreams come true PETA donates money to its offspring ELF, the Earth Liberation Front. ELF claims that its aim is to liberate the earth from the profit motive, the benefits of free trade, and the production of mere “things.” That is, the products that those who remain unliberated from ELF’s “tyranny of the past” find pleasure in owning or using. The ELF doctrine is right out of Mario Savio’s anti-capitalist playbook back in Berkeley in the1964 riots.

The ELF website has included such items as “Setting Fires with Electrical Timers: An Earth Liberation Front Guide.” The PETA funds given to ELF have included assistance for an ELF arsonist who plead guilty to setting fires at a Michigan State University laboratory. PETA president Ingrid Newkirk characterizes the arsonist as a “very nice” and “idealistic” young man. Newkirk says “there’s a difference between violence to property and violence to persons.” So the crime of arson, if committed for a good purpose, is OK.

Two Elk Lodge at the top of the Vail, Colorado ski area was a beautiful and environmentally harmonious structure. Large timbers and native stone were the main components in its architecture. Outside stood life-size bronze statues of a pair of elk, antlers and all. On a bright winter day languages from all over the world could be heard chattering happily over lunch, a cappuccino, or a glass of wine. One night Two Elk Lodge vanished in a roaring inferno: $12 million in property damage, and an immeasurable loss in beauty and healthful pleasure. The Earth Liberation Front was proud to claim credit for the event, having relieved the earth of what it considered a human created blight. ELF operates in small cells, much like al Qaeda and similar terrorist groups. That means identifying the individual participants in any given incident to prosecute for the crime is not easy. It was only after years of investigation that the perpetrators of the Two Elk Lodge arson were found, prosecuted, and sentenced.

By the end of the year 2001, according to the Portland Oregonian, ELF had boasted additional property damage of some $26 million in five years, involving 33 major crimes. Prof. Toby Bradshaw’s tree research was destroyed in a University of Washington arson. Bradshaw declares of ELF that “these people have a combination of ignorance and malice that is really dangerous.” The FBI characterizes ELF as one of the most active U.S. based terrorist organizations.

A related organization is the Animal Liberation Front. To ALF the use of laboratory animals in experiments to develop life-saving drugs and procedures to cure the ills of people is a terrorist act. Attacks against laboratories using animals in experiments leading to cures for human diseases are launched in the name of saving animals from being used to save humans. To them the fate of the animals in laboratories is the same as the fate of human victims of terror. Nearly 3,000 people died in the Twin Towers? Same as 3,000 experimental rats, right?

Still, it’s hard to outdo PETA itself. One of its ventures into philosophy and ethics is to compare animals in slaughterhouses to Jews in concentration camps. Each group is “terrorized” when housed in “huge filthy warehouses and rounded up for shipment to slaughter.” Leather seats and handbags are “the moral equivalent of the lampshades made from the skins of people killed in the death camps.” It is as though, in attributing ethics to animals, adherents to the PETA cause have agreed to an exchange of characteristics with the animals. The PETA folks take on the mindless ferocity of the animal while supposedly granting the animal a basis for thoughtful and ethical consideration.

For PETA devotees to absorb for themselves the violence of nature on the loose while ceding the ethical high ground to its four-legged friends is an interesting artifice. If it remains doubtful that the animals have been endowed with some higher level of contemplative morality, it is quite possible that their human advocates have ceded their humanity. Although it is a bizarre twist of ethics and morals to equate humans and animals, some might be persuaded to apply the analogy to those who do. In practice the “ethical” treatment of animals, such as PETA advocates, is an idea that turns upon itself and becomes an animalistic regard for humans. “When it comes to feelings, a rat is a pig is a dog is a boy.” So says PETA’s president Ingrid Newkirk. That being established, she sees “no rational basis for saying that a human being has special rights.”

Organizations such as PETA, ELF, and ALF, are assisted by classroom teachers, think tanks, and the rest of the militantly Green “community” in its effort to change humanity. The change, by whatever increments that can be enforced, is downward from the human level toward the level of the animal. Novelist Dean R. Koontz remarks that this “embodies the antihuman essence of fascism.” The worship of earthly creatures is, in its essence, a virulent gnawing away at human confidence, pride, and achievement.

Help! Rape!

The more passionate Greens see all things flowing from science and industry as devised to despoil the imaginary environment they revere, to ruin everything they consider to be “natural.” These Ultra Greens insist on the curious notion that exploitation of natural resources is somehow an iniquity, a crime. They like to call it “Rape of the Earth.” Yes, it’s that bad.

It is declaimed that humanity is exhausting the world’s oil supply, the world’s gas deposits, the world’s copper, iron, diamonds, coal, and tin—all the world’s natural resources. That message is being driven deeply into the culture in schools, think tanks, foundations, media hammering, and elsewhere. Often this occurs on a basis not noticeably more sophisticated than the earth, air, fire, and water theory of the ancients who believed the earth is made of those four elements. Discerning thought is not a mark of the more fervid environmentalist.

Consider this example from a widely used school textbook: “As human activity interferes with the earth’s capacity to maintain a maximum range of tolerances for life, history traces the roots of degrading activity to the advent of agriculture and the rise of civilization; the Judeo-Christian view of human beings as having domination over the earth; the industrial and scientific revolutions; and the rise of capitalism.” That’s a fairly comprehensive indictment, advanced as Green Gospel in the classrooms of public schools.

Clearly such “degrading activity” must be eliminated. For that to happen, in terms of the textbook’s analysis, the following must occur: abolish capitalism and industry; reject the science and engineering that support the industrial revolution; stop trying to dominate the earth by extracting and using natural resources; give up agriculture and get used to eating roots, wild berries, and raw rabbits (while they last). This would get rid of a few billion people by starving or freezing them to death, and leave—well, the textbook doesn’t seem to go into that.

One bright winter day a local schoolboy and a vacationing online columnist happened to be paired up on a chairlift at Teton Village ski resort in Jackson Hole, Wyoming. The columnist reports that, knowing the boy lived in an area of active environmentalism, he asked the boy what he was learning about the environment in school. The answer was quick and unequivocal: “Don’t mess with mother nature.” The visitor suggested to the boy that if mother nature hadn’t been messed with a bit he wouldn’t be on that ski lift. The boy looked a bit quizzical. Nor, the visitor persisted, would he have those elegant Rossignol skis on his feet, or top of the line Technica boots to hold them on. The lad found a strap on his glove that needed adjusting. And was he looking forward to lunch by a warm fire in the Mountain House? As their chair neared the top of the lift the lad found it necessary to spend the remaining moments adjusting his elegant spacesuit ski attire and had no time for further conversation. His visiting lift companion bid the lad a cheerful good day, wondering if he might have pried a germ of thought into that modishly educated and expensively hooded head.
Author and columnist Mark Steyn suggests that the whole purpose of the “earth-is-your-mother” environmental doctrine is “to inculcate an enfeebling passivity in the face of nature.” What would the consequences have been, he wonders, had the Pilgrims consulted “Ye Olde Weather Channel” about the climate of New England before they set sail for America back in the year 1620? What if they had shivered on the dock in submissive acquiescence to the forces of nature and sold the Mayflower for kindling wood? The earth as your mother, Steyn snorts, “is eco-babble.” And the kid on the ski lift is pretty much the sort of passive, limp, indulgent automaton his classroom teachers seek to produce as future citizens, using such material as cited above.

The Green Crusade: Rethinking the Roots of Environmentalism is a book by Duquesne University political science professor Charles T. Rubin. In the book Rubin observes that susceptibility to irrational fear can be, and is, easily stimulated and abused. The fact is that the future is uncertain, even where life is good. “Past results,” Rubin notes, “do not guarantee future performance.” He recognizes that “the fragile components” of the complex society in which we live are not guaranteed to hold forever. How can it be known that the next predicted catastrophe wouldn’t be the real one? So the revolutionary tactic is to raise fear, create doubt, and invite the fearful to join the crusade. And by all means, dig down deep when the collection plate comes by and contribute to the cause. The heavier the collection plate, the more damage to be done to those profaning the earth.

New fantasies of catastrophe are probably as unlikely to be fulfilled as the old ones were, though no one can say so with certainty. It is difficult, if not impossible, to prove a negative. There is no guarantee that this or that cannot happen, including global warming or a new ice age, but it is helpful to find out who profits from the panic inspired by the charges.

Environmentalists seek to exploit uncertainty, Rubin says, by promising “complete certainty if only we remake the world as they desire.” The future under prevailing conditions is uncertain, so follow us unto the land of certainty. The environmental visionaries promise that conditions are to be made, not just better, but best. Not just good, but perfect. And how likely is that in human affairs? No matter, says Rubin, the doomsayerperfectionist always has an answer: If things appear to be going well, they will assert, “they are only going well so far.” Then they offer some new “bellwether for future disaster” and pass the collection plate.

Boston College professor of philosophy Thomas Hibbs concludes that environmentalism is driven by a vision “that puts the environment above liberty, self-government, human diversity, and material well-being.” Ethicist Paul Ramsey remarks the visionary’s inevitable recourse to power where persuasion fails. He cautions that those who preach “ultimate success” where ultimate success is not possible are always “peculiarly apt to devise extreme and morally illegitimate means for getting there.”

Author and philosopher Gene Edward Veith, Jr. in his book Modern Fascism reports the view of nature held by Finnish Green Party activist Pentti Linkola. Linkola says the use of natural resources is nothing but “ravage and despoliation.” He considers human beings to be “an evolutionary mistake, a cancer of the earth.” Linkola claims more sympathy for certain insect species than for children dying of hunger. As to the supposed problem of overpopulation, Linkola asserts that “sacrificing billions might possibly save a million.” This remnant of chosen people would live in an “authoritarian agrarian society.” Calling his new authoritarian society also “agrarian” may be to conjure up green fields and wooded glens to screen the truth of the inevitable savagery of his totalitarian vision. Linkola exposes the essence of the return to nature fantasy: its heart of ice. Still there are those who, staring into the pit, claim they are eager to jump while they are still warm (not too many do).

John Berlau, author of the book Eco-Freaks: Environmentalism Is Hazardous to Your Health, quotes former Vice President and global alarmist Al Gore on the subject. “We have been blind to the fact that the human species is now having a crushing impact on the ecological system of the planet.” Berlau perceives that such “anti-human statements” as this are “the most poisonous kind of environmental rhetoric.” Suggesting evil in humanity in general is far more ominous than attacks on individuals or groups. The truth of “An Inconvenient Truth, ” as Gore titled his book on the subject, is the clear enough. The “crushing impact” of the human species can be mitigated only by allowing fewer, if any, of the species to stick around. These “Gore mongers” are the ones who rev up the Green Machine and speed to the assistance of anyone yelling, “Help! Rape” when anything useful is extracted from Mother Earth. For his service in revealing the evil in humanity Mr. Gore was awarded the 2007 Nobel Peace Prize.

Warm Ice

“The sky is falling! The Sky is falling!” That was the cackling cry of Chicken Little of old. Her original fear of the sky falling having proved baseless, or at least premature, today’s Chicken Little pesters the coop and chicken yard with a new cry: “The earth is warming up! The earth is warming up!” The Wurm ice sheet covered most of Europe some 20,000 years ago, about the same time the Wisconsin ice sheet covered much of North America. Then the ice in both began to melt. There is no record of whether the melt was called global warming. Nor do we know whether the warming was attributed to too many campfires lit by our irresponsible ancestors trying to keep warm. But most of us are not too unhappy about the warming that did occur.

There have been numerous cycles of warming and cooling throughout the millennia, long before humans learned how to make themselves a bit more comfortable and secure through scientific and industrial processes. A study by scientists Fred Singer and Dennis Avery finds that there have been some 600 global warming and cooling periods in the last million years. More recently a chart prepared by the UN Intergovernmental Panel on Climate Change (IPCC) in 1990 shows a “Medieval Warm Period” from about 1000 AD to 1500 AD, followed by a “Little Ice Age” that extended from 1500 AD to approximately 1900 AD. During the latter period there was ice skating on the Thames River in London and freezing on the Hudson River in New York State.

How intense is the present global heat wave? Most scientists agree that the earth has warmed slightly over what it was a century or so ago, somewhere around one degree Fahrenheit. Just 1 degree. In 100 years. It is remarkable that such a small increment over the entire earth could be measured at all, not to mention taken as proof of catastrophe to come. Temperature changes in more recent years are an embarrassment to the global warmers. The earth’s temperature during the decade bridging the twentieth and twenty-first centuries remained flat; no increase, no decrease. The year 2007, according to all four of the top services that measure such things, produced record cold. Some melting at the North Pole was more than compensated for by additions to the South polar icecap. North America had the heaviest snow cover in 50 years, and the year 2010 winter was even worse. Similar conditions existed from China to Australia. Baghdad had the first snowfall ever recorded there. How do the global warmers respond to this evidence? They ignore it and press on. To be sure, the statistics for one year, or ten years, do not close the debate. But might not such statistics be sufficient to give pause to some of the more apocalyptic proclamations about global warming?

The truth, whether it be considered convenient or inconvenient, is that there is nothing more natural than periodic fluctuations in weather and climate.

 

Ice

An article in the April 28, 1976 issue of the magazine Newsweek, following a cooling period after World War II, presented evidence that scientists believed the world was entering a new ice age. Evidence was reportedly so massive that climatologists could hardly keep up with it. Agriculture could be affected to the point that catastrophic famine might occur. The article refers to the “little ice age” that occurred in Europe and North America between the years of 1600 and 1900. On the Hudson River iceboats sailed as far south as New York City, while in England the Thames was frozen so solidly that Londoners roasted oxen on the ice. The Newsweek article presented scientific predictions of similar conditions, or worse, developing in the mid-1970s.

At that time, columnist Walter Williams reminds us, there was already “massive worldwide industrialization” in a period of virtually nonexistent emission controls. Smokestacks smoked, carbon dioxide belched forth, millions of animals produced flatulent gases including CO2, and the “carbon footprint” was enlarged. Yet the earth got alarmingly colder for a while when all that uncontrolled emission should have made it warmer according to the theorists of global warming. Weather, climate, temperature patterns, droughts and hurricanes seem to occur in cycles of change, and none is observed to remain static.

The radical environmentalists with their Green Machine claim a mission to save the earth from the recurring evolutionary cycles of nature. But in the end the earth is not the target of concern. In this crusade the sword of retribution points, like a dowsing rod to water, straight at the malignant heart of man, rapist of the earth. 9. The Holy Green Crusade

Militant Theology

The apocalyptic calls to control global warming or face imminent catastrophe emanating from the radical core of environmentalism have infected the entire country. The prophets of a global warming doomsday claim their vision is based on scientific studies that not only prove the warming, but also conclusively find that its cause is manmade. The facts are that the whole matter is more theological than scientific. Computer projections are the basis of global warming predictions. There are serious deficiencies in those projections, including lack of adequate data, bias in selecting the data, and purposeful manipulation of data to achieve the desired political result.

Author and social theorist George Gilder, and editor and publisher Richard Vigilante, observe in a joint article on the subject that there are dozens of computer models, and none of them is the same as any other. Each model has several million variables, and each variable must be assigned a value in the overall assessment. Computer results vary according to input as well as by analysis of the output. Only the most extreme predictions are used to start Chicken Little cackling about the earth warming up. Clouds present a particularly difficult case because they act both as shade to keep the sun’s heat out and as a blanket to keep the earth’s heat in. So scientists cannot agree whether clouds cool or warm the climate. The weight clouds are given as to their heating or cooling effect in any computer model is therefore an arbitrary determination. And clouds are only one input of the millions fed into computers the output of which is relied on to support predictions of catastrophe.

Michio Kaku, professor of theoretical physics at City University of New York, questions the validity of any computer projections involving complex phenomena. The prediction of weather even a few days in advance, he notes to the surprise of no one, is not always reliable. Even the most powerful computer, Kaku cautions, is incapable of accounting for every molecule that makes up the weather. Rather, he suggests, it may be that “the smallest system which can truly simulate our weather is the weather itself.” If this is true for weather prediction a few days in advance, how much confidence, Kaku asks, can we have in computer data that presume to predict climate decades into the future? How many of the “molecules” that would be required for an accurate prediction are missing from the equations? How much relevant data is simply not known? Is it possible to give accurate weight to each of the elements entered into the computer equations?

The “compound uncertainties and blind spots” of computer models “make it impossible to know the probability of any future outcome” (emphasis in original). That is the conclusion of Steven F. Hayward, a resident scholar at the American Enterprise Institute and author of the annual Index of Leading Environmental Indicators. Hayward suggests we should ask what the chance is of X occurring rather than Y? Is it 50%? 1.25%? 62.7%? Even the weather folks sometimes risk their batting average to venture a percentage of probability that it might rain tomorrow. Hayward contends that if the operators of the global warming apparatus can’t offer a similar percentage of probability that their predictions are accurate, which they do not, this should mean to a scientist that their predictions are unreliable and useless.

Thus the data entered into the computer determine the analysis of its output. The “science” cited to support the theory of manmade disaster depends on the “theological” input of the computer. The “inconvenient truth” of looming climate change catastrophe that Al Gore claims to possess is based on the results of these arbitrarily constructed computer models.

The Ottawa-Carleton Geoscience Center at Carleton University in Canada studies the sun and its effects on earth. The Center’s director, Timothy Patterson, presents a theory that goes against the “received theology of the Worldwide Church of Science,” as Wesley Pruden, former editor in chief of the Washington Times, calls it. Patterson points out that the earth’s climate has never remained stable. Cycles of sun activity act to cause more or less cloud formation over the earth. The present cycle is in a “high sun” period that tends to heat the earth. Patterson’s data indicate that by 2020 we will have moved into the “weakest solar cycle of the past two centuries.” That, says Patterson, is likely to lead to “unusually cool conditions on earth.” On that basis Pruden’s advice is not to dispose of materials or dismantle processes that leave large “carbon footprints.” We might need them to keep the place warm.
Another scientist, Lord Christopher Monckton, was science and technical adviser to former British Prime Minister Margaret Thatcher. Monckton cites “hundreds of recent scientific studies” that explain global warming as part of a naturally occurring cycle. These hundreds of studies show that warming and cooling is simply “a moderate, natural 1,500-year global climate cycle” likely caused by the behavior of the sun. Even the most ardent global warming fixers haven’t as yet offered a theory to refute that embarrassingly inconvenient truth. So they ignore it.

U.S. Senator James M. Inhofe, former Chairman of the Senate Environment and Public Works Committee, castigates the American media as “advocates for hyping scientifically unfounded climate alarmism.” He notes that the media rarely, if ever, present the large body of scientific opinion to the contrary. Meteorologist John Coleman, founder of the Weather Channel, terms global warming “the greatest scam in history.”

The National Academy of Sciences (NAS) was asked by President George W. Bush to prepare a report on climate change to assist him in assessing the merits of the Kyoto treaty on global warming. Radical environmentalists gushed approval, secure in their global warming anti-industrial litany. Reading or seeing no more about the resulting report than was available on mainstream newscasts or in most newspapers, the global warmers felt vindicated. A typical media version was that of CNN reporter Michelle Mitchell. Assuming a stance of righteous, yet controlled, vindication she announced that the report was “a unanimous decision that global warming is real, is getting worse, and is due to man. There is no wiggle room.”

Richard S. Lindzen, professor of meteorology at MIT, is one of the scientists who prepared the NAS report. His response to Ms. Mitchell’s claim was the admonition, restrained and gentlemanly, that what Mitchell reported over national TV was “simply untrue.” Lindzen points out that NAS never asks that all participants in such a report agree to all its elements, so that NAS reports rarely, if ever, represent a unanimous decision. They represent a span of views and evaluations. That is what the report on climate change did. As Prof. Lindzen states in a Wall Street Journal article, “Our primary conclusion was that despite some knowledge and agreement, the science is by no means settled” (emphasis added).

The NAS panel did agree that carbon dioxide is one of many greenhouse gases whose increase is likely to warm the earth. However, the “most important” of these gases, Lindzen cautions, are “water vapor and clouds,” not carbon dioxide. The panel made no prediction as to what the effects of carbon dioxide have been in the past, or will be in the future, on the earth’s atmosphere. Yet such as Ms. Mitchell brazenly proclaim as truth, on national television, exactly the opposite.

Prof. Lindzen states unequivocally that the NAS report makes it clear “that there is no consensus, unanimous or otherwise,” as to either climate change or what might cause such change (emphasis added). It would appear that the NAS finding prepared for President Bush is a fair summary of scientific opinion as a whole on global warming. Stephen F. Hayward suggests, rather colorfully, that while the greens accuse their critics of denying reality “it is the greens who have their heads stuck in a dark place.”

Ted Nordhaus and Michael Shellenberger operate an environmental research firm. In their book Break Through they point out as a final irony in this distorted debate that carbon dioxide is the principal nutrient for plant life on earth. Yet it is the principal bogyman of the excitable environmentalists who claim carbon dioxide is the main “pollutant” that is causing global warming.

Columnist and George Mason University economist Walter Williams reports in his syndicated column that British Channel 4 television has done a documentary called “The Great Global Warming Swindle.” The program, says Williams, “devastates” most of the claims about global warming, drawing its information from top climatologists at MIT and other major universities around the world. Their findings include the astonishing revelation that only about 5 percent of carbon dioxide emissions, of such concern to the warming brigades, are produced by human activity. Some place the figure at 10%, others at 3%. That 3-10% is the emission global warmers propose to spend an estimated $95 trillion to eradicate. Yes, trillion! The rest of the carbon dioxide emissions, 90-95%, come from such sources as volcanoes, dying vegetation, and flatulent animals (We await a remedial proposal for that one). Other findings indicate that the oceans emit most of the greenhouse gases. Other scientists cited believe that sunspot activity is the greatest influence on earth temperature. Chancellor of the Exchequer in Great Britain during the Thatcher years, Nigel Lawson, in his book An Appeal to Reason A Cool Look at Global Warming takes the open approach of accepting the worst of the warming predictions, and then asks some questions. Would the enormous costs of what the hottest of the warmers propose be worth it? Or would a more plausible approach be to accept “autonomous adaptation” to the warming over the century or so during which it would occur? Human beings have been known to adapt to new and difficult situations, and with a century to work on might well do so in respect to global warming. Assuming it occurs. Lawson confirms that those who express skepticism about global warming, including politicians and scientists, “are treated as heretics for questioning the received wisdom.” Lawson confesses that he could write his book Appeal to Reason without suffering similar treatment only when his own career had been completed.

Hot Heads, Hot Air

A conference of the world’s leading economists, including three Nobel Prize winners, was convened in Copenhagen Denmark to consider the cost versus the benefit of a number of proposals for improving life on the planet. This included evaluating the cost and calculated effectiveness of “fighting global warming.” The participants reviewed the global warming claim that without drastic measures to curb carbon dioxide, the principal devil of the global warmers, the earth would warm up by approximately 7.3 degrees centigrade by the year 2300. In what came to be known as the Copenhagen Consensus the participants concluded that reducing carbon dioxide as advocated might, at best, keep warming at 6.1 degrees Centigrade above current temperatures by 2300.

The projected cost to reduce warming by 1.2 degrees Centigrade over the next two centuries is the 95 trillion in 1990 dollars ($95,000,000,000,000) noted above. Nobel economist Douglass North participated in the Copenhagen Consensus. He points out that the benefits projected by reducing global warming are “far into the future,” as well as debatable. The costs are “up front and immediate.” The remedies proposed by the alarmists would require a drastic scaling down of industrial and technological development. This would produce extremely damaging, if not catastrophic, consequences for living standards everywhere. The poorest of the world would bear the heaviest burden, just as they do whenever general living standards decline.

The global warmers do not like to be reminded that carbon dioxide is a natural component of the earth’s atmosphere. Plants need CO2 to generate oxygen for animal life to breathe. Nor do they care to consider that humans and their activities account for no more than approximately 3% to 10% of all CO2 emissions. What, we might ask as a matter of common sense, would the likely effect be if some percentage of that small percentage of emissions were eliminated? Would such an amount be likely to justify anything like the sacrifices the global warmers call for? Man made global warming, says writer and scholar Peter Ferrara, “is a hoax developed to serve powerful special interests.”

In view of the authorities and data cited in this chapter, it also seems fair to ask whether the alternative figures of the president’s commission, the 6.1 degrees estimate versus the 7.3 degrees estimate, is any more realistic. Where does the 1.2 degrees difference come from if not out of the same computers that concocted “global warming” in the first place? Is it time to conclude that the whole business is so “far into the future” as to be scientifically useless? Then we could scrap the global warming charade, save the $95,000,000,000,000, and buy everyone a new suit and a good dinner.

Columnist and environmentalist Peter Pfeiffer, writing in The Washington Times, reports that scientists in relevant fields who dare to dispute the global warming dogma are “treated like a pariah” by their profession. Pfeiffer quotes Myron Ebell of the Competitive Enterprise Institute: “It’s the typical politics of the hard left at work. I think these are real threats.” Scientists who dissent from the alarmist cant, says syndicated columnist Walter Williams, have been deprived of financing for their projects, seen their work ridiculed, and heard themselves “labeled as industry stooges.” The George Mason University professor of economics also laments the billions of taxpayer dollars given to those who thrive on the farce, “not to mention their dream of controlling our lives.”

Peter Foster is a former Canadian liberal columnist now turned American moderate. Writing in the Canadian publication Financial Post Foster recalls the established history of global warming and climate change. Long before demon carbon dioxide was held to be the “cause” of global warming, and before the “polluting” industrial processes were invented, climate change was a periodic occurrence. When scientists seek to advance such facts as this regarding global warming, Foster notes, they are not “scientifically refuted,” but are “howled down as ‘deniers’ or industry shills.”

Professor emeritus in the Department of Ecology, Evolution and Marine Biology at the University of California, Santa Barbara, Daniel T. Botkin, has worked for 40 years to improve the environment. He says that genuine improvement of the environment can happen “only from a basis of reality,” and finds “that is not what I see happening now.” He cites the 19th century book Extraordinary Popular Delusions and the Madness of Crowds. The book cites such “madness” as the tulip mania in Holland in the 1600s when a tulip bulb might sell for more than a house. Or the mania about witches when the accused could be proved innocent only if she did not drown when thrown into a pond.

Botkin finds that today’s popular imagination in regard to the environment seems to be influenced by a similar madness in beliefs “that have little scientific basis.” In their book Break Through Nordhaus and Shellenberger suggest that in its delusion of imminent catastrophe the environmental movement has reached a dead end. It has become, not a scientifically based cause for improvement, but merely a narrow special interest. The authors argue that a truly effective environmental movement can be arrived at only if purged of its apocalyptic extremists.

Writing in The American Spectator Victor Davis Hanson, professor emeritus at Fresno State University and a prolific writer and commentator, identifies one wind tunnel from which emanates the global warming hurricane of doom. Hanson points out that, like Jean Jacques Rousseau before them, Western intellectuals and philosophers are comfortable in the leisure and safety they enjoy. They are at liberty “to fantasize about a primordial Eden-like past.” Or even more likely to enjoy romanticizing about “a purer nature unsullied by industrialization and urbanization.” They are free to indulge a sense of superior ease and self-gratification without giving too much thought to the results. The probability that their anti-industrial fantasies, if actually effected, would destroy the civilization that supports them, their fantasies, and their families does not seem to penetrate their quiet sanctuaries.

Peter Ferrara, general counsel for the American Civil Rights Union, asserts that the Obama “cap and trade” program is designed to raise the cost of energy to a level that the economy cannot sustain. The coal industry, Obama confessed in an interview for the San Francisco Chronicle during the campaign of 2008, is to be driven out of business by “emissions control.” Coal is the main fuel used to generate most of the nation’s electricity. Natural gas and other petroleum products are also tagged for a march to the gallows. Other nations, including India and China, have made it clear that national suicide is not their preference in addressing whatever merit, if any, there might be in the climate change crusade. Without the participation of China and India “trashing the American economy,” Ferrara says, will not produce meaningful results, as even the environmentalists concede. That is, unless trashing the American economy is their real purpose.

Ferrara, who is also director of budget and entitlement policy at the Institute for Policy Innovation, notes enthusiastic support for the global warming program in the bureaucracies of the UN and many nations around the world. That support, Ferrara says, is based on the expectation of a massive shift of power to them. That would include a correspondingly massive increase of “Green Police” to search the world for “carbon emissions” and punish the malefactors. Cutting edge environmentalists see in the global warming issue, Ferrara warns, “the potential for achieving their dream of repealing the Industrial Revolution.”

Battle Stations

The Holy Green Crusade, formerly based on “global warming,” now on “climate change” since it started snowing too much, has taken its place in the front lines of the Civil War. The battle cries of combat become higher pitched and more fanatical: “stop global warming,” “outlaw greenhouse gases,” “carbon dioxide kills,” “freeze the melting icecaps,” “stop the rising sea,” “twenty foot waves over New York,” “erase carbon footprints,” “beware of catastrophic flooding,” “prevent new deserts,” “eradicate climate change,” “save drowning polar bears,” and other horrors yet to be invented. This litany is drummed through the air like a cosmic thunderstorm until the ears are deadened to any other message. These incantations are the foundation for new Holy Scriptures, and for revelation of Armageddon to come, to energize the militant legions for combat. The perpetrator, the guilty, the new devil, is “human activity.” Human activity must be expunged if the global warming/climate change crusade is to triumph. Too fanciful?

Screeching calls of disaster to come, absurd as most of them are, are clarion calls to the faithful to take up their weapons and charge ahead. Author and commentator Christopher Booker, writing in the Telegraph of London, to take one example, terms the hysterical claims of a rising sea level of 20 feet or more “the greatest lie ever.” He cites the work of Nils-Axel Morner, the former chairman of the International Commission on Sea Level Change. Morner reports that in 35 years of measuring and observing relevant data worldwide he has found no sea level rise in the last 50 years. The most that could be expected in the 21st century would be some 4 inches. When computer models failed to show the 20-foot calamity touted by Al Gore, “corrective factors” were inserted into the computers to reaffirm the theology of doom. This is not surprising in the case of Mr. Gore, who is, shall we say, “liberal” with the facts. But he’s a politician. Perhaps computer scientists should answer to a higher standard.

Jonathan Lash, president of the World Resources Institute, agrees that water is cleaner, air is clearer, and resources are better managed as a result of heightened environmental concerns. These very beneficial results of the earlier environmental movement were based on factual conditions such as polluted water and foul air that could be seen, measured, and remedied. The challenge now, says Lash, is to shift support to longer-term projects such as climate change that people can’t see for themselves. Gathering allegiance for this new greening of America, Lash says, will require that adherents to the cause “assert what they believe in.” Lash sees the mainline churches as having a major role to play. If you can’t see for yourself what the new environmental horrors portend, Lash seems to be saying, don’t worry. Just close your eyes and we’ll imagine them for you. Lash must know that a human cause for climate change has not been scientifically proved. So he would convert the Green Machine into a religious faith that cannot be rationally challenged.

Anti-religion opinions of the U.S. Supreme Court (discussed in the following section) have combined with other cultural events to drive traditional religion off the public square and to create a secular state religion. Not only has God been driven away from public places, but also out of not a few mainline churches as well. But help is on the way. Jonathan Lash says that he and his groups will supply a new set of beliefs to fill the void, as well as the vacant churches. Not to mention the dwindling collection plates. This new coalition, Lash boasts, will “set the moral tone for the whole country.” Secularists will be furnished at last with the new belief they so ardently pray—well, hope for.

The American Spectator columnist Jonathan Aitken reports a curious union developing between old-fashioned Christian believers and the cult of revolutionary Greens. In England the Rt. Reverend James Jones, bishop of Liverpool, has set forth a new “earth theology” that holds Christianity to be a religion of consumption. That theology requires that humans be “discerning, responsible, and ethical consumers.” Bishop Jones predicts that climate change will become in the 21st century “a great moral spiritual issue, just as slavery was in the 18th century.”

Such a doctrine will then have solidified what should be a scientific issue based on fact into an even more intractable religious belief than it is already. Global warming, climate change, and whatever is to follow will be ever more immune to fact, reason, or debatable public policy.

Philip Stott is professor emeritus of biogeography at the School of Oriental and African Studies in London. Stott remarks in the Washington Times that there is a McCarthy-like movement among scientists who cherish the greenhouse effect. This he finds to be “like a puritanical religion, and that is dangerous.” Columnist Mark Steyn calls global warming a racket, and much more as well.

Author and commentator William Tucker in a Wall Street Journal article sees the environmental movement as having undergone an apotheosis. In its new godly image this is environmentalism at the extreme. It has developed, Tucker says, an “Ahab-like pursuit” of a natural environment cleansed of “every vestige of industrial society” and its “pollution-based prosperity.” The mad Captain Ahab pursued Moby Dick, the Great White Whale, to his own destruction. The Great White Whale the ardent environmentalists hunt down and would destroy is the economic and social base that exploits the earth’s resources for the use, and the livelihood, of humankind.

The problem is psychological as well as theological. Tom Bethell, a senior editor at The American Spectator, perceives that liberals are convinced that renewables such as wind and solar can form a viable substitute for oil and coal within a few years. This belief is based on the liberal catechism that holds all problems are solved once we feel good about our intention to solve them. Or, as Bethell puts it, “They think goodwill can surmount all problems.” This challenge is aimed at democratic as well as technological society. The majority that should govern in a democracy is to be rendered passive and submissive. Ordinary Americans are to yield up their judgment. They are to be forced to support any cause made to appear worthy by a bucket of green paint. Humanity is reduced to a “pollution source,” despoiling the pure earth vision of advanced Green theology. A “pollution source” is, by definition, something to be eliminated.

The radicalizing of environmentalism at its leading edge will be accelerated by the effect of what philosopher and environmentalist Alston Chase notes is an increasing urbanization of the environmental movement. As people move to cities they lose touch with nature as it actually exists. The result, Chase says, is loss of a true model of the rural way of life, and the disappearance of “hands-on experience with nature.” There is a gradual diminution in the number of those who actually know nature, who know what they are talking about. Sadly, Chase concludes, while public concern for improving the environment increases, public understanding of how to go about it is diminished. The public perception of natural phenomena softens into acceptance of whatever theory, whatever “what if” alarm is sounded to warn of the newest imaginary catastrophe.

Once urbanized, says Chase, environmentalists tend to become “infatuated with fantasies about land untouched by humans.” These fantasies take root as abstract concepts about such issues as endangered species, grazing, water rights, mining, oil and gas drilling, and logging among others. In the name of abstract environmental purity mines are closed, lumber mills are shut down, land is withdrawn from grazing, and drilling for the oil and gas needed to survive as a nation is thwarted. These are only the first steps toward dismantling the entire economy as it now exists.

The “inconvenient truth” about global warming, when finally exposed, is revealed as an untruthful convenience. It is a fabrication used to deflect any inconvenient facts opposed to the climate change dogma. Its purpose is to mask the true objective of those who grasp passionately for the power to regulate and ruin the lives of their fellow human beings. Global warming is the stalking-horse devised to conceal liberal pursuit of power as the leading divisions of the Holy Green Crusade strike on toward their “final solution.” That solution, for the most ardent and irrational of its leaders, is two-fold: the eradication of industrial society; and along with that the elimination of the world’s main pollution source.

Vaclav Klaus, a professor of economics and President of the Czech Republic, pursues the threat of climate change theology in his book Blue Planet in Green Shackles. He sees the greatest threat to democracy, the market economy, and freedom to be “the ambitious, arrogant, unscrupulous ideology of environmentalism.” On both sides of the Atlantic, he says, “the debate has metastasized into cultural warfare against economic liberty.” Born in 1941 in Prague, he experienced little of the Nazi atrocities, but was richly educated in Soviet oppression after World War II, when the Soviet Union “liberated” his country from the Nazis. President Klaus knows the symptoms of growing totalitarian control when he sees them. In a recent article Klaus compares environmentalism to communism. The latter, he notes, was broadcast as giving the ruling classes “the right to sacrifice man and his freedom” in the name of the masses of the proletariat. The human sacrifice of self and liberty that environmentalism justifies is to be exacted “in the name of the planet.”

The book GREEN HELL: How Environmentalists Plan to Control Your Life and What You Can Do to Stop Them elaborates on the theme. Author Steve Milloy, an attorney and writer about “junk science,” sees a “powerful network of individuals and organizations” that has for decades “sought to transform our way of life” using various environmental alarms and pretexts. Global warming “is simply their latest—and by far most successful—organized campaign to achieve this transformation.” In an article for The American Spectator Peter Ferrara discerns that for its advocates global warming never was about science or evidence, but was directed to achieve a “massive increase in government power.” That would mean “a dramatic loss of freedom and prosperity” for ordinary people everywhere.

Much of the “science” that supports climate change/global warming is reported by the Intergovernmental Panel on Climate Change (IPCC) at the United Nations, and widely accepted as proving the change to be manmade. In turn the IPCC has relied heavily on research done by the Climatic Research Unit (CRU) at the University of East Anglia in England. In the late fall of 2009 leaked e-mails from the CRU at East Anglia showed without a doubt that data have repeatedly been suppressed or distorted to “hide the decline” of global temperatures during the ten-year cooling period the world is presently experiencing. Data showing warming periods inconsistent with current theory have been “fudged” to diminish their damage to current dogmatic theory. These e-mails also show that climate researchers in other think tanks and universities have engaged in similar “unscrupulous and thuggish behavior,” as the National Review characterizes it.

The e-mail scandal strips bare the hubristic arrogance of those who are certain the worst effects of whatever is happening to the climate are manmade. The scandal also shows how little we do know or understand about climate change, even as the Left pushes on in sanctimonious certainty. The purity the radical Left likes to claim has been badly tarnished by the e-mail scandal, but that will not slow them down. There is too much money involved in thousands of grants for researchers, and multi-millions of dollars in profits for those who manipulate the levers of the “green economy” they hope to impose on the nation.

To press their anti-human agenda there must be generated new threats even more ominous than overpopulation or the specter of global warming. The leading edge Greens have adopted a form of language that physicist P. H. Borcherds of the University of Birmingham in England calls “the hysterical subjunctive.” The tactic of the hysterical subjunctive is to fabricate frightening “what if” scenarios. The horrors of the “what if” scenarios are then sensationalized through a media that is at the same time sympathetic to the cause, eager for exciting news, and ignorant of the facts. By incessant repetition the worst that can be envisioned will come to be seen as scientifically validated. The power the ministers of hysteria seek is then within their grasp. Crippling, then dismantling, the industrial order is their program. Impossible? All it takes is the driven image of new devils and the power of Green Civil War.

Volunteers

PETA president Ingrid Newkirk preaches from the battlefield chapels of Green religion a sermon reported in Wild Earth magazine. “Voluntary human extinction,” she exults, “will solve every problem on Earth, social and environmental.” Al Gore groans on with his black prophecy that the “human species” is having a “crushing impact” on the world’s ecosystem (while collecting “green” profits by the truckloads). Pentti Linkola’s billions, those “cancers of the earth,” those “evolutionary mistakes” that Linkola says should die, must be made ready to oblige, willingly or not. This will leave the few who might remain luxuriating in Linkola’s “authoritarian agrarian society.” Dr. James Lovelock, a medical doctor, author, and independent researcher, sees the planet earth as a living organism that must be saved from human depredation. In his book The Revenge of Gaia he asserts that nine-tenths of humanity must be eliminated to save the planet from warming. An Australian Broadcasting Corporation website offers a “Greenhouse Calculator” that will tell you at what age you should die to save the planet. A pregnant mother-to-be aborts her child for the cause.

The truly unnatural aspect of these ostentatious lovers of nature is their assumption that the climate of the earth can be commanded to remain static through the destruction, not only of democratic and industrial society, but also of the entire human race. Attorney and author Wesley J. Smith comments that statements and acts such as these illustrate “how profoundly anti-human and pro death” certain aspects of our culture are becoming. The true passion of the most ghoulish of the Holy Green Crusaders, even greater than their passion for power, is a passionate hatred of humanity itself. So what better offering in propitiation of Mother Earth, the Gaia of the Greens, than these billions of human sacrifices?

Imagine a final earthly ceremony in prelude to the satiation of Gaia, led by the new Grand Master in whom the passion for power can be sated only in death. The EPA has shut down the rapacious automobile industry; the energy of oil and gas remains safely locked in the ground; meat production is prohibited as destructive to the environment; mines are closed; electrical generating plants are destroyed; and all other industrial operations are reduced to rubble. The Grand Master gazes down upon masses of exultant Greens gathered in victorious joy over elimination of all man caused environmental atrocities. Except one.

“It is time to choose!” The Grand Master’s voice blares down at the huge throng from loudspeakers on all sides. The righteously indignant, the victoriously exonerated, begin to feel their joy strangely muted as the meaning of the Grand Master’s words slowly shapes a new reality. They stand before the Grand Master shuffling their feet. All eyes are on the ground, scanning for any sign of a foot stepping forward to choose. Are there no volunteers to ease the way for the few who will remain to suffer the burdens of an authoritarian agrarian paradise, doing without the forbidden resources? No one moves. The Grand Master nods to agents among the masses below to begin measures of persuasion, and advises the masses to remain calm as the persuasion begins.

Visualize the ferocious Gaia of the Green brigades’ imagination, now rotund and sated, smiling at these events as she whiles away long winter evenings deep in the earth. There she sits, Mother Earth, Green as those who created her. All around her, covetous miser that she is, she gazes greedily upon the barrels of oil, the tons of gold, the mountains of iron, titanium, and zinc; all the massive treasure of her dark domain. Her eyes glisten as she counts up the mounds of rubies and opals, baskets of emeralds and diamonds that shall never sparkle in the sun, locked in the ravenous grasp of her wealth. The satisfied smile of Gaia turns to laughter of earth-shaking joy as the mountains of skeletons begin to fall to the ground above, their bones rattling in the wind, condemned for ravaging her sacred hoard.

That is the Greenest of the Green dreams; rattling bones beating out the rhythm of victory in their Holy Green Crusade. It is the last somber hymn of a deranged ideology, twisting the good cause of concern for the environment toward its final solution of devastation, death, and misery.

IV. The Sword of “Justice”
10. The Judges v. the Law
A Stealth Attack

Since the earliest days of its existence the United States Supreme Court has rendered decisions indicating that it holds itself above the Constitution when a majority of the Justices feel their views are superior to constitutional requirements. The Court has chosen to ignore the fact that the Constitution, in limiting the power of government, includes limitations on the judicial power as one branch of that government. In such cases the Court has, in effect, established itself as a revolutionary tribunal when the majority has felt so inclined.

For a civil war to prevail it must destroy the core institutions and beliefs of the target social entity: tear down the old to make room for what is to come. In the present Civil War against America the Supreme Court has played a crucial role in performing that function. A great deal of the Court’s work in that respect was done well before the present Civil War took shape as such. In that sense the Court might be thought of as a Founding Rebel of the Civil War.

How the Court has, case by case, reaffirmed its superiority over the Constitution, until it gradually merged with the Civil War, is the subject of this and the following two chapters.

The first act of judicial defiance that set the Court against the Constitution occurred in 1803. It was not a frontal attack, but rather an assault by stealth. George Washington served as President from 1789 until his voluntary retirement at the end of his second term in 1797. John Adams served from 1797 until 1801, but was defeated for a second term in the bitterly contested election of 1800 between Adams’ Federalists and the Republicans (now Democrats) of Thomas Jefferson. Having lost, the Federalists did all they could to hamper the incoming Republican administration. That included making dozens of “midnight appointments” to fill as many administrative and judicial posts in the government as possible with Federalist sympathizers before the Republicans took over.

One of the Federalist applicants, William Marbury, got the appointment he wanted as Justice of the Peace for the District of Columbia. Unfortunately, the outgoing Secretary of State, in what is often termed a “fortuitous accident,” forgot to deliver Marbury’s commission. Without the commission he could not claim the job. So Marbury sued in the Supreme Court, asking the Court to issue a writ of mandate ordering the new Secretary of State, James Madison, to deliver the commission.

Normally the Supreme Court is an appellate court hearing cases on appeal from trials in lower federal or State courts. Here the Supreme Court was asked to try Marbury’s case as a matter of original jurisdiction, as an ordinary trial court would. The Constitution grants original jurisdiction to the Supreme Court in a few kinds of cases, but those do not include the type of dispute at issue in Marbury’s case. Congress, however, in the Judiciary Act of 1789, had granted to the Supreme Court the additional original jurisdiction to issue such writs as Marbury requested.

The case of Marbury v. Madison came before Chief Justice John Marshall, himself one of the Federalists’ midnight appointments following the election of 1800, so Marbury felt optimistic. But in what is surely the most significant decision ever rendered by the Supreme Court, Marshall held that the Court had no authority to issue the writ of mandate. The reason, he said, was that in authorizing the Court to issue the writ Congress was expanding upon the original jurisdiction of the Court as provided in the Constitution. That, he said, Congress has no authority to do.

In so ruling Marshall had declared the provision of the Judiciary Act of 1789 concerning the writ of mandate to be unconstitutional. The Court enforced its own decision by refusing to exercise the power granted. Marbury was out of luck and there was nothing anyone could do about it. That was the least of the matter. The Jeffersonians were furious, and for good reason.

What the Court had done in negating an Act of Congress was to launch a subtle but ultimately devastating judicial counter-revolution against the newly adopted Constitution of the United States of America. Under a theory popular among Federalists at the time of the founding, the Supreme Court would have been the apex of the new government. It would have been the arbiter and final authority in interpreting and applying the provisions of the new Constitution. But no such provision was ever introduced, much less debated, at the Philadelphia Convention where the new Constitution was drafted. The Federalists knew it would never be accepted. The Court the Constitution creates is merely the highest authority of one of three co-equal branches of government: the Judicial, Legislative, and Executive branches. It was never meant to be the dominant authority over the entire structure of government.

In the Marbury case, the Court assumed the seemingly deferential pose of declining to accept a power not properly granted to it in the Constitution. In reality the Supreme Court invented for itself, with no shred of constitutional justification, the power to review and negate democratically enacted congressional legislation. That was a neat trick. Chief Justice Marshall, a Federalist himself, in effect single-handedly amended the Constitution to serve the Federalist purpose of judicial supremacy. What Marshall thereby invented has come to be called the power of judicial review.

President Jefferson joined in the angry denunciation of the Court’s decision, and prophesied that in creating a Supreme Court the framers had unwittingly placed the nation “under the despotism of an oligarchy.” So harsh was criticism of the Court after Marbury that, though its weapon had been forged, it was another half century before the Court again resorted to the brute force of its newly fashioned power of judicial review.

And what was that “fortuitous accident” in the matter of William Marbury? The Secretary of State who had neglected to deliver Marbury’s midnight commission was John Marshall. Yes, the same John Marshall who had himself, just before “midnight,” become Chief Justice John Marshall. Marshall, as Secretary of State, had thereby afforded himself the opportunity as Chief Justice Marshall to issue his momentous decision in Marbury v. Madison. A decision made possible only because of the “fortuitous accident” of Marshall’s having forgotten to deliver Marbury’s commission. So it is recorded.

Slavery

The second exercise of the Supreme Court’s power of judicial review came under Chief Justice Roger Taney. By the mid-nineteenth century tension over slavery was rising to explosive intensity, dampened to a degree by terms of the Missouri Compromise enacted by Congress in 1820. That statute admitted Missouri to the Union as a slave State and made provisions as to how future States should be admitted as free or slave. The statute also provided that, should a slave enter a free State or territory from a slave State, he should become free. The case before Chief Justice Taney’s Court involved the status of a Negro slave in Missouri named Dred Scott. Dred Scott’s master had taken him from the slave State of Missouri into free territory where, by terms of the Missouri Compromise, he was free. But Scott’s master returned Scott to the slave State of Missouri, and claimed that he therefore remained a slave. Scott sued in Missouri courts to claim his freedom under the Missouri Compromise, lost his case, and appealed it to the Supreme Court.

The Supreme Court decided the case of Dred Scott v. Sandford in 1857, in a 7-2 decision that was savagely anti-Negro and pro-slavery. The Court held that a slave who had become free by entering free territory reverted to slavery upon returning to a slave State. That is all the Court had to decide to dispose of the case. But Chief Justice Roger Taney had a vision more grand than that. Writing for the majority, Taney declared the Missouri Compromise to be unconstitutional. Reaching further still into the politics and social structure of the nation, he added that the “African race … were beings of an inferior order” and therefore “altogether unfit to associate with the white race” either socially or politically. Even if slaves should be freed they could never become citizens.

Outrage against the Court’s Dred Scott decision in 1857 was as furious as it had been against Marbury v. Madison in 1803. Abraham Lincoln, not yet President, spoke of judicial “chains of bondage.” He warned the American people to “prepare your limbs to wear them” should they “slouch and acquiesce” to such judgments of the Supreme Court. Mindful of the furious attacks against it following Dred Scott, the Court waited out a few prudent decades of quiescence before again asserting its power of judicial review.

But the Dred Scott case, a clear and unequivocal attack on the legislative powers of Congress, proved to be only a warning shot by the Court majority of a judicial onslaught to come.

Judicial Veto

At the end of the nineteenth century and the turn of the twentieth a burgeoning industrial society was producing enormous new wealth, unimagined in earlier ages. At the same time there developed intolerable hours and conditions of work in the new mines and factories. Congress as well as the States passed remedial legislation requiring reasonable work hours and more healthful working conditions, among other measures. In the State of New York this included a law regulating hours of work and other working conditions in bakeries. Joseph Lochner owned a bakery affected by the new regulations and was convicted of violating them. He challenged the New York law as a violation of his Liberty of Contract, lost in the State courts, and his appeal reached the Supreme Court in the case of Lochner v. New York, decided in 1905.

Liberty of Contract was an economic theory of the time. It included the idea that both parties to a labor agreement, the individual employee and the owner of a business, a mine, or a factory, had the same liberty to bargain for the wage paid to the worker. In an age lacking in union representation this meant, in reality, that the “liberty” the individual worker had was that of taking the wage offered or getting out of the way. In the Lochner case the Supreme Court ruled that the state law regulating the hours his employees could work in his bakery was a violation of Lochner’s constitutional Liberty of Contract. But there was a problem. While Liberty of Contract reflected the sort of economics the Court preferred, alas, it was nowhere to be found in the Constitution. What to do?

Well, after considerable legal legerdemain the Court decided simply to say Liberty of Contract is included in the Constitution. And if the Court says it is there, it is there, even if no one else had ever noticed it. The Lochner decision is a maze of legalistic complexities and specious judicial invention that was to spawn countless treatises and commentaries. Its essence, however, is clear enough. Judicial fiat, pure and simple, was the basis of the Lochner decision.

In implementing its newly invented Liberty of Contract theory, and in similar decisions as well, the Court claimed and entrenched for itself a veto power for which there is no justification in the Constitution. State and federal legislation that happened not to appeal to the economic or social views of a majority of Justices was sacrificed in mechanical application of the Court’s inventions. The Court’s insurgency had become a judicial war of attrition against democratic principles of the Constitution, and the Court was winning.

So invincible had its power grown that by the 1930s the Supreme Court felt free to exercise its judicial veto on a more expansive scale. In case after case much of President Franklin Roosevelt’s New Deal legislation, designed to get the country out of the Great Depression of the 1930s, was declared unconstitutional. Most of those decisions were decided by a thin 5-4 margin. Once again political outrage, deep and bitter, swept the land. Demands mounted to “pack” the Supreme Court by increasing its size with additional Justices favorable to the New Deal policies.

The New Deal crisis came close to calling down overt political retaliation against the Court to offset its own overt political transgressions. That might well have been a cure worse than the disease. However, the pressure against the Court did its work, if less overtly. Seeing the threat of political action, and with his finger to the wind, Justice Owen Roberts, one of the 5-4 majority, switched his vote. That saved the Court from political manipulation, and the New Deal from further judicial interference. As is commonly said, Justice Roberts’ decision to change his vote was “the switch in time that saved nine.”

Judicial Legislation

After its narrow escape the Court remained contrite during the remaining years of the 1930s, throughout the years of World War II, and for a time after the war ended in 1945. But an even more predatory minority of four Justices was already building a new head of steam, and a powerful head it would turn out to be. Weary of exercising merely a peremptory veto, the four rebel Justices itched to break out of the confines of simply reviewing acts of Congress, the President, or the States. Why be so restrained, these Justices seemed to hint in their militant dissents, when the Court might do the legislating itself and get it right the first time? But with the exception of the 1947 case of Everson v. Board of Education, discussed below, the militant minority was unable to pick up the necessary fifth vote to effect a broader constitutional revolution. Then President Dwight Eisenhower committed what he would later call “the worst damn fool mistake I ever made.”

Topeka, Kansas, like many cities across the country, maintained a school system that segregated colored children from white children (the terms then used). That had been a common practice ever since the Civil War and the abolition of slavery nearly a century earlier. Most would now agree that racial segregation is an evil practice, and in the case of Brown v. Board of Education of Topeka, Kansas the Court was asked to declare that practice unconstitutional. The specific question before the Court was whether separate schools for blacks and whites were a violation of the equal protection clause of the Fourteenth Amendment. That clause provides that, “No State shall … deny to any person within its jurisdiction the equal protection of the laws.” The Brown case had been argued orally before a deeply divided Court under Chief Justice Fred M. Vinson in December 1952. But Chief Justice Vinson died before the case had been decided.

There is a legal maxim warning lawyers and judges alike that, “Hard cases make bad law.” Was Brown v. Board a hard case, dealing as it did with the condition of “separate but equal” schools for the “colored” and “white” races? Justice Felix Frankfurter wanted to find that separate schools did violate the equal protection clause. This despite a hundred years of uniform practice to the contrary throughout the country following adoption of the Fourteenth Amendment. Frankfurter instructed his law clerk, Alexander Bickel, to search the background of the Fourteenth Amendment to ascertain whether its framers had intended the equal protection clause to apply to schools. Bickel learned unequivocally that the framers intended to leave matters of education to the States, as had always been the case. They did not intend that the Amendment should prohibit separate schools for the two races.

Then Frankfurter learned that Chief Justice Vinson had died, and was heard to murmur, “An act of Providence. An act of Providence.” President Dwight Eisenhower then committed his “worst damn fool mistake.” He appointed Governor Earl Warren of California to be Chief Justice. The activist minority of the Justices, straining to set the Court on a new and more aggressive course, sensed a landmark case in the making. Justice Frankfurter, not ordinarily considered a judicial radical, quickly seized the opportunity Providence had granted him. He briefed the new Chief Justice extensively on the pending Brown v. Board case, and in no uncertain terms. During intense discussions Frankfurter was able to persuade Warren that separate schools were wrong, whatever the intent or practice had been. Warren, who proved to be more the politician than the jurist, used his own persuasive powers and his new position of Chief Justice to achieve a unanimous decision by the Court, and wrote the opinion himself. The Court held that the equal protection clause does apply to school segregation by the States.

In a constitutional law case it is normally expected that the Court would rely on the Constitution, the intent of its framers, and perhaps also on its own prior opinions when interpreting that document. In his opinion Warren cited as the basis of his decision not a single provision of the Constitution or its Amendments (other than the equal protection clause itself), nor even past decisions of the Court. He did cite psychological and sociological treatises, the passage of time, and other appealing references.

A hard case? No doubt about it. Brown v. Board was a hard case. It would be hard to uphold segregation. Was the Court’s decision bad law? Here are some of the reactions to the decision. One law professor asserted that the case could only be described as “a revolution in constitutional law.” Another commentator, gleefully agreeing, saluted the Supreme Court as the “Revolutionary Committee” that had brought that revolution about. A Harvard law professor observed that the case did more than “shape” the law; it “upended it,” giving to the Fourteenth Amendment a meaning “exactly the opposite of what its framers designed it to mean.” Brown v. Board was Chief Justice Warren’s first case. That case set the Supreme Court on a course directly into the headwinds of its oath, and its mandate, to uphold the Constitution.

The Court’s actual “authority,” and the true basis for its decision, was the morals, social ethics, and personal preferences of the Justices themselves. Is this bad law, hateful as segregation was? Do the ends justify the means when the meaning of the Constitution itself is at issue? The Brown decision soon became a “magnetic field,” as some have called it. The case attracted all manner of causes and special interest groups that could not prevail in the political process, and came to rely on judges to do their work for them. Under such a regime does the Constitution remain a guiding and limiting factor in judicial decisions? Does it continue to uphold the limitation and distribution of power in government as intended? Does it have any meaning at all? Yes, it has meaning. The Constitution means what the Court says it means, just as it did in the Lochner case. At least until the Court changes its mind, as it did in the New Deal cases. Then the Constitution has a fresh new meaning, which lasts until the Court thinks it has an even better idea.

A “Living Constitution”

Not everyone objects to what the Supreme Court is doing in such cases, and there are those who gleefully applaud its advancing erosion of the Constitution as written. In 1969 Adolph Berle, attorney and former member of Franklin Roosevelt’s New Deal “brain trust,” joined those who praise the Court as a “revolutionary committee.” Robert M. Hutchins, former President of the University of Chicago, happily styled the Court “the highest legislative body in the land.” Those who value the Constitution, on and off the Court, take a different view of judicial inventiveness. As a result, two competing theories have developed as to how the Court should view the Constitution.

One view holds that the Court should hew as closely as possible to the written text, and to the intent of the framers as directly inferred from the text, or from contemporaneous written documents. That is commonly termed the originalist, or sometimes the strict constructionist, view. The other view insists that the Constitution should be “kept up with the times” by judicial innovation, rather than by amendment as provided for in Article V. Keeping up with the times by way of judicial legislation requires the pea of judicial usurpation to be concealed under the shell of what is euphemistically called a “living Constitution.”

The idea of a living Constitution, given a modicum of analysis, gives the plot away. Nature being as it is, a living thing is necessarily a dying thing from the day it is born. A “living” Constitution is already on its way to the grave. The only questions are how soon will it be dead, and what progeny, if any, will it leave behind? The excuse the Court offers to keep the Constitution “living” is to claim it is applying such tests as “evolving standards of decency that mark the progress of a maturing society.” “I hate that phrase,” Justice Antonin Scalia protested in a speech on the subject. The problem, he says, is that, “[S]ocieties don’t always mature. Sometimes they rot.” If the Constitution can be regarded as an instrument that is “living” or “maturing,” that allows a majority of the Supreme Court to make of it almost anything it might imagine. If the Constitution is seen as an embodiment of eternal concepts of freedom and justice, as the Founders intended it to be, that sort of instrument is cherished more devoutly. Its covenants are kept more scrupulously.

Under the “living Constitution” view, when democracy becomes an annoyance revolutionary activists seize the opportunity. Anxious to implement social and cultural innovations that democratic legislatures refuse to enact, they turn to a sympathetic Supreme Court. Majority rule of the people involves millions of voters, and requires convincing a majority of them to agree to new ideas and novel theories of government. This is difficult when what is proposed is not provided for in the Constitution, or is even contrary to its provisions. Implanting the same ideas by majority rule of the nine Justices of the Supreme Court is so much easier. That takes only five votes.

The motivating spirit of the latter view is revealed in a question Chief Justice Warren once asked of counsel during oral argument before the Court. The attorney had shown convincingly that the action in question was legal and constitutional, so that the Court should not intervene. In response Warren asked, “But is it good?” That is a startling, and prophetic, question. It reveals the motivation the Court is not quite willing to acknowledge openly. What that question tells us is that if the Court thinks a matter before it is good or not good, that will be its basis for decision. What the Constitution may require, or the oath the Justices take to uphold the Constitution, are matters of no consequence.

There is nothing in the standard of “goodness” that remotely ties that standard to the requirements of a written Constitution, the rule of law, or even the rule of reason. That standard reflects a search, not for justice, but for pure power, uncontrolled by any objective test whatsoever. Such a regime is what Supreme Court Justices, guided by the “living Constitution” doctrine, in fact represent. The true basis of that regime has rarely been so candidly exposed as it was in Warren’s question.

When the Supreme Court invents new law there is no consideration by elected representatives, no open debate, no decision made in the light of day, no one to be held accountable at the polls. Under such a regime the Supreme Court pretends that what it says is constitutional law, and the American people, from law professors to the man in the street, must pretend to believe it. It’s that simple. On the great social issues before the Court “we the people” have no standing and no access to the bar of justice.

Critical Theory

The “living Constitution,” already under intensive care, is threatened by a yet more virulent infection from professors and theorists able to speak more candidly than judges as to what they are actually up to. The academics have invented a concept called “critical theory.” Critical theory holds that there is no such thing as objectivity in the law, no truth in the concept of the rule of law. In American law schools critical theory and its progeny, feminist jurisprudence, reject the very concepts of reason and logic upon which America and the whole of Western society are founded.

Logic and reason are the basis not only of law, but also of science, mathematics, and technology, not to mention daily living. The critical theorist scorns logic and reason as weapons of persecution wielded by “white male trash” designed for the “hegemonistic suppression” of females and minorities. Logic and reason must, therefore, be sent to the scrap heap of history as part of an illegitimate Western enterprise. Under the pounding of critical studies the structure of rational existence is destroyed. The precepts of critical theory require that the achievements of freedom and the rule of law be sucked into a void of lost centuries where civilization was once invented.

The alternative, critical theory asserts, is that will and not law should govern the acts of men and women. Critical studies theorists thus define themselves as lawless in the literal sense of the term. They aim to destroy existing law while presenting no articulate substitute. Heather MacDonald, John M. Olin fellow at the Manhattan Institute, condemns critical studies as “law school humbug.” And it is dangerous humbug. She sees critical studies as a revealing example of how intellectuals seek to misshape American society, not only in the law, but also in any other area they can get their hands on. This, she states, is part of their “inflexible ideology” that blinds them to “the reality in front of their eyes” if reality doesn’t fit their theory.

When values fought for over the centuries are disintegrated in a burst of jackhammer theorizing the words of civil discourse disappear. People cannot sensibly speak to one another. Without the institutions of civil discourse there is no procedure for the redress of wrongs. There is no mechanism to define what a wrong is. There is no frame of reference from which to reach mutually agreed upon public ends or to define public needs. There is only the gathering darkness of brute force in which the “living Constitution” has completed its natural life cycle, and the Civil War has won a great battle.

11. “God Is Dead”
Clearing the Ground

Of the many disparate forces, some not at first aware of the others, which have slowly but steadily drawn together to form the Civil War, those directing the attack on religion are among the most lethal. “God Is Dead” is the shocking proclamation of the nineteenth century German philosopher Friedrich Nietzsche. That terrible dictum has since leapt from the pages of philosophical speculation and taken on a life of its own. It has become a battle cry of devastation for religion, striking most heavily against the Christian religion and the values of the Judeo-Christian tradition. As it happened, the crusade against Christianity was fashioned some two decades prior to the sixties uprising, but was eagerly adopted by the rebels. And it sprang from an unexpected source.

United States Supreme Court Justice Hugo Black may or may not have been aware of Nietzsche’s grim prophecy. But if God was not dead Justice Black persuaded a majority of the Supreme Court that He surely ought to be. From his seat on the high Court Justice Black set about digging God’s grave long before his Supreme edicts would prove so very helpful to the rebels of the developing Civil War.

The Wall of Separation

The founders of this country were religious men, as was most of the population of the Colonies. Among the denominations and sects in the American Colonies at the end of the eighteenth century there was no shortage of intolerance or persecution of opposing sects. In some colonies there was an established religion patterned after the Church of England, one of the tyrannies most colonists had hoped to escape. There was fear that some denomination, the Presbyterians in particular, would gain sufficient political support in the Congress provided for in the new Constitution to establish itself as the official religion of the new nation. Once a religion is established, public support is provided and conformity thereto may be required. Founder and second President John Adams saw the danger: “There is a germ of religion in human nature,” he warned, “so strong that whenever an order of men can persuade the people… that they have salvation at their disposal, there can be no end of fraud, violence, or usurpation.”

It was against that sort of background that the first sentence of the First Amendment to the Constitution begins as follows: “Congress shall make no law respecting an establishment of religion....” This is referred to as the “establishment clause” of the Constitution. Given the religious turmoil against which it was written, the meaning of that clause of the First Amendment is clear and essential, but also limited. What it prohibits is the establishment by Congress of a single nationwide religion sponsored, subsidized, or enforced by law. What if such a law were to be contemplated today? Which sect, cult, or congregation would be chosen? How would everyone else react? To pose such questions is to reveal the absurdity of any fear of a traditional religion being established in this country. The establishment clause has served its purpose and is irrelevant to modern conditions. But when Justice Black can persuade a Supreme Court majority otherwise, considerations of relevance or absurdity are of no concern.

Everson v. Board of Education was decided by the Supreme Court in 1947. The case involved a New Jersey statute that provided reimbursement to parents for money paid to transport their children to school on the public transit system rather than on school buses. In regard to children transported to public schools the Court held the statute to be valid. The additional question before the Court was whether such reimbursement could also be paid to parents who sent their children to Catholic schools, where religious instruction was part of the curriculum. In the Everson case Justice Black held for a Court majority that using public funds for transportation of children to Catholic schools was a violation of the establishment clause of the First Amendment. But Justice Black went much further than that.

The First Amendment not only prohibits the establishment of a state religion, Black wrote, but also does much more. That clause, he ruled, requires that there shall be a “wall of separation” between church and state. To make his intent vividly clear, Black added that the wall shall be “high and impregnable,” as if to prevent any whiff of God from leaking through. It is as though He were some pestilence threatening the health and safety of the body politic. Black went on. “No tax in any amount, large or small, can be levied to support any religious activities or institutions, whatever they may be called, or whatever form they may adopt to teach or practice religion.” This, said the Justice, is required by the establishment clause of the First Amendment. Therefore, Black concluded, neither the federal government nor the States “can pass laws which aid one religion, aid all religions, or prefer one religion over another.”

The Constitution guarantees against interference in religion only by Congress, yet much of the actual battle is fought over state and local government acts. How did these restrictions on congressional power come to apply to state and local governments? The answer is simple: because the Supreme Court thought they ought to apply. The Fourteenth Amendment to the Constitution was adopted in 1868, following the Civil War of 1861-1865, as part of an effort to assure that freed slaves would be guaranteed full rights of citizenship. That Amendment includes the provision that, “No State shall … deprive any person of life, liberty, or property, without due process of law.” This is referred to as the “due process clause” of the Fourteenth Amendment.

To support his attack on religion in the Everson case Justice Black relied on the due process clause of the Fourteenth Amendment. The due process clause, he held, “incorporates” the First Amendment provisions concerning religion. That is what the Court said, even though the framers of the Fourteenth Amendment had no such idea even remotely under consideration. And there is nothing included in the Amendment to express such a notion. Judicial creativity under a “living Constitution” strikes again. The literature concerning this issue is vast and complex, as would be expected from lawyers and academics. But judicial constitution making is at the heart of the matter, and that is how a part of Justice Black’s constitution became part of the American Constitution.

Some who lack “Supreme” vision find it difficult to see a connection between a constitutional prohibition against “establishing” a government sponsored and supported religion, and erecting a “wall of separation” between government and religion. Much less a wall so “high and impregnable” that not one atom of religious devotion can penetrate it. The First Amendment simply tells Congress it has no power to establish a universal church as the national religion. Congress has never attempted to do so, and there is no true establishment issue in this country. The Supreme Court knows this perfectly well. Nevertheless, the Everson case in 1947 heralded a prolonged and tenacious judicial vendetta against religion in America, in the name of enforcing the establishment clause.

Does anyone really believe that allowing public money to pay for schoolbooks in a religious school, as it does in secular schools, is the “establishment” of a religion? If the books were distributed impartially among schools of different denominations, which one would have been “established?” Does providing vouchers to send children to a private religious school rather than a dismal ghetto public school violate the establishment clause? The GI Bill of Rights, passed by Congress in 1944, gave millions of GIs money to go to college. They could go wherever they could get admitted. Has anyone noticed an establishment of the Catholic religion in America because some went to Notre Dame on the GI Bill? Or the establishment of the Methodist religion because some went to Southern Methodist University? Students using similar vouchers today might ease the financial burden of one religious school or another, just as the GI Bill surely did. But that hardly hands them the scepter of state power. Is offering a prayer at a public high school sporting event or similar school activity really a “law” amounting to the “establishment” of religion? Does a manger scene at Christmas in the public square “establish” Christianity by “law?” The Court has even prohibited a moment of silence in schools or other public places, fearful, evidently, that a prayer surreptitiously emanating from the spirit of a devout student might contaminate her neighbor.

The Supreme Court and lower courts following its rulings have at one time or another prohibited these and similar acts related to religious activities as amounting to a violation of the establishment clause of the First Amendment. It is such judicial decisions as these that violate the obvious and historically understood meaning of the establishment clause, not the acts these decisions prohibit.

In his book The Theme is Freedom: Religion, Politics, and the American Tradition M. Stanton Evans quotes founder and fourth President James Madison that “there is not a shadow of right in the general government to meddle with religion.” Evans shows how Justice Black and his colleagues have “stood the First Amendment on its head.” They have used provisions intended to protect the States from federal interference with religion “as a pretext” for interference by the Supreme Court itself.

The cumulative effect of these decisions has been to quarantine traditional religion from public life. Think of it as a kind of religious cleansing. It would appear that, having made up its mind to launch a grand war against religion, the Supreme Court needed a shield behind which to defend its purpose. Justice Black obligingly erected his “high and impregnable wall” to satisfy that need.

Free Exercise

The clause of the First Amendment prohibiting the establishment of religion is followed by a second clause providing that there shall be no law “prohibiting the free exercise thereof.” That is known as the “free exercise clause.” The reason for the free exercise clause was succinctly put by none other than George Washington. “Reason and experience,” he said, “both forbid us to expect that national morality can prevail in exclusion of religious principles.” Alexis de Tocqueville, that remarkably clairvoyant nineteenth century French observer of American democracy, agrees. Tocqueville points out that religion and morality are socially unifying forces “that prevent democratized men from falling back on themselves.” If democratic citizens should abandon religion and rely only upon themselves, Tocqueville cautions, that would generate “a politically enervating status” that would “prepare a people for bondage.” It is the truth of that statement upon which the rebels of the new Civil War now depend to establish their power and their dominion.

It is also a truth that a majority of the Justices on the Supreme Court usually prefer to ignore. They act as though the ancient principles of the Judeo-Christian heritage were dangerous idiosyncrasies from which a naïve population must be protected by their more perfect wisdom. And this, the Court repeatedly protests, is in the interest of keeping the government “neutral” on religious issues.

The “death of God,” as Friedrich Nietzsche prophesied, at least in the public square, has resulted in the near collapse of traditional morality. The faith of our founders has been excommunicated from public life. Moral guidance based upon that faith, established to formulate and assure civic order, lies buried under a shroud of judicial fabrication and usurpation. The advance of moral relativism, and rejection of core beliefs upon which Western civilization is founded, have followed. Yet a revival of the Judeo-Christian ethic could still threaten the new secular dominance, so the job of the Civil War is not quite finished. But by creating its Great Wall of Separation, the Supreme Court has erected a formidable barrier against “the free exercise thereof” through any revival of that ethic that would threaten the secular state religion.

The free exercise clause is an embarrassment the Supreme Court majorities have preferred to ignore. The Court’s decisions against religion amount to a law, or laws, which do prohibit the free exercise of religion— laws created by the Court. What the Court has done is to employ its constitutionally unauthorized interpretation of the establishment clause to justify its further violation of the free exercise clause. British philosopher Roger Scruton observes in an American Spectator article that there has never been a more effective means of “cutting off a whole people from its inheritance of moral and spiritual capital.” The Court has interpreted a Constitution designed to guarantee the right to exercise religious beliefs, Scruton perceives, “as an instrument of suppressing them.”

Saint Hugo’s State Religion

In their chants of “Hey, Hey, Ho, Ho, Western Civ Has Got To Go,” the rebels of the sixties believed that the diaphanous rhetoric of their rebellion included a total rejection of religion in all its forms. They did not know that their visceral need to destroy religious belief was at the same time a search for an alternative belief. They thought free sex deep and abiding enough to keep their movement alive and well, and did not know how enormous a void their revolution had created. The rebels were unaware that their slogans and their hatred were the scripture of a new religious anti-faith. Even today nothing so bristles a liberal as to be told that his ideology is an emotion, a belief, a fantasy, a religion. He will not concede that his new belief is not based on facts, cold reason, historical necessity, or reality as he imagines it to be.

It was the good fortune of the Civil War rebels that in their anti-religious passion they found in the United States Supreme Court a surprising and enormously effective ally. If God is dead and ready for burial, as He seems to be for many Americans, there looms a vacuum in the soul of America that can suck in almost any manner of substitute faith.

Far from being neutral, as it claims to be, a Supreme Court majority has cleared the ground for a new religion to fill the void its own decisions have done so much to create. Robert P. George is a professor of jurisprudence at Princeton University. In his insightful book The Clash of Orthodoxies: Law, Religion, and Morality in Crisis George calls the effect of these anti-Christian Court decisions an affirmation of a new “secularist orthodoxy” of the “isms.” That includes feminism, multiculturalism, gay liberationism, and lifestyle libertinism among others. These ideas are promoted behind the Wall of Separation established by the Court, so useful to disguise its false claim to neutrality. George’s book identifies those who espouse the Court’s secular creed of the “isms.” These include judges, professors, and others of the intelligentsia who work aggressively to impose their own beliefs on the whole of society as part of a new secular religion. To Be Politically Correct is the Second Commandment of that new secular religion, and Justice Hugo Black is its prophet.

The first job of Justice Black as prophet of a new state religion was to effect the massive collapse of social cohesion necessary to prepare the ground for new beliefs. It is upon the shattered rock of religious devotion, in the rubble of the 1947 case of Everson v. Board of Education, that a secular state religion has been founded. It is in the void created by that and similar Supreme Court decisions that the grinding mandates of the Civil War against America have been formulated and imposed. Justice Hugo Black surely deserves a pedestal as the patron Saint Hugo of the new order.

12. The Sword of “Justice”
A Common Destiny

The Supreme Court, though but a small platoon among the armies of the Civil War, has become one of the war’s front line contingents; the gavel of “Justice” one of its most deadly weapons. The Court’s attacks on the Constitution began long before the present Civil War took up its battle cry against the entire American republic. As the Civil War developed, for a time the twin assaults of the Court and those of the more general Civil War ran in parallel but disconnected courses. As this dual onslaught advanced it became apparent that the two insurgencies were relying upon similar emotions, ideals, and tactics. The covert Justices and the overt revolutionists share a zealous faith in whatever they believe to be just causes. Each feels a profound righteousness, and a passion for power that justifies any means that might be available to advance its insurgency.

Each assault was moving in its own way toward bypassing and eventually nullifying constitutional restrictions on power. The Supreme Court had manufactured various forms of “Justice” that, to its own satisfaction, the founders had neglected to include in the Constitution. The key to the Court’s powerhouse was fabricated in 1803 in the seminal case of Marbury v. Madison. It was in that case, as we have seen, that the Court assumed the authority to override congressional enactments, though there is no such authority in the Constitution. In subsequent opinions over a slow century or so the Court justified slavery, and validated racial discrimination through separate treatment of the black and white races even after slavery had ended.

By the turn of the 20th century the Justices began to impose their views of social and economic justice with ever increasing frequency. These cases were “conservative” in their effect of restraining the expansion of government at the time. But they cemented the foundation of judicial supremacy that was later to prove so useful in advancing liberal and radical causes.

In the opinions that resulted the Court has repeatedly interfered with state death penalty laws, and has imposed criminal “rights” that favor the accused over the victim. The Court has required States to change the way they elect members of their legislatures, and to redraw districts for election to the U.S. House of Representatives. The Court invented a right of privacy it then used to create a “constitutional” right to abortion, raising grave moral issues as to the value of human life. The Court has defined free speech to embrace pornography, including computer-simulated child pornography that cannot be distinguished from the real thing. Other cases attacked morals and institutions essential to the health of a democratic society. It eventually became apparent that these developments fit perfectly into the ongoing advances of the Civil War.
Without any official ceremony—perhaps through the laws of natural selection—the surge of judicial revolt merged with the rising tide of the Civil War. The merger formed a critical mass of authoritarian challenge to American ideals, values, and institutions. As the two revolutions fused in their drive to override American democracy, a radical Supreme Court majority has come to perform a dual function. In selected cases the Court becomes first a mighty platoon amongst the advancing legions of the Civil War. Once a strategic position is won the Court becomes a potent rear guard to protect the spoils. On the front lines the combined forces join in demolishing the American constitutional system one chunk at a time. As rear guard the Supreme Court embalms the destruction wrought as “constitutional law” to shield the elements of a true revolution from effective counter attack.

The Court has sanctioned the granting of racial and gender preferences in school admission and in public hiring practices. It has validated discrimination against white males contrary to the equal protection clause of the Fourteenth Amendment. It has intervened in the gay rights issue, pointing toward consecration of gay marriage that threatens the viability of the family. The Court’s incessant drive to destroy religion in public life has contributed heavily to the degeneration of morals, values, and behavior throughout society.

In these and similar decisions the Court has invented and imposed laws that reshape the culture, and threaten the structure of America. It has established itself as a commanding cadre, a kind of “grandfather” of the Civil War. In so doing the Court has transgressed the limits of judicial power to interpret the law as assigned to it by the Constitution, and usurped the legislative powers of Congress to make law.

Thomas Jefferson warned against allowing the Supreme Court to become the “despotism of an oligarchy.” He could hardly have issued a harsher or more pointed indictment to a nation freshly released from just such despotic rule. Abraham Lincoln warned against accepting judicial “chains of bondage,” speaking of the Dred Scott decision. There the Court declared the Negro to be of an inferior race, and nullified congressional efforts to reach a compromise over slavery on the eve of the War Between the States. Jefferson and Lincoln warned of the danger posed by an irresponsible Supreme Court more interested in inventing and imposing its own arbitrary sense of “justice” than in enforcing the rule of law legitimately enacted.

The Supreme Court as a judicial institution does not possess the full powers of government, and so cannot aspire to the awesome power of an authoritarian state for itself. Yet the Court, in accepting a common destiny with the Civil War, has the power, and apparently the will, to grease the treacherous slide that ends in repression and despotism.

Outsourcing the Constitution

The Supreme Court, entrenched at the leading edge of the Civil War, has begun to fabricate a weapon capable of even more catastrophic devastation. A number of the Justices have openly expressed dissatisfaction with the opportunities available to the Court to enlarge its jurisdiction. They feel confined if restricted to sources of law relating only to the American Constitution when interpreting that document. What these Justices seek is a vastly more powerful springboard for their revolutionary innovations. Their aim is to expand the Court’s search for precedent beyond American shores to include the opinions of foreign judges, dignitaries, or political celebrities.

At the Eleventh Circuit Court Conference in Hollywood, Florida, on May 17, 2005, Yale Law School Dean Harold Koh characterized Justices Anthony Kennedy, David Souter, Stephen Breyer, Ruth Bader Ginsburg, John Paul Stevens, and Sandra Day O’Connor as enlightened “transnationalists.” That means these Justices are willing to cite foreign courts and legislatures to justify further innovations to the American Constitution, which they will then label “constitutional” law. This amounts to outsourcing the American Constitution to foreign interpretation. The Justices to whom Dean Koh refers affirm, in their own words, that this is their intent.

In Atkins v. Virginia, a case involving the death penalty for the mentally retarded, Justice John Paul Stevens wrote for the Court. He observes in a footnote that “within the world community, the imposition of the death penalty for crimes committed by mentally retarded offenders is overwhelmingly disapproved.” Justice Stevens softens his citation by insisting that citing foreign opinion does not necessarily mean taking it as guiding precedent. But if citing such a precedent has no precedential value, why cite it?
In a 5-4 decision handed down in March 2005, Roper v. Simmons, Justice Anthony Kennedy found for the majority that the death penalty cannot be applied against juvenile killers under the age of eighteen. The Court accepted briefs from the European Union, the Council of Europe, and former United Nations diplomats. The briefs asserted, among other contentions, that the execution of juveniles is “an irritant to international relations.” Justice Kennedy’s majority opinion finds that these citations demonstrate an “overwhelming weight of international opinion” against the death penalty for those under the age of eighteen. But the opinion insists that this finding was not controlling in the Court’s decision. However, “international opinion” did serve, Justice Kennedy confides, as “respected and significant confirmation” of its ruling. It’s not difficult to see what role this Justice is rehearsing to play in the outsourcing drama.

Justice Antonin Scalia, dissenting in the Roper case, argues against taking “guidance from the views of foreign courts and legislatures.” In so doing, Scalia points out, the Court has determined that “the views of our own citizens are essentially irrelevant,” while giving “center stage” to the “so-called international community.”

Justice Ruth Bader Ginsburg discussed her reliance on foreign precedent in a speech before the American Society of International Law on April 1, 2005. Justice Ginsburg calls the tradition of looking solely to the American Constitution for its own interpretation an “island” or “lone ranger” mentality. She terms such an approach akin to the view that the Constitution was “essentially frozen in time as of the date of its ratification.” There is, of course, the alternative view that “freezing” the Constitution as written, unless amended as provided for by Article V therein, was precisely the purpose of writing and adopting it. Justice Ginsburg would have the Court be “more open to comparative and international law perspectives.” She would also like to see the U.N. Declaration of Human Rights cited in American court cases. Whether that would be just in addition to, or in place of, our own Bill of Rights is not clear.

On June 25, 2008, the Incisive Media website LAW.COM reported an address that Justice Stephen Breyer had given to the Brookings Institution the previous day. Justice Breyer was quoted as saying that regardless of complaints against the practice, judges would look increasingly to foreign sources for legal precedent when considering American constitutional cases. This, he explained, is part of the duty of any judge, foreign or American. Why? Well, says Justice Breyer, it is part of his duty “to impose structure on madness.” There is no indication in his speech as to how, or by whom, madness is to be defined in order to be restructured. Does he refer to the American Constitution as madness? One might wonder.

Justice Breyer predicts that a challenge of the next generation will be to determine whether our Constitution “fits into the governing documents of other nations.” Which nations’ constitutions the Court might select to try on for a fit in one case or another we have no idea. But if ours doesn’t fit theirs, the implication is clear enough. Ours will need some Supreme tinkering until it does fit. The resulting pattern of American constitutional “law” will be edicts of the “international community” according to whatever sources the outsourcers’ sorcerers might fancy on a given day.

Justice Sandra Day O’Connor, speaking to a lecture audience sponsored by the Southern Center for International Studies in October of 2003, offered her prediction. “Over time we will rely increasingly—or take notice, at least, increasingly—on international and foreign courts in examining domestic issues.” She is more explicit on a similar occasion, predicting the Court will “increasingly” make its decisions “in deference to international law and foreign opinion.” Reliance on foreign judicial opinions, this Justice foresees, “may not only enrich our own country’s decisions, I think it may create that all-important good impression.” That is an interesting comment. Is making a “good impression” the “all-important” job of the Supreme Court? And whom are she and her like-minded fellow Justices trying to impress? Would it be a fair guess that an “all-important good impression” is intended to register, not within the American judicial system, or with the American people, but amongst her peers of the world’s judicial elite?

The worldwide sources to be consulted by these Justices may include decisions of courts in countries that have no constitutional basis of their own to begin with. No bill of rights. No rule of law. In the 1999 case of Knight v. Florida Justice Breyer found “useful” a Zimbabwe judicial opinion to the effect that inordinate delay in execution following a death sentence amounts to “inhuman or degrading” punishment. Zimbabwe, of course, is known for its propensity to avoid the agony of delay by applying the death sentence immediately, often sparing the poor fellow even the “inordinate delay” of a trial.

Robert Bork, law professor, former federal appeals court judge, and a senior fellow at the Hudson Institute, has spoken frequently on the role of the Supreme Court in undermining the Constitution. He sees a tacit understanding between the U.S. Supreme Court and international courts toward the creation of a “global bill of rights.” Such “rights” as may emerge will be fashioned by courts some of which are not bound by constitutions at all. In his book Coercing Virtue: The Worldwide Rule of Judges Judge Bork concludes that what he calls the “New Class” (including judicial insurgents) will stop at nothing to impose a revolutionary ideology on the United States.

What Judge Bork describes is part of the trendy new “transnationalism” of which Dean Koh spoke. This movement is sweeping through the radical anti-American left of this country and throughout the world. The idea is that nationalism and the nation state are outmoded and need to be replaced by institutions that transcend national borders. Something like the European Union centralized in Brussels, Belgium. The EU is run by what former Czech president Vaclav Havel has described as the type of “layered bureaucracy” that characterized the tyranny of the former Soviet Union.

The European and EU judicial systems are based on the Napoleonic Code, enacted in 1804 by the French dictator who arose following the chaos of the French Revolution. The Napoleonic Code and the system of civil law provided for therein are designed to enforce governmental edicts from the top down. The American system is founded on the common law of England. The common law, by contrast to the civil law, evolved over centuries as English judges strove to develop individual and property rights, among other matters. This was done on a case-by-case basis in deciding disputes brought before the English courts. The common law is a bottom-up system of law, concerned with the individual rights of Englishmen. The civil law is a top-down system designed to enforce the will of the state.

Though “human rights” are supposedly a concern of transnational idealists, such “rights” are worthless if there is no provision in law for their enforcement as the rights of individuals. The institutions of transnationalism, with statism and the civil law as their model, provide no such enforcement. The nation state is the cradle and only protector of individual rights, human rights, or the rule of law based on democratic enactment and consent.

Outsourcing American constitutional interpretation to rely on a system of transnational law can only result in diminution or destruction of the constitutional rights of Americans. Ultimately this destruction will reach the Constitution and the American nation itself. What judges in America have wrought, following transnational principles, Judge Bork asserts, “is a coup d’etat—slow moving and genteel, but a coup d’etat nevertheless.” It is a judicial remaking of American “political, social, and cultural life.” Justice Antonin Scalia, speaking from the eye of the hurricane, protests that, “Day by day, case by case, [the Supreme Court] is busy designing a Constitution for a country I do not recognize.”

The judicial counter-revolution against the American Constitution that began in 1803 with the case of Marbury v. Madison has reached its amazing fruition. There is to be a kind of judicial treasure hunt for bright nuggets of foreign law and “world opinion” sparkling on alien shores. These judicial jewels are to be polished and set to illuminate or displace provisions of our own Constitution that these Justices find to be insufficient, or even repugnant, as written. What is projected is a judicial imperium, in league with a glittering “international community” of omnipotent judges, joined with the internal forces of an ascendant Civil War. Our Supreme Court would then be free to pick precedents from any foreign authority it chooses: democratic, socialistic, or outright totalitarian, with no reference to a constitution, democratic consent, or majority rule.

In its long and tragic transition the Supreme Court has rejected the opportunity to stand against those who detest freedom, and prefers to impose its own command and control in the place of free institutions. The one constitutional institution that might have acted to arrest the enemies of the true American Republic has chosen instead to join their deadly enterprise. The Court has accumulated over the centuries the illicit authority to wield its own sword of “justice” against the clauses of the Constitution with which it disagrees. Or to add new amendments where that seems feasible. From its inception the Supreme Court has carried out a steady, and increasingly intense, attack on the Constitution of the United States of America. As it stands now, the only majority the Civil War insurgents need to legitimize their work, however ghastly it comes to be, is a majority of five, and the snap of a gavel.

A hard assessment? Yes. But an assessment that is even harder to avoid.
V. Scientists Sign Up
13. Corrupt Science
Scientific Method

Science and the scientific method are fundamental to the development of Western Civilization, and perhaps that civilization’s greatest gift to the world. The imprimatur of science has been a baseline of authenticity and trust in the development of the West. Faith in scientific integrity is now under attack on two fronts. One front, considered in the present chapter, is simple corruption in such areas as medicine or public health. This occurs through the adoption of practices that disregard the rigorous requirements of the scientific method. The result is areas of politicized science that cannot be relied on for authenticity. The second and more serious front is in crucial areas of dispute and uncertainty at the leading edge of science, most egregiously in bioethics and neuroscience. In those areas, considered in the following chapter, some scientists are claiming an omniscience that threatens democratic society.

When a material or a process or an idea is advertised as scientifically proven, or scientifically tested, how can the public be sure that the material or process is safe to eat, or use, or trust, as the case may be? It depends on whether certain questions were asked and whether computations or experiments were done in the proper manner. That is, was the scientific “proof” offered arrived at through the scientific method?

Which falls faster if dropped at the same time, a feather or a ball of lead? Why does an apple fall down from the tree instead of some other direction, or just stay on the tree where it is? How much material, or mass, does it take to disintegrate a city? The “obvious” answer to the feather and lead ball question is true if the test is performed off your back porch. The lead ball wins because the feather is slowed by particles of air it encounters in the fall, while the lead ball is not. The inquisitive scientist asks what the result would be if both were dropped in a vacuum. There the feather and the lead ball fall at the same rate of speed and hit the ground at the same time. Primitive people wouldn’t likely have asked either the apple or the feather and ball questions, and certainly not how to disintegrate a city. If they thought about the apple at all, it might have been to thank the tree gods for something to eat. Western society developed logic and reason rather than chance or superstition to find the answers. It developed the scientific method.

When a theory is proposed evidence must be gathered and the theory tested before it can be accepted as scientifically valid. Or, if a mass of evidence is discovered that cannot be understood the scientific method is called upon to see if a theory can be deduced to explain the evidence. Albert Einstein followed the first approach. He proposed a theory, his famous hypothesis that E = mc²: energy equals mass times the velocity of light (the constant “c”) squared. The velocity of light is 186,000 miles per second. That figure squared (multiplied by itself) yields a very high number. That number multiplied by the mass of the object in question (“m”) yields an even more astronomical figure. This means that enormous energy is locked in even tiny fragments of material. To test Einstein’s theory extensive experimentation and analysis were required. The final proof is the nuclear bomb. A city can be exterminated by the energy contained in the mass of certain material no larger than a grapefruit. Einstein had an idea and worked to find the evidence to prove it.
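As a rough illustration of that arithmetic, not spelled out in the text: taking the velocity of light in metric units as about 3 × 10^8 meters per second, a single gram of matter corresponds to

\[
E = mc^{2} = (0.001\ \mathrm{kg}) \times \left(3 \times 10^{8}\ \mathrm{m/s}\right)^{2} = 9 \times 10^{13}\ \mathrm{joules}.
\]

That figure is on the order of twenty kilotons of TNT, roughly the yield of an early atomic bomb, from about one gram of converted mass.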

Sir Isaac Newton, the 17th century British mathematician and scientist, took the other route. He observed phenomena he could not explain, and set out to formulate a theory that would explain what he observed. Newton supposedly became interested in falling objects one day when an apple fell off a tree in his garden. He was also intrigued by the feather and lead ball experiment. Something was pulling these objects downward. What force was it? What of the earth and the moon? Why didn’t the moon fall into the earth like the apple fell to the ground? Were there interacting forces there, too, something like the force that acted on the lead ball and the feather? Further thought and experimentation led to Newton’s law of gravity, which holds that all objects in space affect each other according to certain mathematical principles.
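Those “certain mathematical principles” can be stated compactly. A standard statement of Newton’s law of universal gravitation, given here only as an illustration and not quoted from the text, is

\[
F = G \, \frac{m_{1} m_{2}}{r^{2}},
\]

where F is the attractive force between two bodies of masses m₁ and m₂, r is the distance between their centers, and G is the universal gravitational constant. The same inverse-square relation governs the falling apple, the lead ball and the feather in a vacuum, and the moon held in its orbit around the earth.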
The scientific method requires curiosity, initiative, objective skepticism, and above all logical reasoning in approaching nature and the universe. When a new scientific principle is offered, its method of derivation must be laid out for others to test and verify in the same way in which the theory was originated. Only then can the theory be accepted as valid. The scientific method is central to the explosion of inquiry and creativity that distinguishes Western civilization from societies that lack that approach.

The first obligation of the scientist, the key to the scientific method working properly, is honesty, integrity, and objectivity. When a new idea is presented as “scientifically” proven our trust in accepting it depends upon the assumption that the principles of the scientific method were followed in its validation. But, alas, science and scientists sometimes part ways. Even among scientists and practitioners rigorously trained in their discipline a point of view can develop, and be passionately held, that disregards the requirements of scientific integrity. That is occurring with increasing frequency today, which threatens an essential element of free society.

Nurse’s Orders

You are seriously ill and must be hospitalized. Can you be sure of receiving relevant, adequate, professional, and scientifically based treatment? Cross your fingers. Medical practitioners in some areas are being enticed away from the rigors of medical procedures based on scientific reasoning and processes. They are being attracted toward new doctrines derived from illogical theories, or based on beliefs irrelevant to scientific medicine. The results could touch millions of patients seeking treatment.

Sally Satel, M.D., is a psychiatrist at Yale University School of Medicine and a fellow at the American Enterprise Institute. She relates details of these new procedures in her book PC, M.D.: How Political Correctness Is Corrupting Medicine. We learn of a new method of treatment by nurses called the “therapeutic touch.” This consists of the nurse moving her hands down the patient’s body to adjust the “human energy field.” This is done without touching, despite the designation of the procedure. Satel reports that such practitioners are advised that, on reaching the tips of the toes, it is sometimes necessary to shake the bad energy off their hands.

Aren’t nurses taught better than that during their training? Not necessarily. It seems that schools of nursing are getting to look a lot like colleges of education for teachers. The curricula are based on procedure, technique, and ideology, with very little substance. Dr. Satel finds that admission policies for nursing schools often lack rigorous standards, and that professors who appear not to have been introduced to the scientific method teach the courses.

Satel relates that in some areas of medicine PC credos such as victimology, multiculturalism, and redistribution of wealth are replacing science. Prescriptions emanating from these new sources, she cautions, are likely to be about “narrow ideas of social justice” that are sure to be “hazardous to your health.” Elizabeth Whelan, President of the American Council on Science and Health, recommends Satel’s book as essential reading for those who may require healthcare.

Dr. Eric Chevlen, a practitioner of medical oncology and pain medicine, observes that the rigors of the scientific method are seen by the advocates of political correctness to be a “devaluation” of the values of the feminist movement. The publication Nursing Science is quoted as advocating a new feminist approach that is “open-ended, ambiguous, dynamically constructed, incessantly questioned, endlessly self-revising, never set, but floating and moving with the river of life.” How would you like to take a swim in that river when your appendix is about to burst or you have a heart attack?

One possibility for those who might feel inhibited by the requirements of true science is to make up new facts and concoct new theories to simulate scientific proof of a theory or practice that is being advocated.

Manufactured Facts

Suppose you have a great idea for improving the human condition, but the facts you get out of the laboratory don’t fit your good intentions. What to do? Hey! Make up some better facts. Here’s how you do it. Stanford University professor of biological sciences Stephen Schneider describes himself as a “human being” as well as a scientist, and wants to help make the world a better place. The first thing to do, Schneider says, is to find a cause that will improve the world. This cause, Schneider makes clear, is to be derived from the “human being” point of view. That’s good sentimental cover right there, and diverts attention away from the fact that scientific validity is not necessarily the basis of what is going on. The next step is to excite the public’s attention. That entails “getting loads of media coverage.” To make sure the coverage is effective it must include “scary scenarios,” and “simplified, dramatic statements.” Finally there should be “little mention” of any doubts the authors of such a program might have.

Schneider recognizes this approach might trouble some of his scientific colleagues. That, he counsels, requires that everyone engaged in the program must seek his or her own “right balance” between “being effective and being honest.” The guiding principle, the “human being” counterweight, would appear to be that the more passionate the feeling for a doctrine one is dedicated to, the heavier that agenda weighs in the scale when balanced against scientific honesty and integrity. The theme song of the German Nazis was Deutschland über alles, Germany over all. Dr. Schneider’s view is something like passion über alles. In either case the end justifies whatever means may be necessary to achieve it. And if you have to massage the truth to be effective, make it a whopper. Dr. Schneider is not alone in these ventures of juggled facts versus scientific integrity.

A panel of physicists determined that a scientist at the prestigious Bell Laboratories was guilty of falsification and fabrication of data in its molecular transistor program in a bid for recognition and prestige. He was a young scientist trying to get ahead. At Lawrence Berkeley National Laboratory the discovery of a new element was announced to the public. A new addition to the periodic table of elements was a scientific sensation. It was then found that the announced discovery was based on falsified data. There was no such element. Richard Smith, editor of the British Medical Journal, told science writer Sharon Begley that cases such as these that do come to light are “the tip of the iceberg.” And such corruption goes right to the top.

James D. Watson and Francis Crick are credited with discovery of the double helix structure of DNA (deoxyribonucleic acid), upon which is thought to be encoded all of the information needed for heredity. That earned them the Nobel Prize in 1962. Watson describes scientific research as exceedingly competitive, conducted without regard for ethical principles, and as driven, not by a cool search for the truth, but by a passionate craving for recognition and fame. Watson reveals his own guilt, as well as Crick’s, in that pursuit. The research of fellow scientist Rosalind Franklin was critical to the DNA theory of Watson and Crick. So they appropriated her findings, incorporated them into their own work, and gave Franklin no credit. Why did they do this, and how did they get away with it? Watson says that it was because they needed her work and there was nothing to prevent their taking it. And by the time the Nobel Prize was awarded, Franklin had died.

How is it possible to evaluate the next Armageddon scenario splashed across the media and boomed through the ether that is presented as “scientifically” proven? How are valid scientific findings based on scientific principles to be distinguished from reports that arise from special interests or unscrupulous pursuit of fame and fortune? How can a conscientious citizen know when the “balance” between being effective in support of a program and being honest about its merits may have been tilted toward exaggeration or political causes by scary scenarios?

Political Science

Political science is normally thought of as an academic discipline concerned with the study of politics and government. “Political science” must now include physical or biological science that has been politicized. It is science that is cut, trimmed, and tailored to serve a pre-determined political end. The environmental movement is a prime contributor to this process. George Gilder and Richard Vigilante assert in a joint article that to justify their continued agitation, and to maintain their status as heroes of the earth, the environmental Greens have enlisted tribes of trial lawyers in their cause. The lawyers in turn have recruited swarms of scientists to work up “scientific” evidence to fit the needs prescribed. What if the results they testify to in court in support of those who hired them don’t quite look like what actually came out of the laboratory? Well, jiggering the “scientific” facts a bit to make it look right anyway is the point, isn’t it?

Hordes of environmental idealists go about charging green windmills, backed by battalions wielding false science as their lethal weapon. In pursuing false causes finite resources are expended that might otherwise go to alleviate suffering of people in real life. Gilder and Vigilante comment that with the banning of DDT, the ban on development of nuclear power, and similar policies that retard growth, “environmentalist excesses” have killed “more people than environmental pollution ever did.” Much of the killing was based on “scientific evidence” custom-made for the purpose.

The cultural and political consequences of this new “political” science reach well beyond immediate environmental issues. The Greens, say Gilder and Vigilante, grasp for the same goal as did the old socialists, but with different arguments and tactics. The socialists played on envy to excite the proletariat to revolt against the capitalist exploiters, but that didn’t work very well. So the Greens play on fear, grasping for power as complete and intrusive as the socialists ever sought. Their goal is the same: command and control over the lives of their fellow citizens. A litany is developed, and their drive for power fuels a political policy that is insistently and incessantly presented as scientifically validated environmentalism.

The “romance of Marx,” says Tom Bethell, a senior editor for The American Spectator, has endured in a realm of physics “unmoored from experiment.” Bethell is concerned about the novel and even bizarre turns science is taking. He cautions that in science as well as art “revolutionary upheaval has been applauded for its own sake.” With the scientific method abandoned, fear is made to prevail over scientific probability and objectivity. Costly remedies are enacted, economic penalties rise, and human life is degraded.

Bethell fears that science is undergoing a “subtle but fatal change.” He finds a kind of coercion to conform, rather than to investigate and compete, in order to maintain lucrative programs or defend set ideas. This becomes a political goal no longer subject to scientific analysis. The integrity of scientific research and the structure of reason in human affairs deteriorate when the necessity for political support trumps scientific integrity. People and organizations that had seemed trustworthy are seen to have misrepresented the truth, and even to have advanced known falsehoods, to serve political and ideological purposes. Global warming and climate change sit at the top of the list.

Sharon Begley suggests that C. P. Snow got it right in a novel involving science at prestigious Cambridge University in England. A Cambridge master comments that scientific fraud “is of course unthinkable,” and that “unnecessary publicity” about it is “unforgivable.” The best answer to corrupt science is skepticism and the application of common sense to feverish claims of impending disaster and repeated “scary scenarios.”

But there is an alternative approach to justifying false calamities and scary scenarios.
Precautionary Principle

From the frigid Scandinavian northland, ventilating from the fresh frozen minds of the Nordic Council’s International Conference on the Pollution of the Seas, a challenge to the vitality of reason and the scientific method is sweeping across Europe. This Council of “political scientists” calls for elimination of alleged pollutants even if scientific evidence is “inadequate or inconclusive.” The problem is, the Council reports, that requiring scientific proof “has posed a monumental barrier in the campaign to protect health and the environment.” Thus, the need to establish a “causal link” between an allegedly harmful practice and a proposed remedy should be rejected. The Council would substitute the “precautionary principle.”

A spokesperson for the Council explains how the principle works. If the cause of some new environmental scare cannot be proved scientifically the deficiency of proof must be disregarded. The “greater good” must be met regardless of inconvenient scientific findings, or lack of findings. Presumably the greater good would be defined, under the precautionary principle, by anyone who can generate the “loads of publicity” necessary to frighten the public into believing drastic measures are required. There is no test as to whether those advancing the greater good know what they are doing or not. There is no reasoned evaluation of the assertions relied upon for precautionary policies and penalties. Nor is consideration given as to what might be accomplished through different allocation of limited resources. No account is taken of the harm that might come from what is being proposed. All that is nothing but bothersome interference with those dedicated to doing good.

This is not a satire. The precautionary principle actually operates that way.

If the precautionary principle is allowed to take hold there is no way “to draw the line between real and imaginary health risks.” So states Dr. Bonner R. Cohen, a senior fellow at the National Center for Public Policy Research. As Cohen observes, this means that regulators, following the precautionary principle, can go ahead and dictate policies regardless of the effects their regulations might have on the health and lives of people affected. Or what the cost might be to economic wellbeing if the economy must support unsubstantiated or even harmful projects.

If you see a wisp of smoke from an electric generating plant the remedy is simple. Shut it down. Lights out. If you think you see a black-ringed grabble goose with red toes in your neighbor’s field have the whole area roped off as an endangered species habitat. No matter if the neighbor can no longer make a living growing food on his farm.

The entire global warming affair can accurately be described as application of the precautionary principle on a global scale. Jim Manzi is CEO of an applied artificial intelligence software company. Manzi characterizes the precautionary principle as an idea that results in a “bottomless well of anxieties.” Manzi is particularly disturbed about the proposed global warming remedies, which would require a debilitating sacrifice of trillions of dollars of economic growth and development in the near term. This in the name of avoiding “inherently uncertain results,” projected, but not proved, to occur in the long term. But the global warmers claim the right to bull right ahead whether they know what they are doing or not, wrapped in the righteousness of their allegedly good intentions. Such policies, Manzi finds, “conceal hubris in a cloak of humility.”

A beneficial pre-precautionary principle might be to pull the plug on the Nordic Council. Alas, it’s too late. The European Union has adopted the precautionary principle for regulators throughout its extensive empire. The EU now substitutes its clairvoyant hunches for the scientific evidence that might or might not support its regulatory proclivities. The regulator’s judgment is the law, even as to the shape of bananas that may be imported into its domain.

The precautionary principle would furnish armies of the Civil War in America with a weapon by which to regulate and destroy lives and property of citizens on a “nuclear” scale. Under the precautionary principle the restraints of science, law, the Constitution, or a democratic test in public elections are all wiped out. Marching orders designed, not to accomplish any demonstrable public good, but to control and command the public itself would prevail. Perhaps we should then be asked to accept priestly readings of the spirits of wooded glens, or of the echoes of ghostly animals painted on walls in ancient caves, to determine what new precautionary steps must be taken.

Skeptics Beware

Sampling the contents of the magazine Wired in a Los Angeles bookstore in February 1997, University of Aarhus professor of statistics Bjorn Lomborg ran across an interview with University of Maryland professor of economics Julian Simon. Simon contended that environmental conditions and living standards are getting better for most people in most parts of the world and will continue to do so. “I was provoked,” says Lomborg in the preface to his book The Skeptical Environmentalist. As a self-described “old left-wing Greenpeace member” and ardent environmentalist he doubted Simon’s conclusions.

Lomborg resolved on his flight back to Denmark to question Simon’s sources, confident that it would be easy to disprove his conclusions. “Honestly,” Lomborg confesses, as he got to work on the project with the aid of his best students, they expected to show that most of Simon’s book was “simple, American right-wing propaganda.” The result, instead, was to make of Lomborg himself the “skeptical environmentalist” of his book’s title.

The Skeptical Environmentalist is a rigorously documented book of 515 pages, including 2,930 footnotes and a 71-page bibliography (the lengths these academics will go to!). The subtitle of the book is Measuring the Real State of the World. Lomborg builds a “Litany,” as he calls it, of causes advanced by various groups of environmentalists, and examines each for its cost and effectiveness. Lomborg concludes that the world’s environment is not deteriorating, as the fund-raisers for environmental groups argue, but is in fact improving, just as Simon had said. The book demonstrates that we face problems, not calamities. This should be good news to concerned environmentalists. Is it?

The British publication The Economist reports some of the reactions to Lomborg’s work among earth scientists and environmentalists. He is called a liar, a fraud, and worse. Anger, vitriolic and abusive, is directed toward Lomborg particularly in regard to global warming. There his studies show that the cost of proposed remedies is extremely high, that the resources required could benefit more people in many other ways, and that the proposed remedies are therefore a misuse of available resources.

Believers in global warming, scientists and non-scientists alike, refuse to share a platform with Lomborg. When he turns up at Oxford to talk about his book, the Economist reports, “the author (it is claimed) of a forthcoming study on climate change throws a pie in his face.” In a column relating the pie-throwing event James K. Glassman, resident fellow at the American Enterprise Institute, includes a picture of Lomborg’s face splattered with the pie, apparently a cream pie.

Lomborg relates that among his friends and academic colleagues he expected the initial reaction to his book to be the same “gut rejection” of skepticism as had been his own initial reaction to Simon’s article. There was that and more. Lomborg confesses disappointment that after extensive discussions, seminars, articles, and speeches, many of his associates refuse to change their ideas to conform to the facts he presents. But they make no attempt to dispute the facts!

Matt Ridley, the author of Genome, was asked to review four papers by noted authorities on the environment designed to refute Lomborg’s work. Ridley reports that he was “astonished” to find that none of the four had “laid a glove on Lomborg.” What Lomborg has revealed, Ridley concludes, is “a narrow but lucrative industry of environmental fund-raising that has a vested interest in claims of alarmism.”

Lomborg has written a subsequent book, Cool It: The Skeptical Environmentalist’s Guide to Global Warming, which presses the cost-benefit analysis. He concludes that no one is willing to pay the enormous cost of reducing the CO2 emissions said to cause global warming to the degree that the frantic global warming crusade would require. Those costs are projected at $95 trillion “up front” in economic destruction. Lomborg compares that cost to the projected cost of preventing the one million deaths that occur annually from automobile accidents. Simply reducing the speed limit to 5 miles per hour could eliminate deaths in automobile crashes. The real cost of that would be both economically disastrous and politically impossible. Either project, reduction of global warming or speed limit reduction, is absurd on its face and should be ushered out of town. Except that the sledgehammer politics of the Civil War warmers has so far kept their cause alive.

American Enterprise Institute fellow Steven Hayward comments that Lomborg has shown (as have many others) there is no realistic, large-scale, near-term alternative to fossil fuels. So, Hayward concludes, as Lomborg might, “Deal with it.” Hayward adds the amusing prediction that environmentalists will eventually find that “global warming is the issue that ate them alive.” And the scientists who cottoned up to the global warming fiasco may find as much egg on their faces as Bjorn Lomborg had pie on his. When the climate changers’ remedies get down to closing plants, destroying industries, and eliminating jobs there might be real climate change—in the political climate.

Degraded science is one more loosening of the principles upon which America and the West are founded. That contributes to the social degeneration upon which the Civil War depends. Columnist John Derbyshire observes that scientists working in an atmosphere heavily polluted by politics get their heads “stuffed with all the sub-Marxist and ethno-masochist flapdoodle of the modern academy.” They come to hate capitalism, Western civilization, and even “their own ancestors.” This induces a social climate that facilitates the triumph of raw will over ordered reason.

14. Omniscience Science
Bioethics

If we break the word “omniscience” in two the result is instructive: “omni” and “science,” which might be loosely rendered as science everywhere and above everything. And that is what at least two major branches of science, bioethics and neuroscience, are claiming. Each of these disciplines holds that its findings apply everywhere and over everything. Nothing is to stand in the way of the truth as revealed and interpreted in their investigations. Whatever its origins might be, scientific or not, such a stance is that of an absolutist, a tyrant, a dictator. In taking this position each of these disciplines marshals critical forces that have the effect of subverting the institutions of a responsible democratic society. The battle of the Civil War to undermine and replace America advances on these two strategic fronts.

Bioethics, or biomedical ethics, has roots in the late 19th and early 20th centuries when many who held themselves out as Progressives were proponents of eugenics. Eugenics is a movement dedicated to manipulating the normal working of human biology with the goal of improving the race. Eugenics, though not called bioethics at the time, held that it was necessary for the health of the human race to kill off misfits, such as people who were too sick, or even too stupid, to contribute to the wellbeing of society. Believers in eugenics included such luminaries as Woodrow Wilson, twenty-eighth President of the United States, and Supreme Court Justice Oliver Wendell Holmes.

It was Justice Holmes who wrote a Supreme Court opinion in Buck v. Bell, 1927, upholding sterilization of “a feeble-minded white woman” with a family history of bearing feeble-minded children. Holmes held that “three generations of imbeciles are enough.” It was better, he asserted, to cut the fallopian tubes than to wait “to execute degenerate offspring for crime.” During World War II the practice of German eugenicists in exterminating the Jews gave eugenics a bad name, and caused many geneticists elsewhere either to condemn the German atrocities or to keep their thoughts to themselves. Today a respected element of the scientific community is introducing similar theories that have the same effect of degrading human dignity and diminishing the value of human life.

Are kids and chimpanzees of equal worth? Some professors say so. Others defend sexual relations with pigs and chickens. Some argue legal rights for sharks and dogs. Is it OK to pull grandma’s feeding tube whether she and her family consent or not? Hang on. We’re talking bioethics here. In the nation’s academic enclaves, and in its laboratories of advanced science, such ideas as these are asserted seriously and tenaciously. Their proponents are richly rewarded, their critics savagely attacked.

Princeton Professor of Bioethics Peter Singer and a co-sponsor launched the Great Ape Project some 15 years ago. Wesley J. Smith is a law professor at Berkeley and a senior fellow at the Discovery Institute. Smith sees the Great Ape Project as having the goal of obtaining a United Nations declaration “welcoming apes into a ‘community of equals.’” The Spanish parliament appears on the verge of writing the essence of the project into law. That, says Spanish animal rights activist Pedro Pozas, would be the “spear point” that breaks the “species barrier.” Welcoming animals into a community of equals, Smith notes, would also destroy Judeo-Christian moral philosophy. The doctrine that holds all humans to be of “equal and incalculable moral worth” would be made a travesty were it extended to animals.

An American extremist group that calls itself Community Environmental Legal Defense Fund advocates changing the ecosystem from the status of property under the law to “rights-bearing entities.” Your cherry tree could sue you if you fail to provide enough water for it. And should you let the cat out (of which you are a temporary guardian) and it kills a robin? Beware of the Robin Redbreast Security Patrol. This is the stuff of musical comedy, satire, or parody. Yet it is taken seriously and promoted as the ultimate in advanced bioethics. The inmates are taking over the asylum. But there is an animus in this that is worth pondering.

The attacks on the individual and the concept of human exceptionalism in the world of animate creatures are aimed at the heart of Western civilization. The basis of the American experiment is the concept that all human beings are created equal, and that they have a right to “life, liberty, and the pursuit of happiness.” This right is not an earned right, or a right granted by the state, or even a constitutional right. It is a right that exists as a result of the exceptionalism of the human being, a right conceived in the American Declaration of Independence as having been endowed by our Creator. There can be nothing more destructive of decency and civil order than the idea that a worm on a fishing line has the same natural right to claim integrity or even to exist.

Singer predicts that within the next 35 years belief in the sanctity of human life will “collapse” as a result of new scientific, technological, and demographic developments. This will leave “only a rump of hardcore, know-nothing religious fundamentalists” to believe that every human life is of a higher order than animals. The belief that every human being has a claim to equal treatment before the law, in the nursing home, in the hospital, in the laboratory, and in the eyes of his fellow (human) beings would be trashed as mere religious superstition. This is a one-way street. Human beings are then essentially worthless. Humans become animals, but animals do not become human. Once the unique moral worth of every human being is rejected there are no barriers to the most horrible and degrading treatment of human beings by other human beings that can be imagined. One may wonder when these anti-human “ethicists” will begin hiring apes as laboratory assistants.

Views such as those Singer and others of his persuasion profess may be shocking to many Americans, as well as to many of their fellow scientists, but not to the administrative hierarchy of Princeton University. There Professor Singer has been awarded the distinguished Ira W. De Camp Chair of Bioethics. What counts, then-Princeton President Harold Shapiro assured critics of the appointment, is “the power of the professor’s intellect and the quality of his or her scholarship and teaching.” This means that at Princeton, at least, Singer’s ideas are pretty high quality stuff. As though something the equivalent of “USDA Choice” had been stamped on his forehead to certify the contents therein.

Hustler magazine’s Larry Flynt writes publicly that his own first sexual experience was with a chicken. The climax of the affair (so to speak) for the chicken was reached when its head got chopped off. Flynt did not say whether he got the chicken’s consent for their modest orgy, and here we do have a good word for Professor Singer. He adamantly insists, not only as a matter of bioethics, but also as a matter of animal rights, that the subject animal cannot be killed as part of the act of any other animal’s (human presumably) sexual gratification. That would be unethical. Imagine a society populated by Singers and Flynts. Would any act be held unthinkable? But if the world were all Singers and Flynts perhaps it wouldn’t matter.

Kids and Pigs

Wesley J. Smith in his book Culture of Death: The Assault on Medical Ethics in America reveals how such ideas as those of Singer and “Dr. Death” Kevorkian have become commonplace in departments of philosophy and ethics. For Joseph Bottum, books and arts editor for The Weekly Standard, Smith’s book reveals a world of “well-rewarded establishment figures” at odds with their society. Smith cites the treatment that patients in hospitals or nursing homes already encounter today. A man is forced to call 911 from his hospital bed when the staff declares him beyond help and denies further treatment. An elderly woman in a nursing home dies when her medication is withdrawn because she has been declared not worth the cost of her care. Patients are simply left with no means to continue living when their doctors determine that their lives are “effectively over.”

Terri Schiavo, supposedly brain dead and in a coma for many years, was left for days to die of dehydration and starvation when her feeding tubes were pulled by order of her husband. He had fathered children by another woman during Terri’s prolonged illness, and apparently wanted to get on with his new life. The tubes were withdrawn even though Terri’s parents believed she might recover and were pleading to be allowed to care for her at their own expense. Some have called her fate an act of premeditated murder. It was a spectacle that occurred under the sanction of both state and federal courts at the highest levels.

What would the reaction to such an event have been had Terri been a black or a homosexual? Suppose a prisoner on death row were to be “executed” by deprivation of food and water for days on end until he died. Might the same courts that sanctioned Terri’s death have found such a prisoner’s fate to be “cruel and unusual punishment” prohibited by the Eighth Amendment to the Constitution? Well, these days we can’t be sure.

David Gelernter, editor and a national fellow at the American Enterprise Institute, sees the killing of Terri Schiavo as part of a concerted war of attrition against Judeo-Christian morality. Most would agree, Gelernter observes, that innocent human life must not be taken “unless.…” He sees the “unless” exceptions as relating first to abortion, and then expanding to the terminally ill, to those suffering intense pain, to those whose lives are judged “useless,” and so on. This progression will continue, Gelernter fears, until the accumulation of exceptions “will strangle humane society.” The effort required in such a case as Terri’s to attempt to save her from her executioners is such that time is on the side of the “ethicists.” If one victim lives, there is always the next. The executioners, Gelernter reminds us, “have all the patience in the world, and all the patients.”

These events follow from what some bioethicists call a Futile Care Theory. That theory would allow doctors to terminate life-extending medical treatment even when the patient and the family wish treatment to continue. Under such a regime a seriously ill patient, especially if elderly, would not know whether to consider the man or woman with the stethoscope to be the doctor or the executioner. These ethicists also advocate a “duty to die” for the elderly rather than have them claim valuable medical resources to stay alive. Do we know how many Dr. Kevorkians are out there practicing medicine?

Wesley Smith warns that once the “odious notion” that some of us are better than others reaches a certain legitimacy there may be no return. He predicts that forces will then have been set in motion that drive society toward the abyss with irresistible momentum. Smith asserts that there is a “moral equivalence” between Peter Singer’s philosophy and the killing of human experimental subjects by the Nazi doctors in Germany.

As a bioethicist Singer says that to consider the human species as superior in any way to any other species is “speciesism.” That’s one of the “isms,” he argues, that inhibit the development of advanced ethical theory. The idea that humankind is of a higher order of being, and so, for example, ought not to carry on sexual correspondence with farm animals Singer would dismiss as an example of the speciesism that he deplores. For Singer the taboo against people having sex with furry or feathery creatures is valid only to the extent that it requires the animal’s consent. Singer does not make quite clear how the animal is to express its consent, but we can rest assured that anyone with his ethical sensitivity in these matters would know.

Singer’s opinion that a baby (human) is of less value than a pig may or may not be meant to imply a sexual preference. As his thought advances Prof. Singer finds no objection to conception for the sole purpose of producing spare parts for an older child. Nor would he object to wholesale conception for the purpose of procuring and selling the resulting plenitude of spare parts.

English author and social critic C. S. Lewis foresees a “race of conditioners” that could “cut out posterity in what shape they please.” This would include experiments to change both physical and mental capacities and characteristics, or to fashion men and women for singular tasks as automatons with no complexity of human attributes. The orange-eyed fiends of science fiction would be the “conditioners” empowered to do the work. The wielders of such awesome power, Lewis says, are not bad men, nor are their subjects necessarily unhappy men. “They are not men at all, they are artifacts.” These views, it must be said, are by no means representative of neuroscience as a whole. But ideas have consequences, and radical new thought can be infectious. The prophecy of Lewis is that man’s “final conquest” will prove to be “the abolition of Man.”

In the ongoing Civil War, destruction of moral values is of prime importance to the rebels. Once stripped of his dignity and worth, the “naked ape” is raw meat to be manipulated, exploited, or destroyed at the impulse of those able to grasp and hold power. If bioethics should fail at the task there is neuroscience, blissfully confident in its ability to finish the job.

Neuroscience

If modernity is failing, Leon Kass, a physician, scientist, and educator at the University of Chicago, finds the reason in taking the partial truths of science as though they embodied the whole truth of existence. The wisdom of that observation is confirmed in the work of a cutting-edge contingent in the field of neuroscience. Do you think you are a unique individual, capable of plotting your own course in life? Do you have a free will, a sense of responsibility, maybe even a soul? No. This cult of neuroscientists maintains that the physical encompasses all there is to life. There is nothing “metaphysical” that needs to be taken into account by the scientific world. To these neuroscientists there is no such thing as the mind, the soul, or self-determination. What has been called the mind is simply physical particles lodged somewhere in the brain, though it is yet to be discovered just what those particles are and where they are located.

Human life, in this view, is nothing but a temporary collection of organic material destined for disintegration and non-being. That is hardly a new idea in itself. How many times have we heard the phrase, “Ashes to ashes and dust to dust?” But the neuroscientific formulation in its extreme is new in the sense that it eliminates any concept of an independent and therefore responsible self during that miraculous interval between the dust and the ashes. To the neuroscientist of this persuasion that temporary remission from nonexistence is emptied of any metaphysical connotation, as is most everything else having to do with the “brief candle” of human time.

The term metaphysics (from the Greek meta, beyond or along with, and physika, physical material) is often taken to refer to the esoteric, the mystical, the transcendental, or the religious aspects of human discourse. But metaphysics encompasses not only all that but also whatever else is not physical, which includes the language of daily life and such concepts as law and civilization. The concept of morality and resulting behavioral rules are metaphysical concepts. To say that there is no metaphysical reality is to argue there is no basis for law or rules of civilized behavior. It is to say that ideas and images created by the use of words have no reality because they are not physically real. Words, language, meaning, and context are not real “things,” say the neuroscientists. They are mere inventions that have no physical substance, therefore no content or consequence that need be recognized.

As John Derbyshire, author and National Review columnist, puts it, this branch of science now tells us, “The ‘I’…is an illusion.” That is to say there is no self, and since there is no self, there is no possibility of self-responsibility or self-determination. Neuroscientists say that what has been called the self is, like the mind, only a physical function of the brain not yet discovered. Derbyshire counters that, for all their self-assurance, while neuroscientists “are chasing the self through ever narrower and darker passageways of the brain, they have not caught it yet.” Nor does he think they ever will. Physicists, he points out, have been pursuing the nature of physical reality for much longer than neuroscientists have been probing the brain, but the nature of reality still eludes them.

Neuroscientists may chase the self—the soul they deny—down the dark alley of absolute materialism and grasp at the prey they see cowering in a corner. But when they open their fingers to examine the catch their hands are empty. They have caught nothing of the spirit needed to understand the heights and depths of human capacity operating in metaphysical terms. To them, music, art, poetry, novels, paintings of things the artist has never seen but only imagined: none of this would exist. Such products of the human brain, enormously complex though it is, could not have been conceived or fabricated according to this cult of neuroscience.

The ideas of Einstein and Newton are not physical things, but they have a great deal to do with how we manage physical things. Mathematics, so dear to the scientist, is not a physical thing. English novelist and critic Aldous Huxley, grandson of biologist Thomas Henry Huxley, makes the point that it is not possible to live without a metaphysics, without ideas and concepts beyond the merely physical. The choice, he says, is not between adopting a metaphysics and not adopting a metaphysics, but a choice between “a good metaphysics and a bad metaphysics.” Huxley distinguishes the good from the bad as the difference between a metaphysics “that corresponds reasonably closely with observed and inferred reality and one that doesn’t.”

Consider some of the questions that arise if neuroscience totally rejects metaphysics as its cutting edge purports to do. Is it likely that the transient organic material of which neuroscience says humans are made, if devoid of the metaphysical imagination, could have produced a Newton or an Einstein, a poem, a novel, or a symphony? Is it likely that neuroscientists will locate a niche in the brain the physical particles of which can be jiggered to replicate such creativity? And do they not communicate their bleak assessments to us in words that have no physical substance, and are therefore metaphysical? Have these neuroscientists, without admitting it, adopted a metaphysics that denies metaphysics? In any event they seem to have severed the ties to their own humanity. Professor Thomas Hibbs dismisses the neuroscientists’ dismissal of metaphysics with the observation that no life would be worth living “predicated on [these] assumptions of neuroscience.” Common sense can sound pretty good at times. But it may not prevail.

A 2003 editorial in Nature Biotechnology argues that legislation regulating development of higher forms of life “should steer clear of moral and ethical definitions. We need to stick to rational and scientific benchmarks.” Hadley Arkes, professor of jurisprudence at Amherst College, compares that view to “ancient fallacies,” as he calls them, such as scientism. Arkes sees scientism as “the notion that science is a law unto itself.” Scientism in this view holds that science encompasses all of human living. Therefore, says Arkes, the search for scientific truth “is a good that must not be constrained by anything atavistic and moral.” Arkes points out that views such as those expressed in the quoted editorial, arguing against science being constrained by “moral and ethical definitions,” are themselves “a moral judgment in behalf of a research unconstrained by moral judgments.”

Neuroscience exemplifies an atmosphere in which some scientists at the leading edge live and breathe. Wesley J. Smith reports that scientists in the field of biotechnology have already given us a glimpse of how bizarre the experiments envisioned by Futurists might be, unrestrained by morals or ethics. This would include “creating animals with human brains, mixing animal DNA into human embryos, implanting a uterus into a man so he can give birth, just to name a few.” Psychotropic drugs provide a means to manipulate the consciousness, reprogram the memory, and remake the self.

Andrew Ferguson, a senior editor at The Weekly Standard, observes that, while neuroscience can tell us that such things can be done, it cannot say whether they should be done. But neuroscience says we may not look into that question, that advances in science are beyond the competence of laypersons to evaluate or judge. Author Eric Cohen takes the matter a step further in his book In the Shadow of Progress: Being Human in the Age of Technology. Cohen, an adjunct fellow at the Ethics and Public Policy Center, relates a new alliance between the counterculture and the culture of science and technology.

While the two cultures had seemed to disagree on such matters as the machine vs. the spirit; rational restraint vs. Dionysian indulgence; or gradual progress vs. spontaneous liberation, they have now found common ground. The common ground, as Cohen sees it, is “the belief that human limits should be overcome, taboos are anathema, and human shame is an illusion.” The key to the new union is that “no knowledge or experience should be off limits.” As the Dutch have determined in sexual matters, “If it can be done it should not be prohibited.”

A recently formed political organization of scientists calls itself Scientists and Engineers for America, or SEA. According to statements on its website www.setora.org and elsewhere SEA holds that there are areas of science in which there is no more room for debate. Findings are absolute, accurate, and unassailable. That position is about as far from the assumptions of open inquiry and honest reporting required by the scientific method as could be imagined. These “unassailable” findings, moreover, are not to be subject to public challenge or discussion by non-scientists on moral or ethical grounds, even by democratically constituted government. The neuroscientists would seem to qualify for the top of the list of those holding such views.

John Derbyshire cautions that the march of science may be unstoppable when it asserts, as it does, that “the self has yielded to the organism, morality to biology.” This, he perceives, “is the way the tide is running, fast and strong, in channels carved by science.” Novelist Tom Wolfe, versed in neuroscience, writes on the subject in an article titled, “Sorry, But Your Soul Just Died,” for Forbes ASAP. It is Wolfe’s apprehension that, “We now live in an age in which science is a court from which there is no appeal.”

Philosopher Russell Kirk answered the neuroscientists some years before their science had been exploited to its present level. Kirk speaks of “the strange human faculty—inexplicable if men are assumed to have an animal nature only—of discerning greatness, justice, and order, beyond the bars of appetite and self-interest.” Despite what its practitioners say there is nothing in neuroscience to disprove that “strange human faculty,” no matter where certain stimuli to thought and creativity might occur in the brain. Prof. Elizabeth Phelps, laboratory director in the New York University Department of Psychology, remarks that just because it’s in the brain doesn’t mean it’s any less amenable to control according to moral or ethical standards. Danish philosopher Soren Kierkegaard warned in the mid-nineteenth century that “in the end, all corruption will come about as a consequence of the natural sciences.” Kierkegaard seems to have had a premonition that anticipated the neuroscientists and their vanishing humanity.

Scientific Counterculture

Still the neuroscientists hold to their dogma that you and I are no more free or self-determining individuals than are the contents of a rock hurtling through space, propelled by forces over which the rock’s contents have no control. The passionate legions of advanced science act as though they had already programmed themselves to their robotic vision of the future. There they are, ministering from the highest pulpits of their omniscient faith. From those scientific sanctuaries they would reach out to direct society as totally and brutally as any absolute despot ever did or hoped to do.

Add to this the merging of neuroscience with a counterculture that hates everything about American and Western culture. The result is a potent union of forces in mutual rejection of the values upon which American society is founded. It is a new and powerful army openly dedicated to the Civil War against America. Science, if these views are accepted, furnishes an underpinning for the most virulent nihilism. That would include the “population bomb” people who want to get rid of most of us, Friedrich Nietzsche and his joy in cruelty and violence, and the environmentalist exterminators who believe humanity to be a scourge of the Earth, a pollution source best done away with. These powerful forces set themselves apart from the principles of a democratic society in their arrogant belief that they know it all. To cap it off they insist that they must answer to no one who disagrees with them. These neuroscientists and their newly found counterculture friends create an environment in which anti-democratic and authoritarian sentiments thrive.

In addition to rejecting political or constitutional guidance, such a science rejects God, faith, and religion as well. To compensate for the absence of conventional faith, author and commentator Lee Harris finds that science has invented and placed in God’s vacant chair “its own ersatz god.” Harris describes this new god of science as “a blind and capricious universe into which accidental man has found himself inexplicably thrown.” This is a rather odd sort of “god,” and a peculiar sort of faith. The irony is that faith of a very different kind has been an essential ingredient in the foundation and advance of science itself. That faith was, and still is, based in the Judeo-Christian tradition that nature is an ordered affair guided by established principles that can be discovered and put to use by mankind.

A similar perspective is offered by Herbert London, author, academic, and president of the Hudson Institute, in his book America’s Secular Challenge: The Rise of a New National Religion. The challenge London poses is a secular religion in which “the precept of an ethical structure whose genesis and moral authority are external to man” has been removed. This leaves man with “a pernicious relativism of his own making,” or with “a cold, all-encompassing scientism” that is unable to answer man’s questions concerning his own making and place in the universe. Midge Decter, author and commentator, writes that London’s book “graphically illustrates” that increasing secularism in this nation is aiming “to destroy moral responsibility.”

The most “advanced” neuroscientific position reveals a powerful leading edge of the scientific community creating an atmosphere significantly more toxic than the corruption of science we spoke of in the preceding chapter. The doctrines of neuroscience establish, whether willfully or not, a new front in the Civil War against America. As Leon Kass, a leading authority on medical ethics, suggests, we allow too little time to pause and consider where science and the technological state might be heading.

At the same time John Derbyshire cautions against dismissing the whole of science upon which so much of Western civilization is founded and continues to depend. It is true that when objective scientific analysis gives way to arbitrary power the Civil War advances more easily. Still, science retains what Derbyshire calls a “core magisterium” that “we can and do trust” even while at the periphery there are those in regard to whom it is best “to withhold trust … until the smoke of battle has cleared.”

VI. Choice Weapons
15. Words
Mutilated Meaning

Democracy and the democratic formation of public policy cannot work unless we all speak the same language. This requires not only English, but English based on a commonly accepted vocabulary: words the meaning of which is agreed upon. This the rebels of the Civil War against America know very well. To succeed they must mutilate the words and expropriate the concepts that define freedom. Who captures the words wins the war. This chapter and the following three chapters consider whether Americans are at the point of losing a common language and the common concepts formed in language, including the “language” of the arts.

The American founders employed words of the highest order to think through the possibilities that faced them, and to form a governing document to embody their work. The result was the words that form the Constitution of the United States of America. The founders were equally precise and eloquent in writing the Declaration of Independence. The words they used to persuade the Colonies to join in establishing a free government for themselves include the idea that all men are created equal, and have a God given right to “life, liberty, and the pursuit of happiness.”

Abraham Lincoln in his Gettysburg Address invoked an accepted structure of patriotism and dedication to assure that “government of the people, by the people, and for the people shall not perish from this earth.” In the Civil Rights Movement of Rev. Martin Luther King, Jr., American concepts of freedom and equality were powerful indictments hurled against inequality, segregation, and discrimination. Rev. King’s battle was to see a “colorblind society” in which the rights of all Americans are applied equally. “We shall overcome” was a dream coming true. The cry for civil rights raised by Rev. King was a just and articulate invocation of words representing the best of America to correct some of the worst of America.

The word “republic” is generally understood to refer at a minimum to a governmental arrangement based on law, acting as a guardian of property and personal rights, and in which the people governed have a voice. When the Soviet Union was established it was careful to label itself a Union of Soviet Socialist Republics. The even bloodier regime of Mao Tse-tung and his successors calls itself the People’s Republic of China. As these brazenly un-republican systems scattered satellites across the globe the word “Republic” often appeared in their official nomenclature, frequently copying the Chinese form of “People’s Republic.” Tyranny likes to drape its naked power in a cloak of decency.

Always lurking at the perimeter of freedom is not only the tyrant but also the softer-spoken regulator, public relations manager, school administrator, intellectual, news editor, professor, lawyer, and teacher. These are people skilled at deploying words and phrases, and talented at remodeling words to suit their purpose: bully words; coercive words; killing words. Raise the question whether blacks may be responsible for some of their own complaints and you are a racist. Try to analyze the effect of homosexual practices on marriage, the family, and society, or even on homosexuals themselves, and you are a homophobic bigot. Suggest a discussion on the relative achievement—intellectual, athletic, artistic, or otherwise—of various racial or ethnic categories and you are a bigoted racial profiler. Racist. Bigot. Homophobe. The PC police are on guard as well against speech forbidden because it is sexist, ageist, sizeist, or speciesist.

These are buzzword hand grenades thrown against free speech to evade or suppress discussion without engaging the merits of the subject. The reflexive use of such words acts to conceal, perhaps even from those who reflexively use them, some truth they would rather not have widely understood.

Media Cleansing

Words may serve as wings to new adventures of the mind or as sledgehammers of repression. Though the origin of the term “politically correct” is not clear, it is usually associated with authoritarian societies, and is used in some form or another in Marxist societies to enforce “correct” speech or thought. Use of the term in America nearly always signifies enforced conformity contrary to the freedom of speech the American Constitution provides. The requirement to be “politically correct” was adopted as the Second Commandment of the Civil War rebels to enforce their contempt for the concept of free speech. Though monitoring by the PC “police” so far is more socially coercive than physically or legally enforceable, accusatory “buzzwords” can be very effective in the meantime.

In the clamoring world of PC enforcers “sensitivity” to “minority rights” is required in a manner that journalist William McGowan in his book Coloring the News shows to be selective at best. A black doctor receives a glowing feature story in the New York Times to demonstrate how well diversity in medical school admissions is working out. A few years later the same doctor botches operations, causes a patient’s death, and loses his license to practice medicine. The Times says nothing.

When Matthew Shepard, a homosexual, is beaten, tied to a fence, and left to die in the Wyoming cold the Times quite rightly joins a national chorus condemning the act. Shortly thereafter an Arkansas white boy, not a member of any diversity clan, is brutally raped for hours by two homosexual men and left to die. The Times sees nothing in that story that qualifies as part of “All the News That’s Fit to Print,” as the Times proudly proclaims its policy to be. Nor did any of the other national media notice that incident. To report such an incident might take a certain glow off the glamour of multicultural diversity. Not to mention the offense to the homosexual community, despite the uncontradicted truth of the story.

Use of no words at all may at times be the most effective form of censorship. In a 2003 survey by the Center for the Advancement of Women it was found that a majority of American women are now pro-life. Hot news on a hot button issue? Not to ABC, NBC, and CBS television networks, none of which carried the story. They all opted out of reporting a significant event in one of the most crucial and difficult moral issues facing the country.

In Vermont a reporter for the Burlington Free Press, owned by Gannett, is fired for saying that allowing only “people of color” to speak at a forum on racism is “reverse racism.” In the Free Press case the New York Times informed its readers that in its view assimilation of racial and ethnic groups into American culture is now “seen as a dated, even racist concept.” For the “Paper of Record” it seems that “assimilation” has become a dirty word, a denial of some unarticulated right to separateness that in other circumstances would surely cause the same paper to raise a howl of “apartheid!” The very concept of Americanism has been disparaged, if not rejected, by mutilation of the word “assimilation.” That same word once symbolized the remarkable amalgamation of races and creeds that has created America.

The scandal of pedophilic priests in the Catholic Church exposed a profound moral and sexual corruption in the perpetrators, and too often also by the way church authorities handled the matter. The entire scandal was exacerbated by the corruption of language used by the American media in reporting it. The facts are simple. Priests in significant numbers sodomized boys in their charge, apparently from pre-teens on up. The guilty priests are homosexuals. There are numerous homosexual priests in the American Catholic Church. Some seminaries have been run pretty much according to the standards of the original San Francisco “bathhouses.” The stated purpose of the “bathhouse” (a cleansing euphemism) is to invite multiple, indiscriminate, impersonal, and serial sexual indulgence by homosexual men.

These are the plain facts of the matter.
Or are they?

Hoover Institution research fellow Mary Eberstadt has collected some of the media reaction to these events. The New York Times assures its readers that, “It should be clear by now that this scandal is only incidentally about forcing sex on minors.” The Times does not clarify just what the scandal is about if it is not about “forcing sex on minors.” To the New Yorker the “big shocker” is not the abuse itself but “the coldly bureaucratic ‘handling’ of it” by the bishops. The New Republic takes a similar stance. The handling of the scandal by the bishops is a shocker, particularly the untroubled self-vindication of Cardinal Law of Boston, who was finally forced to resign. But priests sodomizing minors? Homosexual rape? You won’t see those words in major media reporting.

Liberal Catholic publications take a similar escape route. Psychiatrist and former Benedictine monk A. W. Richard Sipe reports that, “It’s not a gay problem; it’s a problem of irresponsible sexual behavior and the violation of boundaries.” Or, as the Catholic magazine Tablet sees it, “The problem is not the abusing priests’ homosexuality, but rather their immaturity and their abuse of power.” The president of the Catholic organization Dignity seeks to lay the matter to rest with the assurance that, “Homosexuality has nothing to do with it.” While much of the national media seems to agree, elsewhere some do notice that heterosexual men are not widely reported sodomizing boys in their charge. Or of “abusing” them as the common exculpatory term is used to soften hard reality. In Church doctrine sodomy is a sin, even among “consenting adults.” At law sodomizing a minor is a felony crime.

Dan Seligman, a contributing editor of Forbes magazine, wonders whether the media bosses behind such mind-bending use of words in their reporting of pedophilia in the Catholic Church really believe in what they are saying. Are they “Coloring the News,” as William McGowan demonstrates they do in other respects as well in his book of that title? Perhaps, Seligman speculates, the bosses fear lawsuits. Or do they dread being judged racists, bigots, or homophobes if they insist on objectivity in reporting news that might offend vested interests intolerant of criticism? Or, Seligman wonders, are the top editors simply fearful of standing up to “the new militants in the newsroom?” McGowan’s book, says Seligman, leaves the impression that media managers are “utterly sincere” in their claim to be doing the right thing in their reporting of such matters. They believe they are functioning objectively as news media should in a free society. “That,” says Seligman, “is the most depressing possibility of all.”

PC on the PC

Author Mark Goldblatt reports that he was working on his personal computer using Microsoft Word 2000 as his word processing software when he wanted a synonym for the word “fool.” The thesaurus provided by Word 2000 had only one offering: “trick.” That was it. Trying “idiot,” Goldblatt was told by Word 2000 that the word was “not found.” Nor were the words “goon,” “nincompoop,” “ninny,” “numbskull,” “nitwit,” “halfwit,” “dullard,” “dunce” or “dolt.” Trying “jerk” produced “yank.”

Goldblatt called a friend and asked what the thesaurus in his machine could come up with. For the friend “fool” produced not only the words Goldblatt couldn’t find, but “dunderhead” and “ignoramus” as well. He was also using Microsoft Word; but it was Word 97. It seems that since the issuance of Word 97 the hi-tech gurus at Microsoft had expanded their interest beyond chips, gigabytes, and teraflops to massage the vocabulary of those who use Microsoft Word. The words stored in Microsoft Word 2000, the software employed by millions of users on their PCs, would seem to have been subject to PC of quite another kind.

Wishing to report his concern directly to Bill Gates, Goldblatt called Microsoft and asked for Gates. Not surprisingly, instead of getting Gates he was shuttled around to various offices until finally Kate promised to look into it. After more delay an email from Kate revealed Microsoft’s approach to revising its spell checker, dictionary, and thesaurus. Kate explained that the purpose was not to “suggest” words that might have “offensive” uses or definitions. Each new release of Microsoft Word is updated “to reflect current social and cultural environments.”

Feeling the pain of those who might be “sensitive” to certain words, a spirit of insensitivity cleansing has descended upon the Gatesian world. Words considered by someone somewhere in the depths of the Microsoft Empire to be potentially offensive to someone else somewhere else have been chipped right out of the English language. Censorship? An inhibition on the scope of free speech? Of course not. Just getting in sync with the “current social and cultural environments.”

If your computer uses MS Word 97 consult its thesaurus for synonyms of the term “brainwashed.” Here is what is listed: “enlightened, refined, educated, humanized, cultured, domesticated, socialized, indoctrinated, civilized.” The term “brainwashed” was invented, of course, to describe intensive and coercive psychological indoctrination of dissenters in totalitarian societies. Such “reeducation” practices were often enhanced by brutal physical inducements as well. To brainwash was to “cleanse” the mind, to wash the brain cells clean of words or thoughts that might be used to oppose the regime, speak ill of the dictator, or harbor ideas of individual value or self-determination. But according to Word 97 the worst that happens to you if you get brainwashed is that you get “indoctrinated.” Other than that, hey! Brainwashing makes you “enlightened, refined, educated, humanized, cultured, domesticated [and] civilized.” Now try “brainwashed” in Word 2000. You will be informed that, “No results were found.” Has Microsoft brainwashed itself?

Those who conduct such an enterprise may truly believe they are being sensitive to the needs of others, and don’t want to offend anyone. They think they are devoutly following the new Second Commandment (whether or not they have heard of it) “Be Not Judgmental.” They do not seem to consider that in selecting words to delete from use they are not only shrinking the English language, they are shrinking the minds of those who use their product, and their own minds as well. They don’t want to think bad thoughts, or provide the users of their product with the means to wound or offend others. The result is to expunge from existence a part of reality that must be expressed in words to understand the world in which we live and those who inhabit it. Words cultivated over the centuries to enrich human insight and understanding vanish with a click of the delete key, and in the world that follows Microsoftian massaging, no longer exist.

Try to imagine a language in which anyone who has anything to do with using words could obliterate all words that offend him or her. Then let them delete also the words that he or she imagines might offend someone else. Word cleansing requires literary cleansing, drama cleansing, history cleansing, stand-up comic cleansing— cleanliness everywhere. And minds that have never known the lost words will have become too withered to notice or to care how clean they have been scrubbed. Our capacity to think about our environment and the people in it, including the worst of us, is diminished. We shall presently be reduced to a vocabulary like that of the college woman drawn into serial drunken “hookups” for the night. She knew the arrangement was wrong, but could describe what was wrong with it no more articulately than to say it was “all icky.” Gates only knows what future battalions of words will be clicked into eternity under the quick fingers of subsequent Microsoftian word cleaners. Be grateful that the wonderful and remarkably rich English language is still safe, and vital as ever, in the honest black on white pages of Roget’s Thesaurus. Keep it near your computer if you don’t want your own mind cleansed and diminished by dot-com nerds operating far above their pay grade.

A Common Tongue?

The power of the English language to bond a polyglot population into one people whom we call, and who call themselves Americans, has been crucial to the success of this country. Melik Kaylan, a senior editor at Forbes.com, looks at developing trends in the English language as America becomes more heavily ethnic and diverse. He foresees that English will still dominate domestically, and will continue to be used in foreign trade and diplomacy. The question is, what kind of English? Kaylan reminds us that multiethnic societies have historically kept their unity through a “central anchoring language.” Latin played that role in building and maintaining the Roman Empire, as did English in the development of the British Empire. Degeneration of the Latin language at its periphery hastened the Roman Empire’s dissolution in the Middle Ages. The more widely Latin spread, Kaylan notes, “the thinner it became.” Eventually everyday Latin dwindled to “a series of demotic dialects,” shorn of the richness of classical Latin that survived only in courts and monasteries.

A similar dilution of English seems to be occurring today, due in large part to increasing dominance of the spoken over the written word. Everyday spoken language tends to be ragged and not always fully articulate. But until recently there was a recognized “literary tongue” in the written language. In Anglo-American culture this extended at least from Shakespeare and Milton to the Federalist Papers, and the beautifully crafted histories of Winston Churchill or the novels of William Faulkner. Former Berkeley professor of linguistics John McWhorter perceives that an American society that relies on the spoken rather than the written word is a society that “marginalizes extended, reflective argument.” When the sound bite is the substitute, McWhorter fears, “The implications for an informed citizenry are dire.” This without even considering the effects email and the Internet have on the language. Language designed for instantaneous consumption is not likely to produce much poetry. More likely to appear are specimens of “txt msgs” such as “C U B4 U go” ending with a sidewise :-)

The eighteenth century English writer and critic Samuel Johnson, known as Dr. Johnson in the Scottish lawyer James Boswell’s biography of him, called language a living heritage. It is a conduit of a people’s memory that links back to the finest thoughts of its history and tradition. We are sacrificing that memory, Melik Kaylan says, when the complexity of English is lost to multilingualism encouraged by multiculturalism.

This occurs first in the cities, some of which are destined to become majority non-English. The effect slowly spreads to the whole of American culture as thinner and thinner English tends toward the kind of “demotic dialects” that hastened the collapse of Rome. As in Rome there may also develop a “mandarin class.” These would be lawyers, interpreters, academics, and mediators of various sorts, tending to form new divisions of class and authority as the complexity and power of the language withers away in the general population. This sort of development fits neatly into the revolutionary package of the Civil War.

The trend is accelerated by the practice of bilingual education in the public school system as part of multiculturalism. The bilingual idea is to reinforce the immigrant’s attachment to his or her native language, culture, and homeland. English is taught as a “second language,” just the opposite of promoting a “central anchoring language” for immigrants in their new culture. English as a second language cuts new Americans off from access to the “finest thoughts” of their adopted land by denying them the richness of its language and the literature written in that language.

Tinkering with words may be all it takes to erase the image of civilization.
16. Minds
The Younger the Better

“Education is the motor-force of revolution.” So says William Ayers, distinguished professor of education at the University of Illinois at Chicago. In 2008 Ayers was elected vice president for curriculum of the American Educational Research Association, the largest organization of education professors and researchers in the country. Ayers is also a 1960s co-founder of the Weather Underground terrorists, with whom he participated in bombing plots against the U.S. Capitol and other government buildings. He and his wife Bernardine Dohrn hid out for most of the 1970s to avoid terrorist charges against them. They remain under investigation for a police station bombing that killed Sgt. Brian V. McDonnell. Ayers, still of the totalitarian left, says he would do it all over again. He still hates America but has turned from violence to vocabulary, and traded bombs for “social justice.” He remains highly regarded and influential in education circles.

Phyllis Schlafly, columnist, attorney, and author, recalls in an article for The Washington Times that Ayers’ ideas became popular among the students of the 1960s college uprisings. The radicals of those days gradually worked their way through the education system to become tenured professors at colleges and universities across the country. They dominate the teachers colleges. From those influential positions they have turned the mission of the colleges that teach teachers away from the study of history, civics, and other basics to indoctrination in “social justice” as the “motor-force of revolution.”

Schlafly explains that to Ayers and his like social justice means Marxian redistribution of wealth, high taxation, maximum government regulation, and most of the rest of the socialist agenda to which those such as Ayers remain dedicated. The public schools, established to transmit the American heritage, as well as the basics of essential disciplines, have been captured by the academic legions of the Civil War. Students are indoctrinated to distrust and hate their American heritage before they have developed either the knowledge or capacity for questioning or resistance.

In a Seattle public school children had been furnished with Lego blocks out of which they could build almost anything they might imagine. One class built an entire little village, with houses, shopping centers, stores, and all the rest. Then two students got into an argument about which one owned one of the buildings. Red Flag! Teachers descended upon the children to explain their error. They were speaking in terms of private ownership and the capitalist system, whereas they should be preparing for a communal and sharing world. Zip went the Lego blocks. Out the door. The students complained, the parents complained, and the teachers allowed the Lego blocks to be brought back. But only on certain conditions. One condition was that buildings are to be owned by designated groups of people, or to be property of the entire community. No individual can own anything. And all the houses are to be of the same design and size.

Any vestige of individuality, free enterprise, competition, creativity, or imagination is to be battered right out of these kids’ heads. Particularly subversive is the idea that an individual and his family might own a home. The Seattle school children are not to be contaminated by the great tradition of Anglo-American law that “A man’s home is his castle” safe against government interference. The contrary indoctrination is paid for by the working families of the city of Seattle, at least some of whom might have something else in mind for their children’s education. Too few parents know what goes on in our “public” schools. Dr. Laura Schlessinger relates a conversation with her grown son who told her of having been warned by his teachers during his school years not to tell parents about what they were doing in “sensitive” areas. The reasons for this, and it is a common policy, become clear when we see what does go on there.

As a government monopoly the schools have little incentive to improve, invent, question, or reform. Heading a teacher union monopoly, union bosses have more interest in work rules, wages, and benefits for their teacher members than in educating children. In California a drive was underway to gather petitions for a statewide voucher initiative. If enacted the law would allow students to take their per capita share of public school funds and spend it at the best school they could find—public, private, or religious. Something like the GI Bill after World War II.

The president of the California Teachers Association proclaimed the initiative to be so “evil” that “it should never even be presented to the voters.” He seemed to imply they might be stupid enough to approve it. Asked whether he did not consider that sentiment somewhat undemocratic, he replied, “We would not think it’s ‘undemocratic’ to oppose voting on legalizing child prostitution.” Well. To the CTA, the union that represents California teachers under state law, voting for vouchers to get children out of wretched and failing public schools and into schools of their choice is the equivalent of sending them off to a pedophilic brothel. Do we know how many people with that sort of mindset are at work “educating” the children of this country?

The ideology of all too many teachers, protected by their government monopoly, is grounded in anti-American globalism, multiculturalism, and radical social change. Teaching students about their country and their heritage gets little attention, and most of that is negative rather than positive or patriotic. Forbes Magazine writer Peter Brimelow, in his book The Worm in the Apple, details how teacher unions, primarily the National Education Association, spend millions of dollars of their members’ dues to defeat efforts to implement school choice. The same determination applies against reforming requirements for certification or discipline of teachers. Power brokers at all levels of education actively oppose reform on principle, and regardless of the manifest need.

The NEA-affiliated California Teachers Association alone spends huge sums on incessant radio ads aimed primarily at ginning up support for smaller classes (more teacher union members), higher pay, and no outside “political” interference. That would interfere with the inside politics of their own highly politicized curriculum. The teacher unions not only resist any attempt to make teachers accountable, but also insist on tenure rules that make it virtually impossible to fire a teacher who is incompetent or ideologically anti-American. Do these arrangements have a negative impact on students? Former president of the American Federation of Teachers, Albert Shanker, wanted to know when students were going to start paying union dues.

The administrators of the education bureaucracy fear choice and competition even more than the teachers do. These “educrats” know better than anyone else, though they would never admit it, that most of the massive school bureaucracy is not merely useless, but an overbearing impediment to good education. In the typical American school system there are Superintendents, Assistant Superintendents, Principals, Assistant Principals, Vice Principals, Aides, Interns, Coordinators, Facilitators, and so on. Counselors circle like carrion birds, hovering to pick at the slightest whimper of any poor kid who likely as not would rather just be left alone for a while. He might even begin to figure out the rough spots in life for himself.

There was a time when in a typical grade school the Principal was the only administrator, who also often taught classes as well. The educrats who have since swarmed into the public schools at all levels have no function unless they issue rules and regulate something. The more educrats the more rules, the more rules the more frozen the system becomes. And the more frozen the system becomes the more resources are devoted to maintaining the freezer. Fewer resources, and even less motivation, are available to address the real needs of students and of the dwindling number of good teachers who remain in the classroom.

Linda Chavez, President of the Center for Equal Opportunity, and Daniel Gray, an authority on teacher unions, reveal in their book Betrayal the strategic aim and purpose of the teacher unions. As early as the 1970s National Education Association Executive Director Terry Herndon said it is “to tap the legal, political and economic power of the U.S. Congress.” The plan, Herndon elaborates, is to develop an organization with “sufficient clout” in the Congress “to reorder the priorities of the United States of America.” Changing those priorities is what the present Civil War, as do all civil wars, aims to do.

America’s priorities have been democracy and economic freedom, based on self-reliant, responsible citizens. Is it fair to assume that the NEA has something else in mind? The Lego block incident in the Seattle schools affords a vivid snapshot of what that might be. Could their aim be that the farther to the left the change, the better? The Chronicle of Higher Education reports similar developments in the colleges of education, charged with preparing the next generation of teachers for the public schools.

As reported in The American Enterprise, the Chronicle of Higher Education finds that prospective teacher applicants to colleges of education are asked key questions. The colleges want to know whether they “value social justice, acknowledge white privilege, and agree to be change agents in battling sexism, racism, and homophobia.” The “change agents” become “community organizers,” an occupation frequently relied upon as a qualification for President in the 2008 presidential election. Applicants’ evaluations for entrance to a college of education are scrutinized according to how devoted they say they are to the foregoing values. These schools and the teachers they graduate are dedicated to subjecting the nation to the onslaught of these particularly tenacious brigades of the Civil War. And the system they run is uniquely situated to indoctrinate young minds at their most pliable stage.
Professor Ayers’ “motor-force of revolution” is operating on all cylinders.

Language Police

Teacher unions and school administrators worry incessantly that what they consider insensitive, racist, sexist, chauvinistic, ethno-centric, or similar matter offensive to their reigning dogma may be oozing into the textbooks imposed upon children. These teachers and administrators work hard to control as closely as possible what goes into those textbooks. No heresy, no incorrect thought must be allowed to shine through. This requires “review” of textbooks for their political and social orientation.

New York University professor of education Diane Ravitch in her book The Language Police relates what happens when public school textbooks are sent to the sensitivity reviewers for scrutiny. The mandate given to these reviewers, says Ravitch, is to “eliminate, delete, remove, replace, revise—that is, censor—offensive material.” The author has included what she captions, “A Glossary of Banned Words, Usages, Stereotypes, and Topics” taken from various language cleansing sources. Here are some examples.

Textbooks must avoid exclusive reference to Judeo-Christian (Western) art or literature. That would be “ethnocentric.” The phrase, “He took it like a man” is prohibited. To portray a mother giving kisses and hot milk to a child at bedtime, or a father taking children on adventurous trips, is not allowed. Showing “people of color” as athletic, unemployed, or uneducated must be avoided. Showing Native Americans performing a rain dance or children “playing Indian” is taboo. So is showing Asian people as very intelligent or excellent scholars, or Chinese running laundries or restaurants. Images of Mexicans grinding corn or Hispanics as warm, expressive, emotional, or hot-tempered are to be avoided. The image of Jewish people as diamond cutters, doctors, dentists, lawyers, classical musicians, tailors, or shopkeepers is declared offensive. And don’t show older people as physically weak, helpless, dependent, senile, forgetful, or engaged in a life of leisure activities. Ravitch’s “Glossary” includes 32 pages in small type of these helpful admonitions to the language police in carrying out their duty to purify schoolbooks to acceptable standards.

To Gary Rosen, editor of Commentary, Ravitch’s book demonstrates that the villain of the language cleansing enterprise is the “multicultural left” and its advocacy of “diversity.” Diversity proponents say they work to protect the sensitivities of their “diverse” student body. At the same time they seem to feel it necessary to decrease the diversity inherent in the language their diverse student body reads and speaks. That the ability of students to perceive and describe other people, ideas, or events is diminished if the censors mangle their vocabulary does not appear to trouble these mentors of our young. Nor do they seem concerned that in so doing these ministers of sensitivity shrink the capacity of their flock for thoughtful analysis of their own lives and environment. Perhaps the minds of those who enable these programs are themselves so desiccated from constricting the minds of others that they don’t notice what is happening. Unfortunately, it seems more likely they know exactly what they are doing.

Ravitch relates the guidelines the New York State school system suggests to publishers in order to meet sensitivity requirements. Publishers are advised that it is often not necessary to refer to a person’s ancestry, disability, ethnicity, nationality, physical appearance, race, religion, sex, or sexuality in books of literature or history. What is then left, Ravitch asks, “to help us understand character, life circumstances and motives” of fictional or historical figures? What has resulted from these “diversity” requirements she says is “a bureaucratic system that removes all evidence of diversity.” In so doing this “reduces everyone to interchangeable beings whose differences we must not learn about.”

Erasing words is presented as advancing a “good cause.” It is to be pursued with tenacity until language purification becomes a natural and accepted routine. Sensitivity in practice sets up a bank of censors who don’t call themselves censors, but are prepared to serve that purpose when the word itself becomes officially banned to conceal its reality. That is what is most likely to occur, says Ravitch, so long as the material students are given to read is “strained through a sieve of political correctness.”

The concept of diversity is designed to suggest an amalgam of cultural, political, and personal richness. That “diversity” leads, instead, to enforced sameness, regimentation, and intolerance is not to be spoken. “Diversity is Conformity.” George Orwell would chuckle at that one. Classical scholar and author Tracy Lee Simmons foresees that we can anticipate only “a bland, homogenous ignorance” in the student exposed to such a regime. The evidence mounts that a bland and homogenous ignorance is precisely what the education establishment desires. Those fortunate enough to come from “intellectually ambitious households,” Simmons says, may survive as genuine persons.

The Civil War involvement in this is evident and often strident. Computer scientist and author David Gelernter laments that for the last generation or more our schools have been run “as if we were too sophisticated” to encourage children “to love your country.” Patriotism just isn’t quite chic to the masters of our more worldly socialist school system. It is dangerous. Language purification is an easy way to assure that students never know to begin with what a rich heritage their language embodies. When language is sufficiently purified there need be no concern about errant thoughts of patriotic allegiance to a national entity and its tradition of free institutions. The words needed to form such thoughts will have been condemned and executed.

George Orwell wrote an appendix to his black prophecy Nineteen Eighty-Four to explain how the language of “newspeak” would evolve. Newspeak would be introduced “partly by the invention of new words, but chiefly by eliminating undesirable words and by stripping such words as remained of unorthodox meanings.” The word “free,” for example, would continue to exist but with only a narrow usage allowed, such as, “The dog is free from lice.” Newspeak, Orwell explained, was designed “not to extend but to diminish the range of thought.” That purpose would be served by “cutting the choice of words down to a minimum.”

The duty of the language police is to clamp onto the minds of American students from first grade through the colleges and universities an iron vise of conformist thought. The constrained vistas that follow are called sensitivity and diversity, and passed off as education. Students subjected to this regimen are rigorously taught what to think. These schools would not dare to teach them how to think.

Purpose

In the clamor over the ills and cures (if any) of the public school system the central question is often missing: What is the purpose of educating children? What sort of citizens are we preparing? What sort of future society does the material given to students today imply? What sort of society is needed if the country is to survive? Is this something we can talk about? Not in the public schools if the educrats and the teacher unions have anything to say about it. They know what they are doing and don’t want to be interfered with. Harvard University professor of government Harvey Mansfield (There is a pool of residual sanity at Harvard) ties the wretched performance of elementary and secondary education to the attitudes and aims of the National Education Association.

In addition to its left-wing political agenda, the NEA also promotes education based on ideas such as self-esteem and self-actualization. As Mansfield demonstrates, that approach is related to the aim of the NEA to produce a certain kind of person. What the NEA wants, Mansfield says, is a nation of equals “averse to risk, competition, and conflict,” silent and submissive in the classroom. Each such product would be “simultaneously wrapped up in himself and compassionate to others.” Think rows of cloned heads, nodding in serene unison, the same half smile on every face. Could there be any greater harm to another person than to deprive him or her of the opportunity to find and develop the best that is within them? If education is to be reformed it is essential, Mansfield says, to get beyond “the self-comforting self.” Imagine the army in Iraq or Afghanistan, or wherever it may be required next, living up to the NEA’s model of a good citizen. Each soldier would be averse to “risk, competition, and conflict.”

What of the New York firefighters and police officers who responded on 9/11? Did they act according to the public school model that Mansfield describes? Recall the public school teachers whom columnist Mark Steyn observed on 9/11, shrieking against patriotism and denouncing America. Even on that day. These teachers publicly railed about displays of the American flag, and spat anti-patriotic venom against Americans who rallied to the country’s defense. This as smoke and ashes drifted across lower Manhattan. The firefighters and police who responded that day with courage and heroism, at the cost of many of their lives, had evidently not been subjected to an up-to-date education.

Was it the very model of heroism and devotion to duty of those firefighters and police that struck fear and loathing into those distraught teachers that day? Their concern, these teachers of the nation’s children openly complained, was that reaction to the terrorist attack might presage a national upsurge of patriotism. Did they fear their kids might no longer degrade their country among nations of the world as their teachers tell them to? Would they lose the lesson that this country is no different from any others, and should be “de-exceptionalized”? Were those screeching teachers concerned that from the events of 9/11 their charges might learn something about patriotism, achievement, courage, duty, and self-respect? That they might grow into something more than malleable NEA balls of clay?

During the liberation of Afghanistan a fourteen-year-old boy in a New Jersey middle school drew a picture of a Marine shooting a Taliban enemy. When school authorities saw the drawing they suspended the boy for five days as punishment for his misdeed. His mother came to school to question the basis for that action. (His father was unavailable due to service in the U. S. Navy in the Persian Gulf.) The child’s mother was told that such a drawing is “not the work of a normal mind.” Perhaps the lad could try again, and draw the Taliban shooting the Marine to see if that passes for the work of a “normal mind” in today’s lower education.

Abigail Thernstrom, senior fellow at the Manhattan Institute, and Harvard professor Stephan Thernstrom in their book No Excuses examine the school system from public to private across the educational spectrum. They conclude emphatically that for genuine educational advances to occur, especially for Hispanic and black children, the nation’s system of education must be “fundamentally altered, with real educational choice as part of the package.” Choice would include a genuine alternative to the anti-American, anti-capitalist dogma that pervades so much of the present socialist school system.

British writer and philosopher Roger Scruton holds that at the core of the unacceptable performance in America’s public school system there is a double fallacy. The first is that the emphasis has changed from teaching future teachers subject matter to teaching them the techniques of teaching. One teacher, questioned whether she was competent to teach arithmetic, responded indignantly, “I don’t teach arithmetic, I teach children.” For students in the colleges of education this situation, says Scruton, gives the ignorant student the advantage. The student with a passion for knowledge may become too bored with the dry mental fodder fed by the system to endure long enough to obtain his education “certificates,” and thus cannot be “certificated” to teach.

The second fallacy Scruton sees at the root of educational deficiency is the philosophy of John Dewey, an American psychologist, philosopher, and educator of the late nineteenth and early twentieth centuries. Dewey, heavily involved in the Progressive movement of his time, held that teaching should seek to encourage the child’s own self-expression. A child should be taught to “get in touch with himself.” Knowledge and learning must not be “forced” upon him. The child should be presented only material that is “stimulating to a mind like his.” This, by Dewey and his followers, was called “progressive” education. It is as though a child’s mind were to be frozen at the child’s level forevermore. Not to mention the mind of the “educator.”

A child is not to be “forced” to learn anything? Isn’t the point of education in a civilized society to pass on its history and its tradition to give growing children a framework into which to grow? Isn’t it the job of the teachers and schools to impart to the growing child the information and thought by which the child can become—well, civilized? Dewey and his current disciples would seem to prefer letting the young run their savage course. That’s pretty much what the kids in the inner cities do. Perhaps one day the education establishment will open its eyes. A Tea Party at the next local school board election might provide an incentive.

The two trends of which Scruton writes, child-centered education and indifference to substantive content, have been exacerbated by the “zeal of the egalitarians” who hate learning because real learning reveals differences in achievement levels. And that leads to value judgments about people and ideas. It might even lead to questions about the nature and purpose of public school education. You can see right there the subversive threat to the First Commandment of the Civil War: Be Not Judgmental. The system we call public education has long been turned away from an intelligent analysis of the purpose it is to serve, and sunk into sterile and debasing ideology.

Anti-Matter

Some thirty years ago English writer Malcolm Muggeridge warned that Western civilization was the first in history “to breed and indoctrinate” at public expense “the barbarians who will overthrow it.” He saw that the West was brainwashing its citizens to expect, even to welcome, that downfall, thus enforcing “the death-wish of the few” on the many. This is what Pope John Paul II calls a “culture of death.” The mayhem begins in the schools.

Columnist and author Mark Steyn characterizes grade school education as a “form of child abuse” that cuts off those of the upcoming generation from the inheritance of their culture. In the long run Steyn sees the “relativist mush” taught in grade schools as a threat to national security. In 2002 the Washington Post reported a survey showing that some 60 percent of the nation’s high school seniors lack even a basic knowledge of U.S. history.

Roger Scruton sees this John Dewey-inspired regimen as a “denial of history, traditional learning, and moral common sense.” Students subjected to that regimen are impoverished in spirit, and robbed of self-hood. Instead of cultivating a responsible self, students are taught to diminish themselves into “citizens of the world” before they can realize that a citizen of everywhere is a citizen of nowhere.

Scruton finds higher education also fostering a “culture of repudiation” to replace the concept of citizenship. He sees running through the humanities departments of American and European universities a concerted theme of “the illegitimacy of Western civilization.” Author and historian Arthur Herman terms the typical English-speaking university “a well-funded instrument for destroying traditional Western culture rather than preserving it.” Mark Falcoff, emeritus scholar at the American Enterprise Institute, notes that portraying the United States of America as “sinister, hypocritical, imperialistic, racist, ruthless and cruel” is common in the hate-America discourse of our universities. Any good the country might have done is presented as “entirely inadvertent and accidental.”

A result is a culture of repudiation that seeks through speech codes, anti-racism policies, sensitivity training, diversity programs, and similar coercive measures to eradicate loyalty to American culture, and to destroy the nation state that supports that culture. Without the support of the nation state dedicated to freedom, and operating within stated constitutional rights and judicially enforced guarantees of liberty, there is no liberty. There are no institutions to protect freedom, only the “relativist mush” of internationalism of which Steyn speaks.

The de-Americanization of American school students, immigrant and native alike, is illustrated by comments of high school graduates collected by educator and commentator Kay S. Hymowitz. Being an American, these students say, is “not very special.” Being an American citizen is “not very important.” Since everybody is a citizen “it shouldn’t mean nothing.” “I don’t want to be a citizen. It’s stupid to me.” Lost to these students are the ideals expressed by a judge of the Seventh Circuit Court of Appeals in a 1992 case: “Patriotism is an effort by the state to promote its own survival,” to implant the values that justify its existence, and “to transmit those virtues and values.” The Civil War has changed all that.

The Intercollegiate Studies Institute recently released a study of some 40,000 college seniors in 50 top American colleges and universities. The study tested their knowledge of basic facts about the American government and economy. At none of these institutions did senior students average better than a D+ (a score of 69 out of a possible 100), and only 14 scored in the D range of 60-69. Eighteen scored in the 50-59 range, solid Fs, while the remainder rated, one must suppose, something like “Super F!” Three scored in the 30s. The more prestigious schools did no better than lower-rated institutions.

Victor Davis Hanson is professor emeritus of classics at Fresno State University, one of 22 campuses in the California State University system of some 400,000 students. It is the world’s largest public university system, and exists alongside the separate University of California system. Hanson found in his 20 years of experience in the State University system that it fails to emphasize grammar, composition skills, oral presentation, history, literature, music, or art. Students are not taught to analyze their own ideas or to defend them in a logical manner. Nor are they required to memorize dates, facts, or concepts. They are taught that their consequent deficiencies in verbal and analytical ability “have little to do with lack of discipline, effort, or talent.” Such deficiencies, they are told, are attributable to pathological sources outside their control such as racism, gender bias, or public neglect.

Hanson has found that students educated under this regimen are expected, after graduation, “to proselytize for this creed of entitlement, big government, and victimization.” That is, to become community organizers. Students subject to this sort of indoctrination are not able to stand erect as fully developed and self-sufficient adults, but must always bear the hunchbacked feeling of remaining something less. Given the Marxist model of command and control these teachers emulate, they are turning out what they might be proud to call a “lumpen proletariat” of the mind. Sodden, beaten down, and ready to be manipulated.

Alexis de Tocqueville foresees Americans becoming “a flock of timid and industrious animals, of which government is the shepherd.” Tocqueville, acute as his analysis is, does not perceive that even the industriousness might also fade away into indulgence. But his intuition of a sheepish longing for the government shepherd is prophetic. That is a victory the Civil War strives to achieve.

Roger Scruton traces these trends directly back to the campus eruptions of the 1960s, but with an interesting take on the reasons. Scruton suggests that the professors, the teachers in teacher colleges, and many of the rest of the intelligentsia believe and teach as they do simply because they don’t know anything else. “Truth, validity, and knowledge” were excised from the curricula of the sixties. Then, Scruton says, nonsensical ideas dressed up as liberation from the oppression of the past were introduced instead. That liberation, Scruton perceives, was not only liberation from “truth and reason,” but also “from the very thought of the human community as something more important than yourself.” M. D. Aeschliman terms the system that yields such products “a new form of barbarism.”

True education is exciting and exacting. The best and most remembered teachers, for those lucky enough to have had any, are the taskmasters and hard graders who made students test the best in themselves. Goals are set and the will to understand and to excel is encouraged.

Instead, millions of American students every year are being shrunk back toward their childhood, toward innocence and ignorance. They are being infected with the virus of a kind of anti-matter, destined to destroy that which it touches.

No apples for the wardens of this system, who were once called teachers.
17. The Arts
“Music”

If the destruction of words in the schools and elsewhere, and of the concepts constructed with words, is insufficient to undermine a civilization, there are always available the rhythm of music and the graphic images of the artist to lend assistance. As the Beatles were the idols of adolescents a generation ago, groups such as Ice T, Tupac Shakur, Sean “Puffy” Combs, Jay-Z, or Ludacris have been the icons of what serves today’s teenagers as music. The Beatles were musicians, and whatever their faults music was their signature. Today’s rap and hip-hop celebrities can scarcely be classified as musicians. Their signature is sex, violence, hatred of the police, and obscene degradation of women. The image presented is that of thugs and sexual predators lacking a connection to humanity.

There is nothing much new in this “music” but it is still there. And its message still mutilates the sights and sounds of freedom and civility, which augments the goals of Civil War. The sounds of these groups, and the sight of their performances, often obscene, advance the conditions of social degeneration which the Civil War requires.

Hoover Institution research fellow Shelby Steele, a leading black commentator on current events, links this degenerate genre to the myth of what he refers to as the “Bad Ni--er” of the slavery era. “The BN is unbound and contemptuous,” says Steele. He hates his condition, his master, his society, and pretty much everything else. So he takes vengeance against the master’s women to assert his feeling of total freedom. The indifference of the BN to human feeling makes him a “revolution incarnate.” This may not be surprising as a part of black history in America. The intriguing question is why a derivative from such fuming hatred appeals so strongly to American youth today, white as well as black. How does such violence and malice invoke passionate acceptance across racial and social lines, and generate rich commercial rewards?

Steele notes the decay in family life during the pubescence and adolescence of the younger generation. Many of today’s suburban white youth, living in the turmoil of a divided family, or no family at all, can identify with the slave’s anger and isolation. They can share the same need for false myths to substitute for a reality they cannot bear. Steele points out that today’s youth, so many lacking strong bonds within a home, find surrogate institutions lacking as well. The church and the school also fail to furnish loving care, moral guidance, or the will to strive for excellence. So it is that the white kids of suburbia are “oddly simpatico” with the black American experience.

These kids are not candidates for sentimentally romantic music. They are not drawn to the intricate and hauntingly beautiful blues creations that salved the pain of southern American blacks at one time. Nor will the vibrant Dixieland jazz that originated in the funeral processions of New Orleans fill the void. Not even a comparatively temperate middle class revolt such as the Beatles satisfies today’s lost generation. This generation craves something more directly related to its own experience, however difficult it might be for them to articulate just what that experience is. In the meantime they gravitate to such “music” as is available—hot, throbbing, visceral, nasty, and hateful.

Steele’s explanation of what these lost youngsters are attracted to is not comforting. He sees the appeal of the BN, at its deepest level, as indifference, or even immunity to feeling. Steele’s insight is that the BN did not want to feel the love and fear that would bind him to other people. To allow feelings would have left him open to accommodation to slavery for the sake of family and friends. He became the total rebel, fearing not even death, and could “slap a white man around” with no concern for the consequences sure to follow. Many of today’s isolated suburban kids fear that to acknowledge feeling would ensnare them in a similar way to the slavery of their parents’ failing marriage. And they are terrified of falling into the same trap.

An extreme expression of rebellion for the young male today is to revel in the vile and sadistic depiction of women rather than risk submission to the bonds of a woman’s tenderness and love. Women are nothing more than objects to be exploited in every way imaginable. They are the “ho’s” portrayed in the graphic “lyrics” of the rappers and hip-hoppers. Cool, man, cool. The very popularity of the term “cool” carries with it a significant degree of removal from the turbulence of human relationships in the wider population as well.

In her book Home-Alone America Hoover Institution fellow Mary Eberstadt analyzes what one of her chapter titles calls “The Primal Scream of Teenage Music.” Eberstadt examines what one commentator describes as “the fetid heavy-metal/hip hop swamp of profanity and misogyny” typical of this genre. She discovers that the plaintive wails of the teenager center on divorced parents or absent fathers. Speaking of Eberstadt’s book, Susie Currie, who describes herself as a mother and housewife in Maryland, offers such examples as Papa Roach’s “Broken Home,” Blink 182’s “Stay Together for the Kids,” or Snoop Doggy Dogg and Soulja Slim’s “Mamma Raised Me.” The subtitle of Eberstadt’s book is The Hidden Toll of Day Care, Behavioral Drugs, and Other Parental Substitutes.

Manhattan Institute senior fellow John McWhorter sees the most popular music of his fellow blacks in America as presenting a “grim, violent, misogynist, sybaritic black male archetype.” This creature appeals to young blacks as a “symbol of authenticity.” They want music that is real, and in their experience it is music of “the gutter” that they find most real. McWhorter notes the frequent murder of black rappers such as Camouflage, Freaky Tah, and Jam Master Jay—the violence of their music played out on the streets. In every talk he gives McWhorter gets at least one question about a “hip-hop-revolution,” meaning mob violence in the streets. McWhorter sees much of hip-hop and the rest as “staged alienation” by these rappers for the purpose of promoting their records. But there is nothing staged about the attraction their violent and hate filled recordings elicit.

A curious aspect of this hatred of women, and the absolute misogyny these groups thrust upon their audience, is that, at least until recently, it has gone without public protest in the age of Women’s Liberation. Women are depicted as bitches and “ho’s,” convenient receptacles of lust, never as persons with feeling or substance. Men who use women only to satisfy their erotic appetite become “satellite fathers” who only sporadically, if ever, support or visit their children by different women. Perhaps this genre could be characterized simply as a war against feeling. Feeling is bondage. To reject feeling is freedom. But it is freedom of a very different sort than American society was created to nurture and enjoy.

Rap and hip-hop come from an underclass where human connections are, Shelby Steele says, “fractured and impossible.” Any attempt to connect with another is filled with such pain and disappointment that human feeling itself must be rejected. In the end, as Steele sees it, the real problem is not so much rap’s “cartoonish bravado” as it is the conditions of life for which rap is used as compensation.

Philosopher and author Roger Scruton sees teenage music as rites of passage, but not in the normal sense of passage from adolescence to adulthood. The rites of passage of youth today “are not from adolescence but more deeply into it.” This is represented in their music which, Scruton finds, is an invitation to join the gang. It is a mark of their core membership, their clan or tribe. Any criticism of their music from outside the gang is not only offensive, but also a threat to the escape into music that compensates for the real relationship they cannot attain. The determination is to dive even more deeply into adolescence, and to protect the status thus achieved. Scruton provides a key to a better understanding of what so often seems like an entire culture of adolescence from college riots, to Woodstock, to the “clans” of campus intolerance and on beyond.

Teenagers have teenage rebellion hard wired in their makeup. Add to that the alienation such as Mary Eberstadt describes. Mix in the indoctrination kids get in school where they are trained in sex devoid of love, and to hate and reject the values that have held American society together. Should they, as it were, rebel against their own rebellion, they are told they must not judge what is affecting them. In their rebellious isolation they turn to an ultimate nihilism not unlike that of the BN Steele describes.

Steele paints in Dante-like images the fires of despair that circle ghetto living. In doing so he bares the pain of many a suburban kid’s alienation as well. Absorbing Steele’s insights is to sense and feel the horror of a modern Inferno. Might the kids, black and white, who live in it be prepared one day, after all, to transform the nihilism, hatred, and violence of the rap music that advertises their lives into a hip-hop reaction quite beyond any staged alienation? Is this music a prescient vision of the revelation gospel of the religion of Civil War? Is it a glimpse into the pits of hell on earth to which total disbelief in America would give birth?

Artists and Intellectuals

The rappers and hip-hoppers are not the only “artists” who inflame revolutionary despair and civil misbehavior. Alienation, anger, and ugliness growl at the viewer from the paintings, sculptures, and other works at the cutting edge of what passes as avant-garde art. We see a cross immersed in a bucket of urine, titled “Piss Christ.” We are enlightened by a photograph of one man standing over another man lying on the floor and urinating into his mouth. Over there is a wall hanging that can be viewed only by walking across an American flag. An image of the Mother of Christ is splashed with dung.

Works such as these are the creation of artists who say they are breaching the limitations on human potential, pushing the envelope, creating new artistic vistas. In their “cutting edge” creations what they are cutting are “the ties that bind,” as the Protestant hymn has it. These works are said to awaken the bourgeoisie, sluggish and spiritless, out of their stupor of complacency. The stated avant aim is to release the viewer to join the artist in expression of a new and more liberating freedom. Some aficionados suppose that artists of this persuasion are on the trail of perfection. They see artists of an earlier time as searching merely for the beautiful while the new art seeks to transcend mere beauty and capture the eternal sublime.

In their search for the sublime, says Carol Iannone, an editor at the journal of the National Association of Scholars, modern artists seek to confront a “limitless unknown” abstraction that transcends everyday experience. Historian and educator Jacques Barzun finds the results of that search to be not sublime but depressing. He notes that such artists are fond of claiming that art has no obligation to teach morality, while at the same time denouncing the Judeo-Christian moral structure. That sounds a bit like a newly invented moral judgment rejecting conventional morality. The search for the sublime justifies all in these artists’ eyes. The eighteenth century British statesman Edmund Burke, in an essay “On the Sublime and Beautiful,” also found excitement in contemplation of the Sublime. But after reflecting on the French Revolution, which sought the Sublime on earth only to end in butchery and tyranny, Burke settled for the more serene contemplation of the Beautiful.

The artist and the intellectual can often be found in close kinship in the “anti” world of disbelief. Like the artist, the intellectual seeks stimulation and excitement in a dull life. The twentieth century French philosopher Raymond Aron recognizes the dullness of civilized society. He characterizes civilization as “an elaborate invention… for abolishing fierce passions.” Many of Aron’s contemporaries, such as Simone de Beauvoir and Jean-Paul Sartre, the father of Existentialism, applied themselves to stir up those ancient passions. The totalitarian regimes of the mid-twentieth century showed them how to do it. For his part Aron recognizes that to oppose the established order, whatever it may be, is the “occupational disease” of most intellectuals.

Artists who claim to be guided by the principle “Art for art’s sake” are admonished not to paint or write with the primary intent to sell their product. (If the promotion catches on anyway and the stuff sells for mega-bucks, well, what can one do?) Artists attempt to express their feelings about society, themselves, and their convictions. When art proceeds from a mold of hatred and rejection of the society in which the artist lives, artistic expression tends to become banal soup cans or incomprehensible abstraction. “Perhaps,” Jacques Barzun suggests, it is energy and movement of an inventive and changing world that appears hostile to “the contemplative seekers of beauty and perfection.” Some artists come to believe that art for art’s sake is art for life’s sake, and that without art—that is, art such as theirs—existence would be unbearable. Socrates urged his fellow Athenians to consider the thought that life is worth living only in contemplation of beauty.

These and similar ideas about beauty are examined by Alexander Nehamas in his book Only a Promise of Happiness: The Place of Beauty in a World of Art. Socrates and his followers taught that contemplation of beauty could inspire a knowledge of goodness and truth. The Socratic philosophy in this view holds that life is impoverished when we deny beauty its place. It is interesting to turn these observations around and wonder if the ugliness in so much of current “art” does not gradually inspire the violence, intolerance, and hatred we see around us.

Avant-garde intellectuals and artists believe themselves to be among the brave forward battalions in an endless march toward the Sublime. Each would like to think of herself or himself as a new Rousseau, Nietzsche, Picasso, Gauguin, or Moliere, grasping for bold new creative structures. The reality is more the timid recluse seeking solace in disparaging his fellow beings. He finds comfort in cramped beliefs shared by a small coterie of like-minded ascetics, their works more morbid than sublime. The credo of rejection, Barzun finds, provides comfort and escape for those unwilling or unprepared “to wage the battle of life.” For Barzun a classic writer such as Rabelais leaves the reader “exhilarated,” as does seeing a Greek tragedy. The effect of the “advanced” artistic movements is to leave the mind “depressed,” as does a play like Death of a Salesman.

Mass Culture

A provocative commentary on popular culture springs from an unlikely source, Thomas S. Hibbs, professor of medieval philosophy at Boston College. His book Shows About Nothing examines the assumptions and effects of contemporary television and movie entertainment. What he finds is that mass culture shares the nihilistic qualities of the avant-garde, but only to a degree that Hibbs dubs “nihilism lite.” The shows he talks about contain nothing strident and often exhibit no emotion at all. Jonathan V. Last, online editor of The Weekly Standard, sees Hibbs’ book as expressing a realization that rampant nihilism induces the “flattening of man,” a “reduction and simplification of what it means to be human.” Last notes that both liberal and conservative critics of mass culture usually haven’t bothered to watch much of it. Prof. Hibbs, by contrast, rather sheepishly admits that he likes mass culture. As a result he knows it quite well.

The show Ally McBeal illustrates Hibbs’ analysis, not by the character of Ally, but by the world in which she lives. Similarly the film The Ice Storm allows, Hibbs says, “the banality of evil to find its finest contemporary expression.” Hibbs agrees that such works do subvert traditional values as conservatives claim. They also teach violence as liberals claim. But the real danger exposed by Hibbs is not that mass culture is an attack on specific moral or religious values. Nor is it an attack on the philosophies of either the Left or the Right. Mass culture is an assault on the very concept of morality and moral values. That is the essence of destruction the Civil War strives to achieve. These shows advance that aim without pretending to be part of the Civil War at all. Hibbs believes that we have already created a society beyond good and evil, as the nineteenth century German philosopher Friedrich Nietzsche predicted. Hibbs takes Seinfeld as the epitome of the sitcom, populated by those Nietzsche envisions as the last survivors after contemporary society has been destroyed. But far from being the supermen Nietzsche imagines, these men, “when faced with the great questions and ultimate issues of life,” says Hibbs, “blink and giggle.”

Hibbs presents a pair of Seinfeld episodes. One is a discussion of abortion, the other a discussion of whether people should be allowed to choose the toppings for their pizza. The two are presented at the same level of significance. Each discussion mocks both sides of the question. “Pizza, abortion,” Hibbs notes, “it’s all the same.” Hibbs finds the underlying motif of both to be “morality as farce.” There are no higher or lower values. As Jonathan V. Last sees it, Hibbs seems to argue that liberal democracies “breed nihilism through the abundance of comfort and safety.”

These shows seek neither the sublime nor the beautiful. They assume that art is about the ordinary, and the more banal the better. In that belief they follow from the soup cans of Andy Warhol or photo-like paintings of smokestacks or haystacks. Critic Terry Teachout describes Warhol’s art as “a burlesque of democracy, Tocqueville’s worst dream come to life.” After Warhol, says Teachout, anything could pass as art from hardcore porn to “gangsta rap.” The creator of such art is now assured that he will be seen as a “subversive innovator.” And the academics, Teachout adds, are “secretly relieved” not to have to give serious thought to real art. The haughty intellectual, contrary to his pretensions, in fact often exists on about the same level of seriousness as one of the characters in Hibbs’ popular shows.

Hibbs’ analysis reflects a population seeking, not to fulfill the vision the founders had in mind at Philadelphia in 1787, but drifting toward the abyss where the trails of cynicism, nihilism, and calculated destruction end up together.

Nouveau Avant-garde

No matter how depraved avant-garde art becomes, there is always the nouveau avant-garde waiting to drive art to new depths. It was some 85 or 90 years ago that the notion of ready-made “sculpture” was expanded to include the display of a common urinal. That advance surely brought culture to the masses. Not to mention the compliment implied to the plumbing class, which could thereafter make some claim to artistry. Those who may have rebelled against urinal art at least were spared for a time what was to follow, including contents thereof, and even more imaginative degradation than that. The Atlantis gallery in London was among the first havens of artistic creativity to discover that pieces of the human body may be considered works of art if properly presented. The Atlantis sought to demonstrate this in its exhibit Body Parts by Germany’s Gunter von Hagens, whose name sounds like a character escaped from a Wagnerian opera. But whereas Wagner dealt in metaphysical realms, von Hagens’ theme is meatier.

Von Hagens invented a method of “plasticizing” corpses so that their tissue is not only preserved, but also kept pliable enough to be worked into the designs he offers as his art. Body Parts displays 25 intact corpses in various poses, and some 175 body parts arranged to suit the artist’s esthetic sensibilities. One corpse has his head split open, holding his brains in his hands. Another has been flayed and is shown standing erect, holding a sheet of his own skin. Then there is the artistic apogee of a pregnant woman, her womb ripped open to display the fetus. The exhibit has since traveled to enlighten the world, though in San Francisco some of its elements began oozing liquid, suggesting the possibility of its early demise.

The Italian artist Piero Manzoni enjoyed his vino to excess. This caused his liver to cease performing its necessary functions at the age of 29, dragging the rest of his mortal corpus with it. Advised of his condition, Manzoni wished to leave something of himself in terra firma as evidence that he had been here and done something. Signor Manzoni struck upon a novel idea. He chose what might be called the essence of his worldly efforts, that which his bodily functions had worked for all it was worth, and which the artist was now willing to expel unto eternity. He canned 90 samples of the stuff as “an ironic statement” about the marketing of artworks, and labeled the production “merda d’artista.” He then announced that these precious memoirs were for sale. The Tate Gallery in London reportedly paid some 22,000 pounds sterling for a specimen, so to speak. The Pompidou Museum of Paris and, not to be caught lagging behind this trendy curve, the Museum of Modern Art in New York, each purchased their very own cans of the artist’s essence. Gallery goers eager for a moment in the presence of evanescent fame, but who may wish to share only the presence, but not the essence, of this unique exhibit, beware. At least half of the original 90 cans, unable to contain themselves, have exploded.

For a safer visit to similar artistic extremities visit London’s Tate Modern Gallery where Martin Creed’s Work No. 401 may be enjoyed. This work is a taped segment of nine minutes, running continuously, of the artist making flatulent sounds into a microphone. Just how the sounds originate is left to the viewer’s imagination. The art can (must if you are there) be heard throughout the Material Gestures wing of the gallery, immaterial though this particular offering might seem to be.

Writer Dinesh D’Souza thinks that modern art is accepted, not so much for what it is, but as an expression of the artist’s authenticity. Art becomes a specialized part of the broader quest for personal authentication in today’s drifting society. D’Souza finds people in all walks of life busy writing a book, attending drawing classes, or out painting landscapes on the beach or under a tree. Museums have a far easier time collecting donations than do local churches. So ubiquitous does D’Souza find adulation of art and the artist that he suggests art has replaced (conventional) religion as the most significant cultural institution in America. It is a culture of the new morality, which holds that human nature is basically good and society corrupt. The goodness of human nature must therefore assert itself against the constraints of an oppressive culture. The new morality is based on passion, feeling, and self-identification that seem to find their most congenial manifestation in the work of the artist. Whereas art was once admired for its fidelity to nature, D’Souza declares that art is now admired for its fidelity to the “‘inner nature’ of the artist.”

This suggests that there must be as many distinctive inner natures as there are artistic bodies to incorporate them, and so at least that many definitions of what art is valid and what is not. Who is to say whether a piece of art is faithful to the inner nature of the artist? The old conundrum was, does art imitate nature, or does nature imitate art? The question now becomes whether art imitates an infinite number of inner natures, and whether society, through its individual constituents, imitates art. This raises the image of society, like the particles in a nuclear event, flying apart in all directions with catastrophic effect.

The Critics

If it is difficult to caricature the works of the most avant of the modern garde, it is even more difficult to parody its critics. Consider a recent exhibition at the Williams College Museum of Art, “Prelude to a Nightmare: Art, Politics, and Hitler’s Early Years in Vienna, 1906-1913.” The exhibit consists of der Fuehrer’s watercolor paintings done in his Vienna years before he became der Fuehrer. According to the exhibit’s curator, Herr Hitler’s subsequent atrocities were motivated by an esthetic vision of the world he wished to create—his version of the Sublime. Hitler’s extermination of the Jews was in pursuit of his mission to “beautify the world” by getting rid of an element whose physical appearance offended him. Or, as a review in The New Yorker explained, Nazism was an artistic ambition “to remodel the world according to a certain taste.” And the world Herr Hitler created did bear a passable resemblance to his paintings.

Art and music critic Terry Teachout observes that for a moment after 9/11 sanity returned to the art world, and even the avant-garde critics returned to earth. The idea, he says, that the major artistic creations of Western culture serve as “unwitting capitalist tools” used to prop up a decadent Western establishment was in abeyance. The notion that a splotch of mud or feces on a wall commands the same respect as a Rembrandt self-portrait went into remission. Cellist Yo-Yo Ma, tenor Placido Domingo, the New York Philharmonic, and the Metropolitan Opera all gave special performances. Diana Krall’s hauntingly beautiful “The Look of Love” became a best-selling CD. Works that had been universally recognized as great works of art, until discarded by the militant warriors of rebellion, got a fresh reception. The nihilistic belief in the impossibility of beauty lost some of its social cachet.

Still, those who clamor that great art and literature are nothing but weapons of the powerful imposed on the powerless to prop up a decadent culture did not vanish. Teachout warned early on that if such forces appeared to be in temporary retreat “they’ll be back, angrier than ever, as soon as the smoke clears.” Their noses had been stuck in the same ideological trough for too long to sense a way out. And, as Teachout observes, in the universities they have both tenure and patience. And if in the end they don’t really believe in anything to replace that which they loathe, Teachout notes that “they disbelieve in it passionately.” Remission of disbelief after 9/11 did not last long. Perhaps the next such event will give birth to a more lasting revival, assuming we survive the event.

The sights and sounds of the arts as promoted by the critics demonstrate a kind of spontaneous dictatorship of political correctness. This has penetrated deeply into both elite and popular culture as the artistic brigades of Civil Warriors march on. Music, painting, and sculpture, as well as situation comedies and other material in the mass media, have all been infected. At its worst this is a culture of rejection, violence, and obscenity. It is a howling protest without any cause but to destroy. At best the values, or lack of values, this art world manifests amount to a nihilism of whimpering indifference.

If it’s alright to “Piss Christ” or spread dung over the Virgin Mary, why not throw a little over Mona Lisa or splatter the Sistine Chapel? How about some explicit graffiti scratched into the marble of the Winged Victory at the head of the grand staircase in the Louvre? A new post-post modern Art of Defacement might come to be recognized. When that which has all the depth of juvenile protest is accepted as art there are consequences. Both the artist and those who think the stuff worth considering risk embalming their minds at the pubescent level. The possibility of a society of perpetual pubescence, and the artistic milieu that reflects acceptance of that condition, gnaws at the roots of society and generates a sense of foreboding.

At least one museum director understands all this. Philippe de Montebello, former Director at New York’s Metropolitan Museum of Art, believes there should be an aura of mystery, of magnificence, of wonder in a great museum. “I do everything I can,” says M. Montebello, to shape the Metropolitan Museum so that “when you walk up from 82nd Street into the great hall, it should take you out of the din of ordinary experience into something wondrous.”

Don’t look for merda d’artista at the Met.
18. Politics
Identity Politics

Concepts of diversity and multiculturalism were designed to aid minority groups designated as victims of discrimination or intolerance, most likely due to “white male dominance” in one guise or another. The groups so designated become captives of politics designed, not to assist them, but to control and manipulate them for political purposes. Multicultural and diversity leaders enflame their “victim” groups to dwell on their grievances and to demand retribution. The victims must proclaim fidelity to their separate cultural or ideological tribes against the general culture. The victim classes are exhorted to blame society for their own inadequacies to justify their claim to privileges or monetary compensation. This keeps glowing the angry coals of their carefully constructed victimhood. The resulting tribalism and cynicism is played out in identity politics.

Foreign-born captives of diverse ethnicities are taught that their native language comes first, and to consider English as secondary. This even though English is their best ladder to success. But such guidance is not accidental. The chiefs of the new tribalism don’t want their charges to succeed. If they did succeed, they would become self-directing individuals no longer subject to manipulation by the tribal chiefs.

Tribal leaders build a relationship of interdependence with the Democratic Party. The largest and most influential of these is the African-American community, led by such figures as Jesse Jackson and Al Sharpton. Hoover Institution research fellow Shelby Steele recognizes that identity politics might once have been a useful and positive program to help black people rise and assimilate. Now, from his own experience he finds that black identity politics has become a virtual prison of thought, a bar to genuine progress for African-American people. Steele characterizes tribal leadership as “apoplectic” over any program, such as welfare reform, that would encourage individual growth. Awakened and responsible individuals threaten the suffocating group identity required by the black masters to keep hold of the masses they control. The trap of black identity politics is squeezed tight by maligning any black who dares to reject identity requirements, and succeed on his own, as a betrayer of his people. Black tribal chiefs vilify accomplished black conservatives such as Ward Connerly, Clarence Thomas, Condoleezza Rice, or Steele himself. Black conservatives are charged with “heresy” for moving off their ideological plantation. They may even be told they are “not really black.” What makes you black today, Steele says, is not simply being black, or being a member of a black culture, “but a belief in this politics.” That is why Bill Clinton, as he established his post-presidential office in Harlem, could claim to be black. Steele cites blacks who stray from the mandates of identity politics as having been confined to a “special gulag” of ostracism and calumny. Speaking from personal experience, Steele grants that for a black person to read in the New York Times that he is not really black “is to be annihilated on some level.” Linda Chavez, president of the Center for Equal Opportunity, reports similar conditions within the Latino community.

Where identity politics reigns anyone, black or of any other tribe, who advocates such ideas as personal responsibility, individual initiative, hard work, and the rest of traditional American virtues is charged with fomenting a form of racism. Ironically those were once the qualities of black communities themselves in many American cities. It was the Democrats’ “Great Society” of welfare and single moms that destroyed black communities that had been cohesive and successful even under segregation. The greater irony is that black chiefs of identity politics still ardently persuade their tribe to thank the Democrats for doing this to them by regularly voting Democrat by over 90 percent.

Identity politics of whatever group, black, Hispanic, Native American, or other, rejects such accepted American values as family, faith, reason, and love of country. John Leo, a senior writer for U.S. News and World Report, in his book Incorrect Thoughts shows how this can happen. Leo cites a Los Angeles school case in which a student had marked “American” on a form asking for his ethnicity. He was reprimanded for not checking “Filipino” like his parents. The poor boy misunderstood the whole point of identity politics. He didn’t know that he was required to stay in his approved ethnic feedlot. He was unaware that in breaking out he would threaten the sense of victimization upon which group entitlements are based. The New York Times has gone so far as to label the idea of assimilating ethnic and racial groups into the American “melting pot” as “dated” and “racist.”

Tunku Varadarajan is a New York University professor of business, and himself a first generation immigrant to America from India. He notes the “seamless commonality” of grievance groups molded into identity politics. Whether it be lesbians, gays, cross dressers, Hispanics, blacks, or some other group, membership in such a group provides “an alternative route to power in American life.” This route to power Varadarajan finds is open exclusively to minorities and liberal whites. Shelby Steele comments that in this context for a minority member to think of himself “as an American and an individual is to lose power.”

Identity politics has taken on geographic characteristics in addition to minority, racial, and other bases. Gregory Rodriguez, a director at the New America Foundation, cites studies that show over the past 30 years the development of segregation by ideology. Many of the millions of Americans who move each year from one county to another choose counties and neighborhoods where people think and talk like they do. Conservatives and liberals gather together in their separate enclaves. This leaves fewer jurisdictions with a healthy mix of opinion and debate. The composition of the U.S. House of Representatives has changed accordingly. Moderates who were willing to listen to the arguments of both sides of an issue, and perhaps help fashion a compromise, comprised 37 percent of the House in the 1970s, but only 8 percent in 2005.

Now that the country is organized into “little reservations,” Rodriguez notes, we all vote together “with the members of our little tribes.” While we still speak of individualism and democratic ideals, Rodriguez fears that we practice neither. And today each group expects politicians to “carry our water” rather than striving for policies in the public interest. In the realignment of counties the reds get redder and the blues bluer. The small minorities in these ideologically divided communities often cease participating in political life altogether. Rodriguez finds that Americans who hold graduate degrees “live the most homogenous political lives” of all. Think faculty club lounges at almost any college or university today with liberal percentages in the high nineties.

Political Assassination

A harbinger of the ruthless and radical Democratic Party to come has been its treatment of Republican appointees to public office, especially the judiciary, when they have come before the U. S. Senate for confirmation. Judicial positions, it might seem obvious, should be filled by those who display a judicial temperament. That means candidates who show an ability to approach cases presented before them impartially. Disputes are to be judged on their merits, according to the facts of the case, and the relevant provisions of the Constitution or statutory law. Personal opinions or predilections, if any, should be scrupulously put aside.

But that is not how it works for Democrats in the U.S. Senate. To these Senators judicial temperament in a candidate is a disqualifying liability at his confirmation hearing. It is a ticket to insults, humiliation, character assassination, and rejection. Judicial impartiality is a threat to the very purpose of Democratic Senators, who do not believe in the constitutional purpose of an impartial judiciary. The purpose of these Senators is to rid the courts, especially the Supreme Court, of the very spirit of judicial impartiality. It is to establish and maintain a Supreme Court majority to sanction radical causes, and to create radical policy that could not be approved through the legislative process. Toward this end some of the nastiest, most reckless, and divisive politics in Washington is played out over judicial nominations.

The tactic of political assassination, perhaps more aptly called the politics of personal destruction, was illustrated at its most vicious in the bellwether case of Judge Robert Bork. Character assassination is most effective if initiated directly after a candidate is nominated, before the nominee has a chance to speak for himself, or for others to speak for him. That tactic was sharpened to perfection when President Ronald Reagan nominated Judge Robert Bork for a seat on the Supreme Court in 1987. Legal scholars with virtual unanimity recognized Bork’s extraordinary qualifications. These included his scholarly work and his distinguished record as a Judge on the District of Columbia Court of Appeals. Most observers of the Court expected an easy confirmation.

But Bork’s supporters did not understand the political stakes. Nor, in fact, did President Reagan. They did not realize that judicial appointments were a critical front in the Civil War against the American constitutional system and the restraints it places on judicial authority. Nor did they anticipate the unscrupulous tenacity of those who viewed Bork as a deadly enemy in that war.

When the nomination was announced Senator Edward Kennedy rose on the Senate floor to condemn “Robert Bork’s America.” That America, the Senator intoned, would be tormented by midnight police raids, segregated lunch counters, back alley abortions, and courthouse doors “locked on the fingers of millions of citizens.” Those charges were, to put it charitably, grossly untrue. His speech was necessary, the Senator later explained, to prevent those Senators recognizing the high qualifications of the nominee from signaling their early support. It was, he elucidated, “to hold them in their places” until what Bork’s opponents candidly described as a “war room” could be set up in the basement of the Senate Office Building where the confirmation hearings were to be held.

Those who supported Bork, including President Reagan, were wholly unprepared to counter the attack on Bork’s competence and character that was about to occur. In the ensuing weeks the theme of Kennedy’s lies (the charge that Judge Bork was morally insensitive and intellectually dishonest), newspaper and TV ads by left-wing opponents, polls based on loaded questions, and similar attacks were driven into the public consciousness like jungle drums beating out the cry for blood. False charges were repeated over and over and echoed by a sympathetic media looking for sensational revelations. Simple factual refutations received little notice, and the shrill lies at last seemed to be the truth. Senator Kennedy revealed later that numerous and repetitious phone calls had to be made to persuade (coerce) those who knew better to testify against Judge Bork. Favorable witnesses were threatened with treatment similar to that which Judge Bork was suffering if they did testify for him.

Few of the Democratic Senators who had been ready to confirm Bork on his qualifications had the courage to stand up against the blasts leveled against him, blasts that might well be turned against them should they support him. A public that knew little of judicial politics or constitutional law became convinced that Bork was the demon the Democrats accused him of being. So savage and irrelevant to Bork’s qualifications was the attack mounted against him that it led to the coining of a new verb: “to bork.” Those subject to the procedures used against Judge Bork get “borked.” The managers of the politics of personal destruction, the political assassins, had done their work: the nomination was defeated. And the constitutional rule of law, which must be preserved if the Civil War against America is to be defeated, received a very heavy blow.

Moral High Ground

Democratic smear campaigns against candidates for high office are, in their perverted way, efforts by the accusers to establish themselves as taking the moral high ground. This is an effort closely allied to the morality of guilt discussed in a previous chapter. By demonizing candidates they wish to defeat as unfit or incompetent, the attackers seek to make themselves by contrast seem morally virtuous. Similar attacks, equally venomous, occur in areas other than confirmation hearings as well, frequently regarding self-made blacks. It would seem reasonable to expect that bright and resourceful black individuals who have succeeded on their own merits would be held up as shining examples of black achievement. Not when the Civil War Democrats see them as a threat to the basis of identity politics. Ward Connerly, Walter Williams, Shelby Steele, Supreme Court Justice Clarence Thomas, and many other blacks prominent for their individual achievements have felt the lash of this unrelenting tactic.

The underlying reason for such intensity, says Hoover Institution senior fellow Shelby Steele, is that liberalism, as now practiced by the Democratic Party, has survived “decades past the credibility of its ideas.” The key element in the Democratic Party’s struggle to remain politically competitive is to maintain its hold on the ninety-some percent of the black vote the Party regularly gets. To do this the Party must claim, and seem to hold, the moral high ground. This requires a merciless pursuit of the identity politics of race as an indispensable weapon toward that end. The Party has survived, Steele observes, only through its having “captured black resentment” of past white injustice as the chief source of its power.

Black identity with the Democratic Party is maintained through a process that Steele in his book White Guilt makes abundantly, and to some embarrassingly, clear. Steele sees that when suppressed peoples, including blacks, are freed their problem is no longer suppression but the very freedom they have achieved. Freedom shows those newly released from oppression “their underdevelopment and their inability to compete as equals.” White power can then be charged as the cause of the felt inequality, and whites made to feel guilty for the plight of blacks.

Steele finds white guilt to have arisen from the commendable admission by white Americans beginning in the 1960s that they had practiced racist ways. The result has been a submersion of white moral authority into the pit of an “overwhelming sense of guilt” over racism of the past, guilt that has to be assuaged. The most obvious way to mitigate that guilt for those who feel it is to disassociate themselves from the racist past. This is done by creating and supporting social programs that are “ostentatiously remedial in purpose.” In reality, according to Steele, the hidden purpose of these programs is “pandering to the socialistic longings of minority leaders.” This requires that the “victims” upon whom black identity politics depend be kept in the holding pens of a new plantation of the mind.

As actual racism subsides it is necessary to invent new devices by which to maintain the façade of victimhood. Not to mention the fountain of white guilt that gushes forth the cash and the privileges demanded by the black masters of this new mental plantation. The subtitle of Steele’s book White Guilt is How Blacks and Whites Together Destroyed the Promise of the Civil Rights Era. As true racism subsides the charge of racism is plucked from the grave by inventing a “systemic” racism of “white privilege.” If actual racism is not apparent it has to be invented. The invention is that racism is so embedded in the system as to be immune from being erased or even clearly identified. Thus racism remains available to justify and sustain whatever new grievance rhetoric might be conceived. Anyone who disputes this is a racist!

Liberal whites earn their badge of moral legitimacy by accepting this hidden guilt and acceding to demands for amelioration, however false or mendacious the guilt or the demands. This Steele finds is necessary for “guilty” whites to sustain the moral superiority of being non-racist. Confessing their guilt and paying off its “victims” is the totem of their virtue. But that professed and unreal guilt may also be a mask to hide, even from themselves, a genuine guilt for what their gaudy public guilt has wrought upon so many millions of innocent black people.

Steele relates his being attacked by a typical self-professed “architect” of the Great Society legislation of President Lyndon Johnson. The man accused Steele of belittling the achievements of that and similar programs designed to assist blacks. The man angrily reminded Steele how grateful blacks had been when the Great Society programs were enacted. To that man Steele’s attacks on the results of the Great Society programs showed that Steele was no longer grateful for those programs enacted for the benefit of him and his fellow blacks. The man was furious at Steele’s ingratitude. To that man such ingratitude was an attack on his own badge of moral righteousness that he had won by supporting the Great Society programs to show that he was non-racist.

To Manhattan Institute scholar John McWhorter this incident demonstrates that the man was not interested in the welfare of those supposedly helped by the programs. His interest was “in their affirmation of himself as not a racist.” If such a man finds that blacks are no longer grateful for the programs he helped to design for their benefit, he sees his own precious credentials as being not a racist damaged or irrelevant. If blacks see instead the damage these programs have done to them, especially in the inner cities, the man’s moral superiority is destroyed. His compassionology is unveiled. His countenance of good feeling, his facade of friendly and virtuous intent, is revealed as only a facsimile of good will, a smile for the record.

Columnist Maureen Dowd attacked U.S. Supreme Court Justice Clarence Thomas on this issue for his dissent in a case upholding affirmative action enacted to benefit blacks. She accused Justice Thomas of “ingratitude” concerning policies designed by whites for the benefit of blacks. Shelby Steele not only condemns the column as “vile,” but also reveals the uncomfortable truth about the attitude behind Dowd’s column. That attitude is to throw blacks the bone of affirmative action “if you’ll just let us reduce you to your race so we can take moral authority for ‘helping’ you.” John McWhorter calls the Dowd column “the most repellant 700 words I have ever read.” Black leadership, Steele charges, “actually sells black dependency as a white opportunity for moral deliverance” by portraying their own people as “nearly helpless victims.” Steele recalls that in the days of segregation, “When they called you a nigger … at least they didn’t ask you to be grateful.”

Steele notes that Justice Thomas had to endure the “ignominious ordeal” of being “borked” in his own confirmation hearings because of his “lifelong struggle to be his own man.” Steele recalls that the mendacious “borking” of Thomas (unsuccessful in his case) was tacitly condoned by black organizations. That was because Thomas is from a group whose leadership is too insecure “to countenance this degree of individuality and personal responsibility.”

Black leaders and the politicians, both black and white, who manipulate this tainted power cannot afford to let go. When blacks move “beyond grievance” and begin to succeed “by dint of their own hard work,” Steele perceives, the entire grievance structure becomes redundant. Successful blacks, like anyone else, merge into society and become, at last, Americans first and blacks only incidentally. Then the political power the grievance chieftains have cultivated for the political far left evaporates. So black leaders in league with the Democratic Party continue to pose as the good-hearted benefactors of those whose opportunities they thwart, and whose capacities they imprison. Any black who succeeds off the victims’ plantation must be vilified and persecuted as a threat to their leaders’ pose of moral goodness. The leaders fear exposure for helping “victims” that the leaders in fact manufacture and indenture to serve themselves.

This type of identity politics is characterized by columnist Tunku Varadarajan as “part of a battle over moral terrain.” It is part of the Civil War against America that has succeeded in shifting the moral grounding of the nation considerably to the left of where it lay before the student uprisings of the nineteen sixties. The Civil War is designed always to make its work appear virtuous and its results expressive of a new normality. The true agenda is to impose a new morality upon the nation. This would include easy divorce, licentious sex, gay marriage, abortion, destruction of the family, driving religion from public discourse, denigration of the individual, extreme environmentalism, anti-human pseudo-science, destruction of the free market economy, and other ideas and programs discussed in the present book.

The National Education Association, a lynchpin in the structure of the Democratic Party, plays a key role in this battle for the moral high ground. The NEA understands that if young minds can be captured and held, then promoting its anti-American, statist plans will come easily after a generation or so.

The responsible individual, the essential base of a democracy, will have been “de-moralized” out of existence. Any opposing educational alternative will be defiled as something akin to “child prostitution,” as school vouchers were by the president of the California Teachers Association. The NEA will then be in position, as expressed by a former NEA president, “to reorder the priorities of the United States of America” in league with the Democratic Party and the other Civil War battalions.

Gertrude Himmelfarb, American scholar, social commentator, and prolific author, puts the moral issue in perspective in her book The Moral Imagination. The book demonstrates that subversion within the West has pitted the traditional moral imagination of the West against what David Gelernter calls “its dead opposite.” That would be rebellion against America and against the liberal imagination described by Lionel Trilling fifty years ago. The struggle to define the moral imagination, as Gelernter has observed, “is turning into full-fledged war.” Himmelfarb notes that Lionel Trilling found, even in the liberal imagination that existed half a century ago, the poisonous seeds of pagan ferocity. It is paganism that rules the new Civil War and defines its claim to the moral high ground. Himmelfarb finds nothing less than a drive toward supreme authority for those engaged in the “full-fledged war” she and Gelernter describe being waged against America and the West.

Political Collapse

The stability of the American political system has rested on the existence of two dominant parties of moderation. Republicans trace their heritage back to the Civil War, and the abolition of slavery by Abraham Lincoln’s Emancipation Proclamation of January 1, 1863. Republicans have been the more conservative and less friendly to big government intervention.

Democrats trace their heritage back to Thomas Jefferson, who also believed in limited government. The Party was then called Republican, or Democratic-Republican. After Andrew Jackson was elected President in 1828 the Party was called Democratic. At least since the time of Franklin Roosevelt’s New Deal in the 1930s Democrats have been much more inclined than have Republicans toward social experimentation and more intrusive government programs.

On the great issues of democracy, love of America, religion, patriotism, national defense, and human freedom the parties until recently had no basic disagreements. Despite bitterly fought elections, personal animosities, and the like, the two parties have lived together as what might be called non-identical twins, with more in common than not. Now the balance between the major parties has tilted off center, and a fundamental split, both ominous and permanent, seems to be occurring. The heart of the matter, David Gelernter points out, is that Republicans and Democrats used to agree at least on the core values. “Today,” he laments, “core values are exactly what they disagree about.”

The core constituency of the Democratic Party was once the conservative “solid South.” Now it is composed of such groups as radical environmentalists, feminists, multiculturalists, masters of victimology and identity politics, trial lawyers, a labor movement whose leaders promote socialism, and the blogs MoveOn.org and DailyKos. And over ninety percent of blacks remain bound and blindfolded by Democratic identity politics. This conglomerate of groups is joined and guided by wealthy money sources such as billionaire George Soros.

Soros includes among his litany of disaffection for America the idea that the war on terror is a “false metaphor” invented by President George W. Bush that should be “repudiated.” Authors David Horowitz and Richard Poe of the David Horowitz Freedom Center point out that Islamic terror and its mantra “Death to America” predate Mr. Bush by at least 20 years. Yet this “false metaphor” is the “considered wisdom” of a billionaire who, the authors say, “controls the purse strings of the Democratic Party.” The billionaire’s opinion on terror, and similar poses by others as well, might be written off as laughable eccentricities were they not so dangerous, if not suicidal.

Analyses of Democratic Party composition and tactics were offered by William Kristol, editor of The Weekly Standard, after the presidential elections of 2000 and 2004. His analyses reveal a landscape significantly different from what emerged after the 2008 election, both in Congress and elsewhere. In the two earlier elections Kristol includes among the powerful forces of the left in Congress the Teddy Kennedy wing in the Senate, probably a minority of the Democratic Senators. In the House of Representatives the hard left Nancy Pelosi wing was a majority of the Democrats in that chamber. After the Democratic sweep in 2008 Pelosi, as Speaker of the House, has sought from the beginning of her tenure to impose her radical “San Francisco values.” Important committee chairs have gone to the most radical of the Democratic members such as Barney Frank and Henry Waxman. With Barack Obama as President the Democrats in both houses have been drawn almost without exception into the Party’s radical circles. They have voted in lockstep with the Party line on nearly every trillion-dollar thriller.
Outside Congress Kristol evaluates the far left to include most Democratic grass-roots activists, the liberal columnists, the New York Times, and Hollywood. All these and similar groups, says Kristol, “hate conservatives with a passion that seems to burn brighter than their love of America.” The doctrinaire rebels of the hard left also have the support of numerous well-funded foundations and a majority of the intellectual community, both in the universities and the think tanks. These forces can rely on a public softened and conditioned by the coercion of political correctitude, and soaked in the general PC stance of the national media.

The resulting strategy is on display at the annual YearlyKos, named in recognition of the blog DailyKos. DailyKos is one of the more influential elements of the far left blogosphere that has named its collective self the “Netroots.”

The book Crashing the Gate: Netroots, Grassroots, and the Rise of People-Powered Politics, written by DailyKos founder Markos Moulitsas and Democratic strategist Jerome Armstrong, purports to set out the political philosophy of Netroots. Dean Barnett, who blogs at HughHewitt.com, has analyzed that philosophy. His conclusion is that, “[T]hey don’t have one. Seriously.” But if the Netroots don’t have a governing philosophy, they do have a potent tactical weapon, which Moulitsas describes simply as “winnerism.” If they don’t know what they stand for, they have no doubts about what they oppose. They hate America, Republicans, conservatism, free enterprise, the rule of law, and every limitation on government set forth in the Constitution. This core constituency is, with anger, cunning, and deception, dragging the party of Thomas Jefferson, Andrew Jackson, Franklin Roosevelt, Harry Truman, and John F. Kennedy toward an outright repudiation of the American heritage.

Assistant editorial page editor at the Wall Street Journal, Daniel Henninger, observes that in the 2008 election the Democrats’ oft stated intent to change the economic policies of “the last eight years” of the Bush administration is but a hint of their real intentions. Their real target is the economic and social policies of the last 200 years. The 2008 election, Henninger observes, was about “a long-term change in America’s idea of itself.” It is a change to mimic the European concept of a “social market economy.” What is left of the free market economy will be just enough productivity to feed a “deep welfare system.” Loss of social and political freedom is sure to follow.

America’s non-identical twins, the Democratic and Republican Parties, seem to have parted for good, and gone their separate ways. The Democratic Party, its leadership infiltrated by angry and dedicated partisans of the Civil War, is poised to abandon the American Dream outright for an attempt at authoritarian rule. The two-party system that was once the basis for dialogue necessary to conserve and cherish American civilization is broken. A sharp turn to far left authoritarianism appears to be the final offering, and tragic fate, of the once democratic Democratic Party of America.

VII. A Flickering Torch
19. The “F” Word
A Measure of Freedom

There is a word that insinuates itself into an environment in which liberty is disintegrating. The word forms from a state of mind that takes hold when a significant portion of the population feels a deepening cynicism, a corrosive distrust of existing institutions, and a yearning to surrender to a new order. At the same time those defending the order of freedom, even if they remain a majority, find themselves losing from their vocabulary of counter attack one thought, one word, one symbol of liberty after another. One segment of society embraces radical change, and precipitates among those defending liberty acts and ideas that cause one, then another, and another, of the anchor holds of social stability to be let go. Beliefs and practices that have guaranteed democracy and the rule of law come to be scorned as outdated and useless, and fall into decay. A new leader appears with spellbinding promises of change, a new beginning, and the intoxication of remaking a failed social order. At the crucial point the “F” word emerges from under the wreckage, still in disguise for a time perhaps, but ultimately unable to conceal its true identity. Fascism then raises the enticing mirage of a completely new and more perfect order.

George Orwell was a superb master of words, their meaning, and their invaluable use as political weapons, as his books Animal Farm and 1984 amply demonstrate. Well over half a century ago Orwell observed that the word “fascism” had so deteriorated from overuse that it had come to mean nothing more than “something not desirable.” And so it has for many. For some it may have no meaning at all. One might then ask whether a word such as “love” has also deteriorated from overuse. The one word, in its purest form, denotes a mysterious and wonderful experience. The other word warns of dehumanized horror and oppression. There is no adequate substitute for either of these words, overused or not. Love speaks for itself, and may be unspeakably divine. Fascism denotes events unspeakable in quite a different sense. Use of the word fascism, if there is no adequate substitute, requires clear definition.

Benito Mussolini, Italy’s fascist dictator of the twentieth century, put the matter succinctly: “Everything within the State, nothing outside the State, nothing against the State.” It is that simple. Fascism claims everything within the State, allows for nothing outside the State, and will tolerate nothing against the State. The individual is nothing; the state is everything. Fascism understood as a generic term includes Communism, the Nazi type of fascism, or Islamic fascism. These and similar societies can be subsumed under the generic term “fascism” for the principal reason that life in all such societies is reduced to the same level of oppression and misery for the people who must endure them.

The individual is eliminated and vanishes in unquestioned submission to the will of those who wield state power. All social and cultural institutions not controlled by the state are dismantled to ensure that the regime gathers within itself total command over all aspects of society. This is true no matter what rationale for the creation of such a society might be used, or what slogans, visions, or beliefs might be employed to bring it about. The term applies whether such a tyranny calls itself Nazi, Communist, the Taliban in Afghanistan, or the mad Mullahs in Iran.

To assess whether, or to what extent, fascism is present or developing in a given society is not difficult. We need only to imagine a scale of values against which to set leading social indicators of that society. Democratic freedom lies at one end of the scale and fascism at the other. Law and reason lie at one end of the scale, will and emotion at the other end. One end is rational, the other irrational. At one end of the scale are institutions that allow people to make their own judgments, at the other end are institutions that allow an elite to rule, dominate, and oppress. At one end is the individual, endowed with inalienable rights, at the other end are clods of faceless putty in a uniform mass. Private property is recognized and protected by law at one end of the scale, state ownership or control prevails at the other. At one end there is the family of the mother and father and their children, at the other end the “village” raises the children. At one end is liberty, at the other tyranny.

A fascist state can form out of a free society by violence and intimidation. Or fascism may arrive by slow accretion within a wide range of social and cultural disintegration. In either case certain conditions must be met. The concept of the individual must be ridiculed and destroyed. The authority of the family must be subverted. Education must be a tight state monopoly. A godless state religion must be substituted for traditional religion as the source of culture and morality. The value of human life must be degraded. One of the most reliable indicators of all foretelling a fascist order is a deep layer of fanaticism, intolerance, and hatred within the forces of ascending tyranny. When these conditions occur, singly or in multiples, the social pulse can be taken according to their density and distribution along the scale of values suggested above. Where various symptoms cluster along the scale signals the health or peril of the body politic.

A society can begin falling under constraints of oppression, one after another, for various reasons, or even with no conscious reason or intent. What are the consequences of “bioethicists” insisting that humans are no better than animals? Where would destruction of conventional marriage leave us? What is the effect on the human psyche of obscenity presented as art, or of “music” that reduces women to “ho’s”? What kind of citizens are schools and colleges turning out? What effect is any of the matters discussed in this book likely to have on the enrichment of the American heritage, or on a drift toward oppression? These are elements of the present Civil War, fought with words and ideas, not guns and bombs, but deadly all the same.

An essential goal of fascist revolutions is to destroy the integrity of the mind. It is to incapacitate the civilizing authority of what Freud calls the superego. It is to neutralize the individual’s will to shape his own destiny. It is to block off the thinking mind from thought. It is to despise logic, deprecate reason, and disparage law and morality as implements of white male oppression. Finally it is to fill the cavity of the mind, once so emptied, with passion, hot and irrational. This process can succeed more easily when the youth of a society are not taught the richness of the culture they have inherited. Those who do not understand the long struggle to wrench freedom from tyranny are less vigilant in its preservation. As is often observed, those who do not know history are condemned to relive it.

In an individual mind the symptoms of fascistic drift may include anger, righteousness, coercion, lawlessness, boredom, a sense of injustice, a feeling of victimhood, the excitement of the mob, or idealism beyond human capacity. These must be coupled with a passion of hate that strips the mind of civility and fills it with fanaticism. Fascism is a revolution not so much of the mind as against the mind.

Hatred, the core element of the fascist mindset as the German Nazis so baldly demonstrated, may speak at first in muted tones. In those dispensing government largess condescension toward their charges merges into contempt, then hatred. Columnist David Pryce-Jones identifies the “insufferable and unexamined conviction” of the planners and fixers that they know what is best for other people. These he calls “liberal wreckers… mutants from Communism and socialism.” The worst of the environment they create is in its moral climate, as one by one rungs are struck from the ladder to human dignity. There may be no intent at the beginning to create a fascist system. Yet, when the smiling façade of government compassion cracks it often reveals, not that far beneath the surface, the scowl of a lust for power.

Doing Harm

The twentieth century British poet and critic T. S. Eliot wrote that, “Half the harm in this world is done by people who want to feel important. They don’t mean to do harm—but the harm does not interest them. Or they do not see it, or they justify it because they are absorbed in the endless struggle to think well of themselves.” The spirit of freedom is ever in danger of being caught up in the gossamer chains of idealistic perfection. The drowsy and discontented dreamer may not carry a clear message, only a felt need. He rouses himself to satisfy his need to do good by marching endlessly in the Movement of the Day. Any movement will do, from “No More War” to “Save the Ring-Necked Toad,” so long as it distracts him from that which he can no longer endure, which is himself.

Eyes gleaming and jaw set to righteousness, he invents or adopts as many new visions of moral virtue as may be required to keep on marching. He remains devoted to protests and indignation whether or not he ever accomplishes anything, and no matter what harm he may do. His need is to profess high purpose and to feel good about himself. That is how half the harm in the world is done. There can be little doubt that such a condition as this brings many to swell the ranks of the Civil War.

The crucial question is how the other half of the harm is done, the harm that is done intentionally, with motivation well beyond crystalline causes and virtuous feelings; the harm done by the true revolutionary. George Gilder, a writer, futurist, thinker, and pundit, among other attributes, identifies revolutionaries as impatient with doing good and contemptuous of half measures. They want to plan society to the last breath of discretion, and impose the plan’s implementation against the last shred of individual dignity. “Reaching for control and certainty,” Gilder cautions, “we end up in the embrace of evil.” That is the aim of the second half of those who do harm, those passionate for power, who must impose their evil on others to validate their victory.

Karol Jozef Wojtyla faced Nazi repression as a clandestine student for the priesthood after Germany invaded Poland in 1939 to begin World War II. Communist repression followed when Poland was “liberated” by the Soviet Union in 1945. Wojtyla was ordained to the priesthood in 1946 after the war ended. When in 1978 Karol Cardinal Wojtyla, as he had by then become, was elected Pope John Paul II, he knew firsthand who the enemies of freedom are and how they operate. Speaking at the Denver youth rally in 1993 the Pope warned that humanity is engaged in an “apocalyptic combat” against “the culture of death.” The culture of death rejects the humanity of the human creature as flawed and inadequate. Human nature must be changed, and we must believe that to be possible. The culture of death is a zealous craving for the power to replace a religion of heaven with a religion of heaven on earth. The culture of death is the destiny of attempts to enforce a vision of perfection that cannot be, yet rises up again and again to say it can. Such a vision is fascism in some form.

Even in the freest of societies there is an undercurrent of despotism, of impatience with the goodness of freedom, a weariness of the effort required to maintain a free and open society. The student insurrection at Berkeley and on other campuses in the nineteen sixties could not have ignited the intense response across the country that it did had there not been ample fuel ready for the spark to be struck in a substantial and willing minority. The Civil War forces are composed both of those who do harm unwittingly, and those who intend to do harm. Many who support the war believe they are doing good, and ignore the harm they do. They are led by those with intent to do harm, as much harm as necessary to advance their command and control of American society.

“Socialism Is Dead”

Socialism, a close cousin of fascism, also begins as a construct of the mind. It is presented as a scheme to improve the human condition by way of modest, even democratic, control of human behavior. However enticingly modified, socialism is a subset of the same disconnect from human nature as fascism. The genetic tendency of the socialist dream of equality, justice, and perfection is to morph into the nightmare of the fascist state: the Union of Soviet Socialist Republics, or in Germany the National Socialist Party. Not all states called socialist evolve to the totalitarian end of the scale. But all that is required to hasten the journey is a determined assault against a population that is losing faith in its own tradition, its common sense, and its capacity for renewal. The enticement may include oratory as brutal as Hitler’s, or be of a more golden-tongued and seductive mode, depending on the nature of the population to be controlled or the deceptive skill of the orator.

For much of the twentieth century socialism was touted as the wave of the future. Riding the high tide of history socialism was to be woven inevitably into the fabric of the democratic state, forming a warm cocoon of super-Scandinavian-like nannyhood and a free lunch for all. Then socialism, it seemed, had run its course, whether called Communism in the Soviet Union or Nazism in Germany. Socialism had been tried and failed. “Socialism,” it was said, “is dead.” Yet its coffin has never been displayed. There has been no final service. And there is no gravesite that can be visited to assure this is true. Even in Lenin’s tomb on Red Square in Moscow the preservative of his pickled remains somehow makes him seem, as intended, more alive than dead.

Russian mathematician Igor Shafarevich observes in his book The Socialist Phenomenon that socialism has existed throughout history in one form or another. In the West more often than not that has been in the form of Christian heresy, such as the Brethren of the Free Spirit that flourished in northern Europe in the 13th and 14th centuries. The Brethren held all things in common, including wives, and believed that for the enlightened (themselves) sin was impossible. Historically the basis of Christian socialist heresy has been a rebellion of the educated class against the constraints imposed by God and His Creation.

Though not called socialism or fascism in earlier periods, the symptoms are the same. There is the aim to prohibit private ownership of property, to abolish the traditional family, and to eradicate traditional religion. In place of these institutions there is the promise of material equality for all and the elimination of individual and gender differences. Shafarevich calls the socialist enticement a ceaseless war against what is normal, a striving for self-destruction and nothingness. The socialist phenomenon, says Shafarevich, inherently “seeks the death of the human race.” Shafarevich held a position at Moscow University, then under the Soviet Union, when he wrote his book. He looked the monster in the eye as he set his words to paper.

In America the Plymouth Colony, established with the landing of the Pilgrims in 1620, created an idealistic agricultural economy. The land was to be common property worked by all, its products to be shared equally in a spirit of commonality and good cheer. The model for the Pilgrims’ scheme of production and distribution was that laid out by the ancient Greek philosopher Plato in his classic work The Republic; socialism by another name. In Plymouth Colony it soon became apparent that those more lazy or indolent would habitually show up late for work in the fields. Why not? All would be rewarded equally no matter how well they worked. Deprived of the fruits of their labor the industrious also began showing up late and worked less hard. The result was food shortages, suffering, illness and bad feelings all round.

It became obvious to Governor William Bradford that the communal system was not working. It was contrary to human nature. The industrious who followed the rules and produced the food resented having the product of their work taken away to feed the indolent and indifferent. Bradford decreed that thereafter each family would have its own plot, and that all produced on that plot would belong to them. They could eat it, store it, or sell it as they wished. At the end of the first year under the new regime the harvest was bountiful. There was a surplus of food and the colony gave thanks to the Lord for their good fortune. The first Thanksgiving may have been a celebration of the triumph of free enterprise capitalism over collective idealism, long before Karl Marx wrote to repudiate such a system.

The great American literary critic, Lionel Trilling, half a century ago found Marxism a particularly dangerous form of socialism. That was because it combines “a kind of disgust” with humanity as it is “and a perfect faith” in humanity as theory can imagine it to be. Trilling warns that any idea “unconditioned by reality” can slide easily into tyranny. He sees the lesson of George Orwell’s Nineteen Eighty-Four to be the tyrannical power the mind can develop when it severs its connection “from the bondage of things and history.” The result, in the mind or in practice, may be called collectivism, communitarianism, socialism, or some other variant of collectivism, some other pattern of fascistic tendencies. The present Civil War induces Americans toward the same glowing image, a fresh icon of perfection ornamented in a more chic and alluring design. Author and commentator Michael Knox Beran notes that Bill Clinton spoke of communitarianism in his second inaugural address. Beran calls Hillary Clinton’s book It Takes a Village “her meditation on the theme.”

That the core ideals of socialism have not gone away British historian Eric J. Hobsbawm demonstrates in his book Interesting Times: A Twentieth-Century Life. The book is about Hobsbawm’s infatuation with and service for Communism. He relates his “indulgence and tenderness” for “the memory and tradition of the USSR” after the Soviet Union collapsed in 1991. Hobsbawm evokes the nostalgic “mass ecstasy” of marching with his comrades, which he calls something of a sexual experience. Maybe better, for to Hobsbawm the “climax” of the march “can be prolonged for hours.” Even Viagra can’t do that.

The great American free market philosopher Ludwig von Mises, born in Austria, writes that socialism and capitalism cannot long coexist. When rules and directives suffocate choice we are left with the ruin that accompanies perpetual meddling and leveling. Still, those unconditioned by “things and history,” even when they admit to the stench of their failed idealistic experiments, hold to the ideal. If there was needless oppression and slaughter, it was just a “mistake,” and “next time we’ll get it right.” It is the party of “next time” that stands at the door of tyranny, and battles incessantly to destroy its opposition and its country. The communal system does not work, but its ghost still howls about injustice, promises change, and seeks a new testing ground.

Socialism is dead? Not quite. It just needs a new name.
Sociofascism

The job of the prophet is to make the revolutionary’s existence seem “glamorous, darkly fun, and, above all, spiritually heroic,” says lawyer and writer Michael Knox Beran. Michel Foucault, the French postmodernist philosopher, calls revolution “a violence, an intensity, an utterly remarkable passion.” The appeal of revolution is both visceral and cerebral. It can even be sexual as Eric Hobsbawm so vividly confesses.

The perpetual revolutionary truly believes that, with just a little tinkering, he can fumigate the horrors of his past indulgence in the sparkling visions that went dark in the Nazi concentration camps, the Soviet Gulag, and the “Great Leap Forward” of the Chinese Communists. The true believer still aches for something to replace the failed image of the New Soviet Man, or the Ubermensch of the German Nazis, the Overman or Superman. It was the nineteenth century German philosopher Friedrich Nietzsche who conceived the Ubermensch to represent the next step in the evolution of man.

Whatever such societies are called, pursuing the glowing mirage of the Overman inevitably brings forth new masses of ashen Undermen, kept down by cannonades of propaganda, or brutal enforcement if that fails. That the lost pilgrim does not choose to recall. He is blind to the machinery of evil that feeds on the worst crumbs of human experience. Nor does he wish to know that in the shadows of his flawless image, amongst the bottom dwellers of society, there are always the jailers, the torturers, and the murderers waiting to be recruited to the cause. Or that he could become one of them. The tens of millions of corpses that destroyed his blissful visions of the past were only a mistake. Tomorrow, “next time,” he is sure he will arrive at the green meadows of his dreams in pristine and odorless purity.

Revolutionary idealists are likely to be ignorant of history, emotional in motive and theory, and above all intolerant of opposition. A widespread rejection of rational thought and the lessons of human history makes these insurgents all the more dangerous. The hordes that fill the streets from time to time do not hide their distaste for American institutions that inhibit their passionate drive to power. A sign observed during a street demonstration in Seattle says it well: “NO COMPROMISE! NO UNITY!” That demonstration is a typical combination of characters. There are the leftover sympathizers with Communism from the Cold War; the graying rebels of the sixties; students from the University of Washington, force-fed the political correctitude of the contemporary American campus; and America haters of other stripes and causes as well from time to time. It was crowds like this that the first Soviet dictator, Vladimir Lenin, gladly welcomed as “useful idiots.”

The gossamer idealist invariably shrieks, when cornered, that he is being attacked by far right extremists, by religious zealots, by corporate villains, by despoilers of the earth. Those who dare oppose him he labels fascistic, and attempts to destroy their character without responding to, or even hearing, the substance of their arguments. That is a common exercise in what psychologists might call projection. Insurgents grasping for power work hard to project onto those who oppose them vivid images of the tactics and goals they themselves pursue. When such as Hillary Clinton raise the specter of a “vast right wing conspiracy” against America, it is to conceal their own plan to rule and regulate their fellow Americans in every way possible. It’s a neat trick when you can get away with it, and these people often do.

Jonah Goldberg is a contributing editor at National Review and author of the book Liberal Fascism: The Secret History of the American Left from Mussolini to the Politics of Meaning. Goldberg writes that today’s revolutionaries look to both the New Deal and the uprising of the 1960s for their inspiration. In both of these Goldberg finds “the parallels with classic fascism too obvious to ignore.” His list of parallels includes “the cult of action, the glorification of violence, the exaltation of youth, the perceived need to create ‘new men.’” Conventional morality and traditional authority are rejected, street mobs and “people power” are glorified, crime is justified as political rebellion, and the rule of law is said to be “a form of oppression.”

Liberal revolutionaries vow to end talk radio, which they see as part of their feared “vast right wing conspiracy.” In typical doublespeak this is to be done by reviving the “fairness doctrine.” That would require some form of “equal time” for “liberal” views to counteract the predominantly conservative views expressed on talk radio. The left was driven to this after attempts to field liberal talk shows largely failed for lack of public interest. Perhaps that is because liberal talk shows are essentially redundant, being repetitions of what is seen daily in most of the major media of the country. The public tunes in to conservative talk radio in order to get another side of the story. When the public votes with its radio dial for what it wants to hear the liberal extreme loses the vote. Forced feeding of the revolution is required to repair the damage. The Civil Warriors know their aims would not prevail were they fully understood by the American people.

Essayist and philosopher Roger Scruton observes that suppression of opposing views is often motivated, not only by those fearing their own aims might be exposed, but also by those fearing to know their own errors. Scruton cites the nineteenth century British philosopher John Stuart Mill’s observation that when we discourage dissent we perpetuate error by making it impossible to see our own mistakes. The Soviet Union suppressed dissent, and it took 70 years for the truth at last to wriggle into the light of day. But only after some 60 million people had died in defense of the error.

To demonize capitalism and condemn the free market is a widely practiced political sport both in America and abroad. It is a brainwashing precursor to the worship of collectivist oppression. A democracy that is wavering and unsure of itself, threatened from without and enduring blasts of hatred and destruction from a Civil War within is in grave danger. Columnist Mark Steyn observes that, “Much of the Western world has a big hole where its sense of identity ought to be.” Much of America has dug into the same hole. A fading sense of identity opens the way to disillusion and despair.

A nation losing its social cohesion does not necessarily hurtle through the gates of a fascist “paradise” all at once. The symptoms may accumulate as a minor constraint here, a larger oppression there. Congress tells us that toilet flush boxes must hold only 1.6 gallons of water. Incandescent light bulbs are out; compact fluorescent bulbs are in. A bureaucratic determination that the spotted owl is an endangered species shuts down whole industries and towns related to timber harvesting in the Pacific Northwest. In Congress there are persistent threats to grant the federal government authority over every water hole in the land, public and private, in the name of environmental purity. Are you eating the wrong foods? Getting too fat? Somewhere in Washington people are paid to worry about things like that. New programs require droves of new regulators to impose massive new regulations. The social landscape is pockmarked by petty interdictions and mined with coercive penalties. After a time any who might still be inclined to protest are worn down to acquiescence one by one through what one observer termed a kind of “slouching fatigue.” Petty tyranny, like the teaser for a new movie, is conditioning for the Big Show.

The philosophers of collective repression, comfortable and tenured, puffing on their pied pipes of illusion, may never intend to go out and do anything themselves to effect their barbaric theories. But those they teach may. What editor and commentator John O’Sullivan labels a Marxist-based elite has spawned “a substantial lumpenintelligentsia” of teachers, clergymen, and “knowledge workers.” The lessons they learn from their intellectual sponsors are clear and unequivocal. No laws, no social norms, no ideas of decency, nothing of what could be called Western civilization need be acknowledged or respected. All must be destroyed. Carefully crafted words and the edicts of political correctness may do for now. Intimidation, terror, violence, and worse remain the reserve instruments to acquire and maintain power.

Class warfare that is not called class warfare will be employed to complete the destruction. It will be a three-class pincer movement. The top class will be Big Business and the truly wealthy joined at the hip to Big Government and its army of bureaucratic enforcers. At the bottom is a growing dependent class insistently told they are unjustly held down. This class is bought off by welfare, by having to pay no income tax, by the prospect of nationalized health care, and by the incessant chant that they are “victims” to whom society owes retribution. The top and bottom classes form the jaws of a gigantic vise turning slowly and painfully to squeeze vitality and life out of the productive middle class caught in between.

Those trapped in a tightening grip are the businessmen, inventors, entrepreneurs, salesmen, promoters, technicians, and creators of new ideas and new jobs who grow the economy. Economic creativity and productivity decline sharply. The juice is pressed out of liberty. The “negative” rights of the Constitution are crushed in favor of “positive” rights granted by government. The forms of freedom are twisted into scrap. Once the country is “remade,” government officials will direct every aspect of society in the name of the people, as in “People’s Republics.”

The fascist-in-waiting bides his or her time, summoning recruits-in-waiting; those who despair of conditions as they are, lost souls who search for a new light to follow into the next heart of darkness. The time ripens when circuits of the cultural matrix that once fused moral thought to action dissolve, a few molecules at a time, and disconnect. The ground is prepared for a great transformation; some join in with gossamer visions, some with a hunger for power. Even those who cannot deny the history of collectivist terror remain certain that, “Next time we’ll get it right.”

The stage is set, the actors are in place, and the script is written—and re-written on a daily basis. The Civil War is advancing its cause, while many institutions of Liberty and civility are doubtful or in decay. American society is poised at a critical tipping point. Will it be reaffirmation of our founding principles and reinvigoration of the resulting institutions? Or will it be a new kind of fascism unique to this country? If it is the latter, Americans will need to be coddled into believing that the social structure is being preserved, not destroyed; that what is happening is reinvigoration, not destruction. The façade of societal reform will be dressed in fetching attire to conceal the reality underneath. To reflect such a condition, call it sociofascism.

Is this new regime actually as bad as those original Tea Partiers thought it was? It has been nearly a year and a half now; let’s look at some of the evidence.

20. Hate, Hope, and Change
Hate

Hatred is a prime motivating force of the authoritarian mind. Adolf Hitler acted on his belief that hatred was the very fuel of the Nazi enterprise, the energizing substance of the machine. The American presidential elections of 2000 and 2004 elicited in the Democratic Party a passion of hate remarkable in its intensity. Celebrities such as Michael Moore, Ben Affleck, or Barbra Streisand, seen in Hollywood as “heavy thinkers,” exuded hatred toward President Bush that was extraordinarily vitriolic. A similar fanatic hatred toward the President and his policies came to infect the whole Democratic campaign effort. It was a passion that leapt the bounds of fact, reason, or policy differences. Hatred extended to the ritual eccentricity of intellectuals in colleges and universities, to the think tanks and, only barely disguised, to most of the establishment media.

Hatred needs a justification, something to stir the vitals and enflame the mind, a target to be destroyed. A core rationalization for the venomous Democratic hatred of President George W. Bush and the Republicans is that the election of 2000 was “stolen” from them in the Florida recount, which they had insisted upon in hope of upsetting Bush’s narrow victory. Was it stolen? By the established rules of the game Bush had won by a hair, and Florida Secretary of State Katherine Harris was prepared to certify the results. George Bush knew it. Al Gore knew it. Gore called Bush personally and conceded, saying he was on his way to make a public concession at Democratic election headquarters. Then somewhere along the way to his public concession lawyers got hold of the case. They thought they might be able to muddy up the waters sufficiently to emerge with a Gore win. Gore agreed to give it a try. He called Bush back and announced that he was not going to concede after all.

The waters did get muddy. Imaginative legal wrangling occurred that finally forced the U.S. Supreme Court to decide the case. The Court affirmed the original determination of Secretary of State Harris to certify George W. Bush as the winner. A subsequent objective consortium of national news organizations reviewed the Florida ballots and agreed that Bush had won. Gore did not have the election stolen from him. What, then, did happen to occasion such volcanic hatred amongst the Democrats?

It was when their own fraudulent attempt to steal the election from Bush failed that the Democrats went livid with rage and frustration. They had not only lost, but they also had their hands caught in the cookie jar. They needed a target for their seething rage at losing the election, compounded by exposure of their failed attempt to steal it. The angry losers needed a myth of victimhood. The false grievance of a stolen election, pounded endlessly in print and over the airwaves soon became endemic to the whole Democratic Party. The Big Lie became the Truth: George Bush stole the election of 2000 from Al Gore.

When the Democrats subsequently lost the election of 2004 by over three million votes, the cries of unreasoned hate, amplified by a second defeat at the hands of the same enemy, resonated in even greater intensity. The Democrats’ Party Chairman, Howard Dean, shirtsleeves rolled up, face painted red, verging on apoplexy, screamed, “I hate Republicans and everything they stand for!” So as to leave no doubt, Dean explained further. Politics, he said, “is a struggle between good and evil. And we’re the good.” Hatred needs refueling from time to time.

In the election campaign of 2008, by contrast, the hatred and vitriol of the Democratic Party’s trademark character assassination was subdued, a reserve hidden behind the glitter of Mr. Obama’s seductive oratory. Wigwagging back and forth between his two TelePrompTer screens, the message of change and hope poured out in a balm of reassurance. That is, it did until Sarah Palin entered the scene. Then the lid came off. Suddenly there was an opponent who drew crowds at least as large and enthusiastic as Obama did. Governor Palin electrified the conservative base of the Republican Party, which had been less than ecstatic over its own candidate’s bumbling campaign that lacked any clear message or true conviction. Gov. Palin was another matter. It was she, not her stumbling running mate, who was a threat to the Democratic Party. She had to be destroyed by any means necessary. The candidate, the “good cop,” remained serene and placid above the fray. He stood aside while hate spewed forth with a vengeance from the rest of the Party and its supporters.

To destroy Gov. Palin it was necessary to scrape and scour for every wart or carbuncle to be found in her public, private, or family life, and to enlarge anything found to hideous dimensions. Endless charges of misbehavior in office were filed against Gov. Palin, none of which were proved. And how dare she flaunt a handicapped baby in the face of abortion on demand. And whose baby was it? Hers or a daughter’s? The “news” that her 17-year-old unmarried daughter was pregnant was front page above the fold in both the Washington Post and the New York Times. The press was shocked. Shocked! As though no one had ever heard of unwed mothers these days. And from the devotees of “free sex!”

Then there was late-night host David Letterman commenting that Palin bought makeup in order to update her “smutty flight attendant” look. The stench of this sort of “campaigning” makes its own statement. At its bottom the roar of vituperation, venom, and dirt directed at Gov. Palin and her family was the howl of fear; fear that Palin’s charisma might win the election, which it very nearly did.

But the visceral basis of liberal hatred of Palin was deeper than party politics. Sarah Palin represents what the founders of this nation had in mind as citizens. She is the epitome of the great middle class that has sustained this democracy for over two centuries. She is the antithesis of the radical chic, the professoriate, and all the others who itch to tell their fellow citizens how to conduct their lives.

In any objective sense Sarah Palin is a wonderful example of liberated femininity. She has been successful in business, in politics, even in beauty contests. A “poster girl” for women’s rights? A triumph for the “fem-libs?” Not quite. To the anointed elite she is a crude creature of the tundra, a grotesque apparition appearing out of nowhere, hardly a cut above what they would call “trailer park trash.” Everything she stands for taunts political correctitude. Precisely. Those who work so diligently to remain politically correct just can’t stand it. Nor can they understand it. To the thoroughly conditioned liberal-progressives of this country, Sarah Palin and the middle class America she represents are a different race, a lower class, dirt under their fingernails, repellant. They ought not exist to pollute the idealistic perfection of their betters. They are disgusting.

Myths of stolen elections, the lives of opponents corrupted or degraded, and the alleged victimization of their own constituencies are instructive keys to the nature of liberalism and its infection of the Democratic Party. They reveal dependence on fantasy over fact. President Obama returns again and again to the corruption of “the last eight years” to excuse his own errors and failures. American Enterprise Institute national fellow David Gelernter writes that as the substance drains out of the left-wing agenda “nothing remains to feed on … but the bitter weeds of hate.” It was out of these “weeds” that there developed what Gelernter terms “the tragic, pathetic upsurge of hatred for George Bush” that Obama continues to exploit. (Less convincingly each time he tries it.)

This follows the pattern of progressive liberalism as analyzed by novelist Allen Drury. Drury sees liberalism as having sunk into a pattern of “rigid, ruthless, intolerant, and unyielding orthodoxy.” This evolution of liberalism lies at the core of the Civil War, and has been building for years to the culmination Drury succinctly identifies. And Drury’s analysis was offered even before such ultra radical hate-filled Internet blogs as DailyKos and MoveOn.org moved in. These blogs have claimed a major influence, and have dragged liberalism and the Democratic Party ever farther toward the left cliff of American politics.

Journalist and columnist Andrew Sullivan warns that “decadent left” organizations, centered on the east and west coasts, hold such extreme anti-American views that they “may well mount what amounts to a fifth column” in America. The term “fifth column,” coined during the Spanish Civil War, was applied during World War II to subversive elements within America working on behalf of Hitler. The militant, enflamed, and dedicated legions of the Civil War, eagerly engaged in boring from within, fit the concept of a fifth column. The Washington Times editorial page editor Tony Blankley terms the Democratic Party’s reaction to the Iraq war “the most blatantly unprincipled war opposition short of treason in living memory.” He wrote with kind restraint.

That the hatred originating within the Democratic base appeals to a large number of Party followers was affirmed in a Scott Rasmussen poll taken prior to the 2004 election. The poll asked prospective voters for George W. Bush and John Kerry respectively whether America is “generally [a] fair and decent” country. Among Bush voters 83 percent agreed that it is. Among Kerry’s voters only 46 percent responded that this country is “fair and decent.” Similar polls find that something like one-third of Democrats believe President Bush had a hand in planning the 9/11 attacks! If your child is searching for an occupation with an assured future clientele, you might suggest psychiatry.

Hope

In December of 1964 Chancellor Edward Strong at Berkeley was confronted with a rebellion he did not understand. His response to the rebels was that of civility, to “reason with them” and “seek a common ground.” The rebels played a different game, one that did not recognize rules of civil discourse. When the Chancellor asked, “What do they want? What do they really want?” neither he nor most of those around him understood that they were confronted with a revolutionary situation. What the rebels “really” wanted was to destroy the authority of Chancellor Strong and University President Clark Kerr and to get them both fired. The goal was to eliminate established authority, not to reason with it. There was no common ground to be reached. The Chancellor and the President did not know that there were only two choices: defeat the rebels or capitulate to them. The result for both President Kerr and Chancellor Strong was loss of their jobs, and the belated knowledge that the only “compromise” with revolution is surrender.

When Barack Obama appeared on the scene running for President the response of the American people to the candidate was similar to that of the Berkeley administration to the FSM. They asked themselves the same questions about the Democratic candidate, of whom they knew virtually nothing. Who is Barack Hussein Obama? What does he want? What does he stand for? What is his true character? What does he really want? The response of the candidate was glittering rhetoric with little substance. Instead the candidate offered an Easter basket of beguiling clichés of Hope and Change snuggled amongst “Yes We Can” tufts of diaphanous fluff. Anyone favorably drawn toward the candidate could easily find in his basket of gaudy eggs one or two with a coloration they could fit into their own hopes for change. And then shout heartily with the crowds, “Yes We Can!”

The true goals of the candidate were never drawn in revealing detail. For the mass of voters the Obama campaign of 2008 revealed little of what to Hope for in the season of Change to come. “Yes We Can” was perfect cover for the moment. The candidate never articulated the massive change that is now being attempted. That was neither perceived nor voted upon by the American public. The American voters in 2008, like the blindsided University of California officials in 1964, were not aware that they were dealing with a revolution. Barack Hussein Obama and his “golden oratory” were the perfect front for the hard left that now has its hands around the necks of America and the Democratic Party. Those who have never been happy with the constraints of democratic constitutional government now swing the wrecking ball of its destruction.

Such revealing statements as Obama did make on rare occasions were never closely analyzed to evaluate his character and ideals. Yet his words, if listened to carefully against his background, even during the campaign, did etch in ominous outline his vision of America. Obama said occasionally that the kind of change he had in mind would be “transformative.” He would “remake” America. No specifics of what that meant were offered. Pick out a gaudy egg of your dreams from his colorful basket and don’t worry. If Obama’s transformative rhetoric was noted at all it sounded to most voters more like typical overblown campaign filler than anything seriously contemplated. Neither the fawning media nor his sleepy opponent pressed Obama on the matter. Sarah Palin tried to, but was overruled. There was also Obama’s off-the-prompter comment to “Joe the plumber” that he intended to “redistribute the wealth.” But even then there was no serious public discussion about how heavily that shadow of Karl Marx might hang over the candidate or his presidency.

A leading index to the real Barack Hussein Obama could be found in pre-election words spoken about the prospective appointment of Justices to the Supreme Court. Mr. Obama boldly stated that he wants “my judges” to help people who are “in the minority,” or “on the outside,” and who “can’t protect themselves” from being dealt with “unfairly.” To do this “his” judges must bring their own “perspectives,” “ethics,” and “moral judgment” to their decisions. There is nothing in that about the Constitution or the rule of law, or other principles upon which this nation was founded. We have seen in earlier chapters of this book what happens when judges are set loose to roam in the wild realms of their own “moral judgment,” “ethics,” and private “perspectives.” They are free to make up any kind of “law” they might fancy on a given day, and brazen enough to claim it comes from the Constitution.

The attacks on Gov. Palin and the inept conduct of the Republican campaign diverted what scant attention there was away from the true nature of Obama’s background and makeup. Little was said, even by his stumbling Republican opponent, of Obama’s lifelong radical left associations, of his refusal to reveal his Harvard Law School records, or of the effect of his twenty years absorbing the hate-America ranting of Reverend Jeremiah Wright’s “God DAMN America” Trinity United Church. We heard only passing references to Obama’s camaraderie with the likes of terrorist bomber and virulent leftist Bill Ayers, or to the true nature of Obama’s work as a radical “community organizer.”

There was little if any examination of Obama’s prominent role in assisting and funding the radical ACORN organization, involved in multiple incidents of fraud, intimidation, and voting irregularities. Columnist Frank Gaffney, president of the Center for Security Policy, notes that hardly anything was heard of his early childhood in Indonesia. Those formative years seem to have been spent in Muslim schools by a child whose father was a Muslim. Obama makes much of having become a Christian as an adult, but in what church? Rev. Wright’s, of course.

Few Americans understood the real Hope of the candidate himself, or its implications that shone only dimly through his teleprompted rhetoric. For the majority of voters Hope was for some warm and fuzzy vision of Change that seemed to shine so brightly in an attractive young man from the streets of Chicago whom they were about to elect their President.

Change

It did not take long after his election to see the real Barack Hussein Obama begin to emerge from under the clouds of his campaign oratory. Instructive as to the change Obama plans for the country are his disparaging references to rights guaranteed by the Constitution. The Bill of Rights set forth in the first ten Amendments, says Obama, provides merely “negative” rights. That’s his interpretation of rights designed by the founders to protect American citizens from raw and arbitrary government intrusion into their lives. This they did by setting limits to government power. But these rights only specify, Obama chafes, what the government cannot do to its citizens. Such rights do nothing, he complains, to guarantee what the government must do for its citizens. The “negative” freedoms of the Constitution just won’t do in a “transformed” America.

What Obama wants are new “positive” rights—all the good things the Democrat liberals can think of to give out “free” as new “entitlements.” Which will come wrapped in helpful new guidelines of thought, and mandatory rules of conduct. Not to mention whole mountain ranges of debt. Obama will teach us one thing: nothing that comes from the government is free. It takes only so many “entitlements” with strings attached to bind the recipients to the Giver and silence liberty forevermore in the Obamanation. The meaning of “positive” rights is one of the insights, gradually perceived by the people of this great country, which began setting the table for the Tea Parties to come.

Almost at once Obama made it clear that he does not appreciate, and to the extent possible will not tolerate, dissent. Nationally syndicated columnist Michael Barone observes in The Washington Times that, though he basks in the adulation of nearly the entire mainstream media, Obama “whines about his coverage on Fox News.” The motto of Fox, “We Report. You Decide,” is carried out reasonably well. But examining a debatable issue requires presentation of both sides. That means that in any fair report criticism from the opposition will be heard. On one occasion when asked whether he wanted to debate his health care plan, Obama replied no, he just wanted his opponents “to get out of the way.” One real Change Obama offers is to stifle opposing views until one united Obamedia chants in perfect rhythm with the Obaprompters all across the land.

The manner in which Obama attempts to rush legislation through an eagerly compliant Congress says as much about his intent as the content of the legislation itself—were the content ever to be revealed. What Obama’s rush to legislation says is that if the content of his “urgent” bills were widely known and understood the legislation would be likely to fail. So it must be “on my desk” now! Legislation that is rushed, sometimes in the dead of night, or on a weekend when few are paying attention, also says a great deal about Obama’s attitude toward the American people and their democracy. Not to mention the contrast between the “transparency” he promised as a candidate and his hidden machinations in office. On the campaign trail the candidate promised the most open and candid administration ever. No laws would be enacted until their full text had been posted on the Internet and fully debated in Congress.

First was the trillion-dollar “stimulus” bill. It was some 1100 pages that no one in Congress had seen. But the matter was urgent. At the President’s blunt insistence the stimulus bill, though not one member had read even one of its 1100 pages, was jammed through Congress in about 48 hours. The bill then languished on the President’s desk for three or four days while he took an extended weekend off before he signed it. The urgency was not to get its contents working to stimulate anything. It was to get the thing passed before anyone knew about the takeovers of companies, the favors to unions, and the special interest pork hidden in it. Some call it the “porkulus” bill.

Before the “stimulus” could be digested, or even read, the next urgent matter was health care, Obamacare as it has come to be called. Consisting of more than 2000 pages, Obamacare was said to be even more urgently needed than the stimulus bill to get the economy going. This would occur by relieving businesses of onerous health care obligations. And it had to be done before the August recess of Congress in the summer of 2009. That didn’t happen. Even the President’s own Congress didn’t see the need to rush this time. Then it was to be on the President’s desk early in the fall. That didn’t happen either, though the House of Representatives passed its half of a bill. The machine was slowing down. The Senate had to be satisfied with “debating” its version of the bill while its contents remained locked up in Majority Leader Harry Reid’s office. On the Senate floor, and behind its closed doors, there developed what some call the Democrats’ own little civil war among themselves. Republicans were excluded from any participation whatsoever in any of these proceedings. But something had to be done before Christmas. The President had already hung up his stocking.

What he found in it on Christmas morning were two irreconcilable versions of Obamacare, promising a brutal Democratic family feud that would have to be settled in the New Year. But never mind, the President had got something of a down payment on what he wanted in time for Christmas. Later on arms would be twisted, promises would be made, and bribes would be paid with taxpayer money. The 60-vote filibuster-proof Senate majority would take care of the Senate, and Nancy said she could manage the House. Not to worry.

The Constitution requires that the President’s appointment of principal policymaking officers of government shall be subject to the advice and consent of the Senate. This is normally done through confirmation hearings. The Obama administration is circumventing this constitutional requirement by setting up a parallel government of “Czars” to do the President’s bidding without congressional approval. Glenn Beck’s Fox News show counts 32 Czars, while the Capitol Hill newspaper Politico finds approximately 28 czars, 22 of whom are neither confirmed by the Senate nor authorized by statute. These 22 include Czars for International Climate, Pay, Science, and WMD Policy.

As to the WMD Czar, can there be a more vital concern than dealing with weapons of mass destruction? Should this function be carried out with no congressional review and no public assurance that it is designed to be effective? This is especially sensitive since President Obama abolished the security structure of President Bush, which had kept us safe for eight years, in his first 48 hours in office. The failed Christmas airline bombing over Detroit and the Fort Hood massacre by an obvious Islamic militant suggest holes in the security net, to say the least.

Obama’s Czar structure would appear to be an integral part of what might be called the President’s Plan 2. Programs that cannot be enacted through Congress can more quietly be implemented in the shadows of these invisible Czars. The Czars govern with the consent of no one except the President and his immediate staff. As an added affront the administration has announced that it will not allow its self-appointed Czars to be called to testify before Congress to justify their existence or describe their functions.

The Democrats’ handling of the Stimulus and Obamacare matters tore the glaze of rhetorical bliss from the public eye, and gave the public a clearer understanding of what “Change,” never defined by Obama the candidate, actually means. In fact, everything about this administration revolves around a single subject, a single goal: the acquisition of centralized power as absolute as can be achieved. An Obama brigade in the Midwest, speaking more candidly than most of the lubricated voices of Washington, raised these issues to the BAMN level: By Any Means Necessary.

When the Democrats won the 2008 election it was not the misty dreamers who occupied the front ranks. It was the fervent warriors at the Party’s base who stood at the gates of power. If reminded that those in the past who have won absolute power have led their countries only to repression and misery, they may admit that is so. But the reply is always, “Next time we’ll get it right.” When the inaugural doors were flung open on January 20, 2009, those who had waited so long rushed in to command the machine they had finally captured. As they grasp the levers of power, wearing the benign countenances of a Dr. Kevorkian ushering his “patients” to their destiny, their quiet smiles tell us, “Next Time is now.”

Civil War

“We are the ones we have been waiting for.” With that revelation Barack Hussein Obama took command of the Civil War even before he had won the nomination, and as President his prophecy is confirmed. At first diverse, scattered geographically, philosophically, and practically, the Civil War against America grew slowly but steadily, became more conscious of itself as such, and inevitably more subversive. The War now has a centralized, coherent plan, and an undisputed Field Marshal in command, with the others “we have been waiting for” at his side. The direction of the Civil War against America under Obama and his chosen people is not hard to discern.

Columnist and editor emeritus of the Washington Times, Wesley Pruden, points out that Obama is the only President who has no “instinctive appreciation” of the law, history, literature, or tradition out of which America is made. “The genetic imprint writ large in his 43 predecessors,” says Pruden, “is missing from the Obama DNA.” So what is in the Obama DNA? Quite an interesting mix, once examined. According to Islamic law a man born to a Muslim father, as Obama was, is a Muslim.

In his June 2009 speech to the elite of the Muslim world at Cairo, Egypt, Obama referred to his “inner Muslim,” and associated himself with the Muslim Brotherhood. The Brotherhood, founded in Egypt in the 1920s, is the first and most radical of the Shariah movements dedicated to establishing global hegemony for Islam by whatever means required. In practice this means the terror of the jihad using whatever weapons available. A Muslim Caliph would rule the world. Universal application of cruel and coercive eighth century Shariah law would replace democracy, constitutionalism, and the rule of law as we know it. Why would an American President associate himself with such a movement? What does this imply for our efforts to expose and defeat the terror of Islamic extremists dedicated to our destruction?

Alluding to Obama’s early childhood, his long affinity with a radical anti-American church, his favorable view of the Muslim Brotherhood, and his now happy propensity to feature his middle name, Hussein, forbidden during his campaign for office, Frank Gaffney offers a “stunning conclusion” that he sees as “increasingly plausible.” Gaffney, president of the Center for Security Policy, concludes that Obama may have managed “the most consequential bait-and-switch” since Adolf Hitler played the trick on British Prime Minister Neville Chamberlain at Munich in 1938. After giving Hitler a chunk of Czechoslovakia as “the end of Germany’s territorial ambitions in Europe,” Chamberlain went back to England proclaiming “Peace in Our Time.” That gave Hitler one more year to complete his massive rearmament of Germany before he invaded Poland in September 1939 to launch World War II.

We are slowly discovering the full extent of who the real Obama may be. There appears to be in his makeup neither a drop of American pride, nor a wisp of the spirit that has made America great and exceptional. This man who has become President of us all might as well have been dropped down from another planet, so alien is he to the soul of the country he heads. It is this essentially non-American individual who wants to “remake” America. He has already informed us that, “We are no longer a Christian nation.” Obama’s bait was his soaring rhetoric. The switch is to the destruction of America as we have known it. “Transformative” indeed.

The ultra liberal Civil War administration of Barack Hussein Obama is effecting a “coup d’etat.” Napoleon crowned himself Emperor of the French in 1804, there being in his opinion no higher power to set the crown on the imperial head. In the same manner the self-anointed in Washington today crown themselves the best and brightest. At last they are on a fast track to sate their appetite to tell those beneath them how to run their lives. The Shepherd speaks, and the sheep shall lie down and bleat in acquiescence to government power, along with the Hollywood crowd already down on its knees. And power is nothing if it is not power over other people.

This is liberalism in betrayal of the last remnants of this nation’s founding ideas of liberty, individual freedom, and a prosperous free market economy. All that is to be sacrificed to those who have ascended to the heights of government and now grasp for supreme power. A multi-billion dollar stimulus bill is proposed to encourage small businesses and entrepreneurs to hire more people and get the economy moving. The main provisions of the bill contain enough pork to kill a billion pigs and pile up a trillion dollars of debt. It becomes ever more apparent what the strategy of “transformative” change will mean for this country’s economy. A few bags full of pennies will be thrown to private business as decoy assistance, while the rest is to disappear into the belly of the insatiable sociofascistic monster this administration is patiently breeding.

As to tactical means it is helpful to bear in mind an observation of Bill Clinton, who has some familiarity with such matters. Clinton reportedly cautioned that Obama has “the political instincts of a Chicago thug.” Or consider this tactic as stated on television by Service Employees International Union president Andy Stern. To advance their interests, Stern says, the SEIU uses “the power of persuasion.” But if that doesn’t work they turn to “the persuasion of power.” Mr. Stern, who claims to have been the biggest contributor to Obama’s election, is recorded visiting the Obama White House more than any other individual outside the White House Staff. His enforcers, the SEIU logo prominently displayed on their purple shirts, have been caught on video using the persuasion of power to beat up dissidents, both black and white.

When George W. Bush was President the media and other Civil War brigades endlessly criticized, and not infrequently lied about, his policies. When chastised about their more extreme statements the Democrat response was, “Protest is patriotic.” Fast-forward to Tea Party protests about Obama’s policies and spending binges. Keith Olbermann on MSNBC called the protesters “worse than racists.” In the New York Times Paul Krugman said the protesters were motivated by “cultural and racial fear.” Prominent Democrats have characterized protesters as fascists, un-American, and worse. Why? Well, they have the audacity to disagree with policies of the Prophet we have been waiting for, and are expressing their constitutional right to say so. Vilifying its critics when it cannot justify itself on the merits of the issues raised is the trademark of liberalism under pressure.

In a speech in Colorado on July 2, 2008, then candidate Obama called for a vastly enlarged internal security force. In a speech taped and available on YouTube, this is what he said: “We cannot continue to rely only on our military in order to achieve the national security objectives we’ve set. We’ve got to have a civilian national security force that’s just as powerful, just as strong, just as well-funded” as our military. Just what the national security objectives “we’ve set” might be were not articulated. So it is not clear why internal security forces would have to be “just as powerful, just as strong, just as well funded” as the existing forces of the United States Army, Navy, Air Force, and Marine Corps.

We do know of similar internal police forces in recent history. One was called the Gestapo in Nazi Germany, another the KGB in the Soviet Union, with like agencies in similar societies today. These forces were to keep the masses totally subservient to, and incapable of rebelling against, the state. What internal threat can be imagined that would require the American nation to be so heavily policed? Is the use of such massive forces contemplated to meet resistance to as yet undisclosed “national security objectives we’ve set?” That the idea of such a massive internal force lies in the back of the mind of a President of the United States is not a reassuring insight into this President’s intentions and objectives.

Author and columnist Andrew Breitbart points out that Democrats have developed a coherent strategy to defeat “their enemy.” The “enemy,” Breitbart reminds us, is “precisely how they view the Republican Party,” and the Democrats “play for keeps.” Breitbart cautions there is no longer a possibility of true bipartisanship between the parties. Bipartisanship as now practiced by the Democrats works only when Republicans abandon their core principles. A prominent Republican Senator from Arizona has never seemed to learn that lesson. His penchant for “crossing the aisle” in a “bipartisan spirit” was rewarded as might have been expected in the 2008 presidential election. He was defeated by a then virtually unknown, untried, and inexperienced young man from the streets of Chicago.

For the Democrat liberals to be defeated in their plans to take over America, Breitbart warns, the GOP must recognize them as its enemy, just as Democrats think of the GOP as their enemy. It is unrealistic, and self-defeating, to think of these Democrats as merely “adversaries” or “antagonists.” Breitbart emphasizes that we must not only name the names, but also learn to play “for keeps.” Though Breitbart does not use the term, he clearly accepts the fact that we are engaged in a great Civil War for the soul of this country. What are the implications when Barack Hussein Obama proclaims, “We are no longer a Christian nation?” It is the Christian religion that holds all souls to be equally precious. If we are no longer a Christian nation can the American people any longer rely on the “unalienable” right to “life, liberty and the pursuit of happiness” “endowed by their Creator” as set forth in the Declaration of Independence? If we are no longer a Christian nation can we expect to be told next that we are also no longer a constitutional nation founded on the principles of that Declaration?

The nineteenth century British Prime Minister William Gladstone proclaimed the American Constitution “the greatest work ever struck off at a given time by the brain and purpose of man.” Barack Hussein Obama thinks he has better ideas for this country. Peter Wehner, a senior fellow at the Ethics and Public Policy Center, thinks not. The Constitution, he says, “does not bow before a president in a hurry—even a young, charismatic, and impatient one.”

During the campaign in the summer of 2008 candidate Obama, in a rare unprompted moment, let slip a glimpse of an ideal America remade according to the gospel of Saint Barack: “We can’t drive our SUVs, and eat whatever we want, and keep our homes at 72 all the time, whether we live in the desert or the tundra, and keep consuming 25% of the world’s resources with just 4% of the world’s population, and expect the rest of the world to say you just go ahead. We’ll be fine.” No, said the Prophet, “That’s not leadership.” And under his leadership, “That’s not going to happen.” That goes to the heart of Obama’s attack on America and its incredibly productive experiment, both spiritual and economic. The human spirit let loose is remarkably creative. And that is an unacceptable threat to those bent on command and control. Obama wants America chopped down to a size that fits him.

The intent is to stretch the tentacles of government until they touch every inch of your body, and penetrate every niche of your brain. This may not succeed entirely, but those whose eyes gleam at the thought of power—power over you—will not give up. Frank Gaffney, founder of the Center for Security Policy, reports a video released in mid-2009 featuring “dozens” of Hollywood celebrities. Actors Demi Moore and Ashton Kutcher urge viewers to join with them as they “pledge to be a servant to our president and to all mankind.” Liberalism decaying into sociofascism flourishes where hunger for servitude exists.

The battle lines of the Civil War are clearly drawn. On one side are the masses marked out to kneel at the altar of government beneficence. On the other side are the few who would anoint themselves their priests and wardens: “those we have been waiting for.” We are being moved toward the condition in which, as the poet William Butler Yeats predicted, “the center cannot hold.” And should the center fail, tyranny displaces liberty.

The truth behind Obama’s intent is obscured, as it is calculated to be, by what is widely credited as his great oratory. A great orator speaks from substance, with a vision to inspire the people of his country. As German bombs fell on London in the darkest hours of World War II the great oratory of Winston Churchill rallied his people to save their country. Pericles, building upon the exhilarating Greek victory over the invading Persians in 480 B.C., called the citizens of Athens to greatness. Abraham Lincoln at Gettysburg, as the Union’s existence tilted in the balance, vowed that “this nation, under God, shall have a new birth of freedom.” Ronald Reagan, cheerful and confident, spoke of a shining city on a hill, and led the nation out of the despond of Jimmy Carter’s gloom to victory in the cold war, a new era of genuine hope, and prosperity beyond any the earth had ever seen.

These were men of deep beliefs, secure in themselves, with no need for prompting to recall who they were pretending to be on a given day. These great men of history spoke to the patriotism and creativity of their people, and inspired them to believe in themselves and their country. Is Mr. Obama, dour, deadpan, and soulless, with an arrogant tilt of the head, a great orator? Is stroking with soothing words those whose wonderful country he intends to drag down into sociofascist poverty quite the same thing? Is a wigwagging mist of rhetoric, soaring from alternate TelePrompTers, seemingly enticing upon emission but vaporizing when pursued for substance, great oratory? There must be another word for it.

The pattern of Obama’s contemplated redistribution of wealth is at its core a redistribution of power. The model is that enunciated by Karl Marx, the spiritual father of the worst totalitarian horrors of the last century. Marx speaks and our President listens: “From each according to his ability, to each according to his needs.” That is the hymn Marx teaches the choir of believers in his commune, dressed up to make it an idealistic “ism.” Communism with a small “c” was tried on American soil in Plymouth Colony nearly four centuries ago, long before Marx picked up the idea of a “dictatorship of the proletariat.” Colonial communism was rejected simply because it didn’t work. Obama, like Marx, is interested in redistributing wealth only to the extent necessary to accomplish his ardent desire for accumulation of power dressed in a new label. Redistribution of wealth is an attempt to create a mask of populist legitimacy to cover a grasp for power. In the end this leaves a new “proletariat,” its usefulness as a tool for achieving power ended, at the bottom of the heap as always.

Obama aligns himself with the climate changers for the same reason. The climate changers attempt to hide their plan for power as a plea on behalf of “the planet.” How convenient to have a client to speak for, one that cannot answer back. What the redistributors and the climate changers both want is for the long tentacles of government, manipulated by themselves, to touch and corrupt the dreams and the opportunities of every American. None is to succeed as an individual. All are to exist only as supplicant members of favored groups, grasping for government “welfare” in one form or another.

A President in these times must have our best wishes, and our hopes and prayers that he will put the nation first. Our deepest need is that President Obama should come to recognize traditions to be honored, families to be fed, dreams of a better life to be fulfilled, and a great nation to be preserved beyond his own transformative ambitions. Edwin J. Feulner, president of the Heritage Foundation, reports an arresting analysis of the alternative:

“It must be said, that like the breaking of a great dam, the American descent into Marxism is happening with breathtaking speed.” This analysis comes from Pravda, the Russian newspaper that was the official organ of the Communist Party in the former Soviet Union. Would it seem reasonable to assume that those people recognize the shape of Marxist tyranny when they see it coming down the road?

Is this the “Change” Americans were led to “Hope” for?
The Two Commandments

Moses was given Ten Commandments for his people to assure their obedience to God’s will. In the public squares of America, long since cleared of God by the Supreme Court, the Ten Commandments have been trashed. In their place, the field marshals of the Civil War, serving as its high priests as well, and perhaps anticipating a shorter attention span in the modern mind, require only Two Commandments: Be Not Judgmental; and: Be Politically Correct. Incessantly repeated and rigorously enforced, these are the psychological weapons that subvert the defences of American civility and break down the walls of individual judgment and restraint. Through the resulting gaps the legions of Civil War advance and conquer.

Ragged patches are torn, one after another, from the fabric of Judeo-Christian culture, leaving swatches of decay and disintegration. Sustaining institutions are changed, falter, or fail before the onslaught levelled against them. Large segments of the American public have fallen into careless, sometimes willing, obedience to the simple imperatives of those two Commandments. The passion of the 1960s student rebellion against America could not have solidified into a revolution, and the revolution into Civil War, without widespread public acquiescence or indifference. Essayist, author, and National Review columnist Florence King laments the “insipid pride” so many take in being non-judgmental. She faults our eagerness to rely on “maudlin excuses” for almost any sort of aberrant behavior. To which she adds our “bottomless capacity for suffering fools.”

A result is that new “rights” are invented in profusion to favor the deviant, the unproductive, the anti-social, the subversive, and the criminal. It is surprising that we have so far been able to sustain and enforce what’s left of criminal law. To institute and enforce law requires a judgment—yes, a judgment of (sorry, this just won’t go away) right and wrong. But despite their denial and repression by the Civil War the time-tested judgments of our history and our culture are there. They whisper to the inner recesses of our spirits. But too often they cower mute in the shadows, to avoid the ridicule or retribution that being politically incorrect would bring if they were spoken aloud.

Some notice through the smog of political correctitude that the injunction against being judgmental does not apply to those who mandate it. The ministers of revolt, as the curators and enforcers of the First Commandment—Be Not Judgmental—are not so constrained. They are not merely free to judge. They must judge, harshly and relentlessly, if they are to succeed. And they are succeeding. They are judging the sustaining values of America, that shining city on a hill, out of existence. We do not live merely in an immoral age, but in an age of a newly imposed perverted morality. Andrew Ferguson, a senior editor of The Weekly Standard, fears that a substantial segment of the American public feel a need to judge others. This they do under cover of political correctitude. Their need to censor their neighbors, to be alert to the failings of others is “underlying and ineradicable.” That need, Ferguson says, is nearly as vital to them as the need for food and warmth. It is these who flock to the legions of politically correct judgementalism and leave their neighbors no peace.

A free society must have values and standards of behavior designed to keep it free. If these are to be judged into oblivion, the process must be called something other than judgment. That is the job of political correctness. A free society is also a society of toleration. The commanders of the Civil War, the radical strategists at the core of the Democratic Party, and often a majority of the Supreme Court, have cleverly shaped the civilizing element of toleration into the debilitating requirement of political correctitude. It is the thoughts and acts of political correctitude as moulded by the revolutionaries that are not to be judged. It is judgments revealing that the Civil War is designed to destroy the values of liberty that must be interdicted. It is those who would expose the subterfuge of this system of perverted judgment who are forbidden to be judgmental against the horror of that system.

In the meantime the legions of rebellion are judging how best to complete the extinction of the civilization against which they wage their holy Civil War. The insurgents have become the new moralizers. It is they and their supporters and sympathizers who now guide and shape the “underlying and ineradicable” human tendency to judge their neighbors. And, says Ferguson, “the new moralizers, like the old, can’t shut up.” The new morality of enforced political judgments works to drive the capacity for individual judgment to its grave. This leaves the insurgents free to enforce the dictates of the Second Commandment—Be Politically Correct—to do our judging for us. It is the politically correct judgments of a revolutionary ascendancy that are ushering American society toward the deconstruction of itself. More than one student of history has observed that civilizations are not murdered, they commit suicide.

Still, human nature, maligned concept though it is, may yet speak out from beneath the falling structures of the West.

21. Tea Parties
Spontaneous Combustion

On the morning of February 19, 2009, CNBC business reporter Rick Santelli was giving a talk on the floor of the Chicago Mercantile Exchange shortly before the opening hour. No one, certainly not Mr. Santelli, had any notion that he was about to ignite a nationwide political firestorm of immense potential. Santelli was opposing President Obama’s mortgage-relief plan when he suddenly launched into a four-minute “rant” (his term) against bailouts and high taxation in general. Then it was as though an image of the Boston Tea Party crossed his mind.

In the early 1770s a major complaint against Britain’s King George III was the tax imposed on tea imported to the colonies. It was another case of taxation without representation, a flashpoint in the revolution soon to follow. The tea tax greatly incensed the colonists who, being British subjects at the time, were ardent tea drinkers. In the winter of 1773 three ships carrying tea arrived in Boston harbor and moored side by side. The colonists decided to act. On December 16, 1773, three parties of men dressed as Mohawk Indians simultaneously boarded the three ships and went to work. To the cheers of crowds ashore they dumped altogether 342 chests of tea into the harbor, after breaking them open to be sure all the tea would be ruined. The harbor water ran brown for several days.

Inspired by his vision of this early act of American independence and anti-tax spirit, Santelli shouted for a new “tea party” to protest the horrific budget deficits already rolled up by the new Obama administration, with more to come. As it happened the Drudge Report was linked to the Chicago Exchange floor and from there to YouTube where, within hours, the tea party idea had awakened a national sensation. Tea Parties began organizing across the country. Their “coming out party,” as it were, was held on income tax day, April 15, 2009. On that day Tea Parties asserted themselves on a national scale and they have never stopped. They have just kept going and growing.
On the Fourth of July some 500 more Tea Parties were held throughout the country. During the 2009 summer recess of Congress the Tea Party Express was organized. A cross-country caravan hosted rallies in 35 cities, ending in Washington D.C. Many other groups, sharing the Tea Parties’ growing horror of what was going on in Washington, associated with the Tea Party movement and supported the Tea Party Express. These included FreedomWorks, headed by former House majority leader Dick Armey, and the National Tax Limitation Committee founded in 1975. Quin Hillyer, who writes for The Washington Times and The American Spectator, quotes Ned Ryun of the Tea Party Patriots on the need for active participation at all levels. Ryun calls for “at least 10 percent” of Tea Party participants nationwide to “at least think about running for office,” or at any rate “become serious activists at the local level.”

The climax of the 35-city Tea Party Express was a taxpayer rally in Washington, D.C. on September 12, 2009, the day following the anniversary of 9/11. There a million voices informed those whose minds are locked up inside the Washington beltway that there is a country out there. The real America. It is an angry country that will insist on being heard. Homemade signs abounded. One suggested, “Ropes and chains, not hope and change.” Other signs spoke to the bailouts of banks and auto companies: “Let the Failures Fail.” The sign of a teenager pleaded, “Stop Spending My Future.” A senior’s placard took up a theme from Sarah Palin’s warning that Obamacare would lead to “death panels” deciding when the sick and elderly had outlived their contribution to society. That placard simply asserted, “Grandma’s Not Shovel-Ready.”

It was immediately apparent from the panicked reaction in Washington and the media that the Tea Party Express and the Washington rally had done their work. According to a report by syndicated columnist Mark Steyn, the “ruling Democrat-media complex” denounced the Tea Party people as “confused,” “angry,” “Nazis,” “racists,” “evilmongers,” and “right-wing domestic terrorists.” Secretary of Homeland Security Janet Napolitano called the Tea Parties “radical subversives” endangering the security of the country. She had to moderate her evaluation when their remarkably diverse composition and their patriotism became undeniable.

But from her perspective Napolitano was quite right. Her vicious response, and the similar responses of many others supporting the Obama administration, gave away the game. To them and their intentions the Tea Parties are indeed subversive, about as subversive as they could be to the policies and future intent evident in this radicalized White House and its similarly ultra-left supporting Congress. It was evident that the Tea Party Express and the Washington rally had touched nerves of raw fear that this government was being exposed for what it is, and for the sociofascistic goals it has in mind.

Lighting the Fire

The Tea Party protest, so threatening to the Obama administration, reaches back to the foundations of these United States. As Mark Steyn puts it, the “intellectual heft” of the Tea Party uprising rests on the founding principles of the American nation. The wielders of illicit power are reminded that it was “We the people” who founded this country. The Tea Parties are the voice of the people crying “STOP!” to a runaway quest for power in Washington. And, as Internet news publisher and columnist Andrew Breitbart observes, the mockery, recklessness, and libel being used against the Tea Parties isn’t working. The Tea Parties and similar protests “have only gotten bigger and stronger.”

The website of the Tea Party Patriots states that the Tea Parties stand for three core principles: fiscal responsibility, constitutionally limited government, and free markets. Adherence to these principles also implies a stand for individuality, integrity, a sense of responsibility, observance of the law, honesty in government, and—dare we say so?—honor in public office. All of which the Obama administration stands against and wishes to destroy. Of course the protesters are subversive. They are a threat to unconstitutional government. Vive la Subversity! The Tea Parties are people a tyrant must seek to control or destroy—by any means necessary.

The Tea Parties have furnished stimulus to existing organizations and inspired the establishment of new efforts to rein in an out-of-control federal government. The American Majority created a new website called AfterTheTeaParty.com urging its adherents to activism, to run for elective local and state offices, and to learn more effective use of internet resources such as Twitter and Facebook. The blog Sunshine Review stimulates accountability and transparency in government. Judgepedia.org is tracking judicial activity of the nation’s 338 state Supreme Court justices, and plans to set up files on every state court of appeals judge as well. The Pelican Institute in Louisiana and the Alabama Policy Institute perform similar functions. The Tea Party Patriots was organized as an umbrella group to coordinate the more than 800 local organizations that had formed the original Tea Party movement.

The insights that stimulated founding of the Tea Parties continue to gain strength and influence. The true motives of the Obama administration in health reform and other grandiose plans have surfaced as the protests forced a degree of “transparency” the planners had promised but had never planned on delivering, and the planners have become hysterical. Secretary Napolitano’s intended home run blast at “radical subversives” was soon reduced to not much more than a leadoff single as more batters came to the plate. The Tea Partiers became “extremist mobs,” “un-American” “brownshirts” (the SEIU prefers purple shirts), “pawns of the insurance industry,” “Astroturf” (rather than grass roots), and “political terrorists.” Senate Majority Leader Harry Reid picked up the term “evilmongers.”

These and similar terms were used by Democrat House leaders Nancy Pelosi and Steny Hoyer, other Democrats in Congress, the Democratic National Committee, and Senator Dick Durbin. According to Pelosi, “An ugly campaign is underway.” A Democratic House member from New York accused Iowa Sen. Charles Grassley of “treason” for criticizing Obamacare. This sounds like something out of the old Soviet mouthpiece Pravda preparing the country for arrests and show trials. After such Democrat responses as these Pelosi and Hoyer wrote in a joint column in USA Today that opposition to their policies amounted to attacks to silence them. They added, apparently not intending the irony, that, “Drowning out opposing views is simply un-American.”

Vituperation such as this is an attempt by the revolutionary left to stifle debate on the radical measures this administration intends to adopt. They know they do not have public support when their intent is exposed and understood. Their recourse is to avoid the issues by subjecting those who disagree with them to smears, name-calling, and character assassination. It is a vendetta against citizens of this country who dare to express doubt and opposition to policies being considered by their elected representatives. As though to certify these attempts to smother democratic debate as the official policy of his administration, the President of the United States accuses the Tea Party people of “fear mongering.” He may be half right in the sense that Secretary Napolitano was right. There is plenty for the Tea Party and its supporters to fear from what is going on in Washington today. And those in charge also have plenty to fear as the vault of their hidden revolutionary intentions is pried open for all to see and understand its contents.

When the Tea Parties attended town hall meetings during the summer recess of Congress in 2009 the fight against them in some cases turned from verbal to physical. A black man protesting the President’s health plans at a Missouri meeting was physically attacked by Service Employees International Union members wearing their trademark purple shirts. One called him a “nigger.” Commentator and online columnist Andrew Breitbart reports that these “union thugs” were directed by the White House to go to such meetings and “punch back twice as hard” as they were allegedly being punched. That’s the Chicago way of governing.

The student revolts of the sixties struck a spark into tinder that was, in the mood of a substantial minority at that time, waiting for the fire of revolution. The result was to ignite a Civil War that after some four and one-half decades has culminated in a Marxist-inspired takeover in Washington. The Tea Party response is a counterstrike. It is the awakening of massive resistance against Civil War usurpation that slowly built up in the rear of its advancing armies as the revolution developed, and its intentions became more apparent.

The Tea Parties are, in one sense, a spontaneous recognition, as University of Texas professor J. Budziszewski would say, of that which “we can’t not know.” The Tea Parties grow and prosper in the face of the vilest attacks. They prevail because their protests show that what is happening to them and their country poisons the roots of their existence, and would crush the foundations of their great nation. The Tea Parties see power that would replace the rule of law. They see government of the people becoming government of a self-chosen few who think they are “the ones we have been waiting for.” The Tea Party people are sickened at the arrogance of such messianic blasphemy.

The massive chords of sympathy and support being struck by the Tea Party protests reawaken awe and respect for the miracle of the late eighteenth century: the unique revolution won after bullets and battlefields had faded away that is the American Constitution. The Tea Parties remind the nation that it was “We the People” who did ordain and establish a Constitution to replace the face of tyranny. We don’t hear much about the American Constitution from Democrat administrative or congressional Washington today. Reminders of its existence, and of its ingenious structure of limited government, shock and fill with fear those whom that great document will ultimately defeat.

Conservatives in America have been bullied into believing they should hide their faces, speak softly, and give assurance that they are as “compassionate” as the revolution claims to be. Yet the 2009 Gallup poll annual survey showed that when asked whether they considered themselves liberal or conservative, 40% said conservative, 35% said moderate, and only 17 to 20% said liberal. Similar polls report virtually identical results. Since the middle 1950s classic liberalism as described by Lionel Trilling at that time has lost its way. Over the decades liberalism has mutated into a radical revolution, and a Civil War. It ended up installing in the White House an occupant of Marxist orientation, a sociofascistic President. The disconnect between that reality and the innate conservatism of the American public, as affirmed in the Tea Party revolt, could hardly be more astonishing.

The timidity of conservatives in asserting what they purport to believe is particularly glaring among Republican members of the Congress. As a consequence, Mark Steyn points out, there is “no detectable enthusiasm” among the general public for the Republican Party as such. One reason may be, Steyn thinks, that such as John McCain, Lindsey Graham, or Orrin Hatch are likely “panting to ‘reach across the aisle’” in a bipartisan spirit. And before they know what has happened they have had their pockets picked of true conservative principles. Republicans such as these become, says Steyn in characteristic form, “the factory-produced cookie-cutter craven RINO-squish reach-across model.”

Steyn cites the tepid reaction of some of his colleagues at National Review to too strong an assertion of conservative ideas. He described some as being “sniffy” about Sarah Palin’s Facebook posting of August 7, 2009, warning that Obamacare would lead to “death panels” of bureaucrats. These panels would decide when patients might not qualify for further health care due to their insufficient “level of productivity in society.” Steyn agrees that were government health care to be fully established there would indeed be “death panels,” just as there are under Britain’s health care system. In Britain they are referred to as “NICE” panels, the nauseating acronym for the National Institute for Clinical Excellence. We can almost hear the ghost of George Orwell suggesting the slogan, “Death Is Life.”

The “sniffy” reaction to Palin’s imaginative “death panels” revelation that Steyn refers to is all too common among “conservatives” who fail to recognize the enemy for what he is. So they fail to take off the gloves and fight back. The Civil War cannot be turned back with gentle “tut-tuts” directed at ideas and behavior that threaten the existence of the American republic we have known. Steyn suggests that the entire Obamacare plan is a death panel. It will subject the body of every American to “the jurisdiction of government bureaucrats.” The coining of the term “death panels” reduces the argument over government health care to its essential and unforgettable reality. The Tea Party demonstrator whose placard read “Grandma’s Not Shovel-Ready” gets the point exactly. It’s time for the “sniffy” conservatives to follow the Tea Party lead on this and similar life and death battles against “transformative” Change.

Conservative is not a dirty word.
A Tipping Point

Over the Christmas Season of 2009-2010, with Obamacare at least half-passed by the Senate and a House bill done, relative peace reigned within the squabbling Democratic Party. The President and Congress looked forward to a Happy New Year finishing off their work of dismantling the American Republic behind the barricade of their 60 Senator filibuster proof majority. America was faced with a coup d’etat, a revolution to change the state; in this case the United States of America. The coup was to make way for a newly “transformed” order. The old order of the Constitution, the rule of law, individual liberty, and all the rest, was to be ignored or destroyed.
The Constitution strikes fear in the hearts of those who detest and abuse it. They cannot succeed so long as that document is in place and there are Americans of all races and creeds who know it is there, and are willing to defend it. The Tea Party movement understands this. Yet with the Democrats in full control of the federal government, the checks and balances of each branch of government against the others as provided for in the Constitution have become inoperative. At least one commentator has called the Tea Party the only check on government excess that remains. The malignant machine had seemed unstoppable.

Until a tsunami rolled down upon Washington from the bluest of the usurpers’ blue states, Massachusetts. In a special election to fill the seat of the late Senator Edward Kennedy, himself the bluest of the blues, Scott Brown, a Republican, had won! His main campaign theme was his promise to be the 41st Republican vote to break the filibuster-proof Democratic Senate and kill Obamacare. At the news of Brown’s election that monstrosity went into intensive care with poor prospects of survival. The President’s signature piece of legislation, for which he had fought for a year, was all but dead. The wreckage of the Massachusetts vote, as the Democrats see it, will continue to haunt them. In the meantime, with the Obama Express, packed with the rest of Obama’s transformative intentions, sidetracked for major overhaul, or even on the way to the scrap heap, there was a prospect of derailing Obama’s entire anti-American program. Unfortunately, Scott Brown’s 41st vote to kill Obamacare would have done that, and more, only if Speaker Nancy Pelosi and Senate Majority Leader Harry Reid had played by the rules. That includes the Senate rule requiring the magic 60 votes to stop a filibuster, the margin the Democrats lost when Brown’s election reduced their majority to 59. So the Democrats changed the rules, and used the now infamous “reconciliation” process, which requires only 51 votes in the Senate, to squeak Obamacare through. When sent back to the House, as required, Ms. Pelosi won a squeaker there too, by five votes.

The Daily Bell, whose Internet logo is a big black bear with a red tongue standing on his hind legs ringing a big black bell, is a publication based in Appenzell, Switzerland. Its more detached perspective away from the political battlefront of America seems to sharpen that publication’s frequent comments on America. Following the Scott Brown victory The Daily Bell editors analyzed the Tea Parties’ prominent role in his election. They describe the Tea Parties as a “process,” rather than simply a movement. They predict that the process will only continue to grow and become more powerful, will be well attuned to the Internet, and will be propelled by “the winds of economic and social discontent.” This in an environment in which the technology of mass communication “is just beginning to bite.”

The Gutenberg press began operation in 1450, making literature of all sorts widely available for the first time in history. Even so, it was a century or more before mass circulation of the documents of Western thought began to influence political and social movements. The spontaneous, and almost instantaneous, rise of the Tea Parties was possible because of the instant communication technology of the Internet. History shows, The Daily Bell contends, that movements propelled by mass communication, like the Tea Parties, once they have taken hold, can “roll forward for decades and even centuries.” Once set in motion, such movements can level previously dominant social controls exerted by narrow interests and create more opportunity for development of free societies.

The Daily Bell Newswire, in an article titled “The Fall of the House of Kennedy,” observes that Obama and the Democratic Party are caught in a trap of their own making from which they cannot escape. In the Bell’s view the Scott Brown victory was the inevitable result of a “Faustian bargain” President John F. Kennedy made with the public employee unions nearly half a century ago. The deal was for the Democrats to provide ever-increasing expenditures on wages, salaries, and related benefits for unionized government employees, at the expense of other needs of federal, state, and local governments. In return the unions would provide union money, support, and votes for the Democrats.

And that is exactly what has happened, from the federal government on down. In local jurisdictions in California such as Vallejo, north of San Francisco, over 70% of the city budget goes to active and retired members of the police and fire unions, a burden that has driven the city into bankruptcy. The many special benefits for unions tucked away in Obamacare highlight this cozy arrangement at the federal level. The enormous federal deficits, which would be further enlarged by the huge prospective costs of Obamacare, The Daily Bell points out, are in large part a result of serving JFK’s Faustian bargain.
By the end of Obama’s first year in office, anger over the huge costs of his new federal government, greatly aggravated by this preferential treatment of unions, was beginning to show. In off-year gubernatorial elections in New Jersey and Virginia, Republican candidates won despite those States’ generally Democratic leanings. Appearances by President Obama to support the Democrat candidates had no effect. Appearances by Tea Party supporters did.

The Democrats are in a state of panic because the loss of their filibuster-proof Senate majority means a Tea Party-aroused public may no longer allow them to pay off the public employee unions as their part of JFK’s Faustian bargain. The whole “machine,” as JFK himself labeled it, is badly crippled under the scrutiny of a populace with access, through the Internet, to vastly more information than was ever available through a biased establishment media. This is exactly where the millions of the Tea Party “process,” as The Daily Bell calls it, can energize public awareness of the corruption of both parties in Washington. The Faustian bargain JFK made, the Bell observes, has “smothered” government at the state and local as well as the federal level. The same bargain now threatens to smother the Democratic Party itself.

The Firestorm

When Massachusetts voters mobilized for genuine change on that momentous Tuesday in January 2010, with the Tea Parties in the front ranks getting out the vote, a tipping point of American politics was reached. The Obama plan to complete enactment of a sociofascistic agenda behind a filibuster-proof Senate majority was dealt a heavy blow by Scott Brown’s election.

But as President Obama’s last minute rescue of Obamacare, and his doggedly partisan 2010 State of the Union address attest, he is not giving up. The Civil War is not over, and the Tea Party millions are in position to wield the most deadly weapon that can be deployed against the enemy in battles to come: information. They are positioned at the inception of what The Daily Bell calls a new age of information that is “just beginning to bite.” Lies and deception are built into politics as usual in Washington. It is in large part through lies, distortion, and above all suppression of honest and accurate information that the tireless forces of Civil War have advanced to the dangerous heights they now hold.

The mainline media have willingly certified information based on those deceptions, and the ordinary American has to a great extent been deceived. The Internet is changing all that, just as it did on the day Rick Santelli’s rant about the need for a new tea party was picked up by the Drudge Report and YouTube made it into a national sensation, all in the same day. The Tea Party idea struck a chord of inchoate, unconscious, deeply felt counter-rebellion. It touched a reservoir of anger against an enemy that had been able to conceal its true intent behind a barrage of false promises, personal attacks, and misrepresentation.

Through access to the Internet, The Daily Bell predicts, more masses of people will in the near future begin to realize how they have been “lied to, impoverished and cowed” by the power elite’s “dominant social themes,” such as the “evil” of corporations or “inequality” that only government can solve. Whole populations will become more knowledgeable than ever before about how free markets are essential to sustain personal liberty. Standard liberal tirades against corporations, capitalism, and free markets will be stood on their heads by ridicule. New information, the Bell predicts, will continue to be guided by the Tea Party example, however the movement may evolve. As to its evolution the article identifies three discernible Tea Party “camps.” There are political organizers mainly steering the movement toward Republican candidates and causes; a libertarian anti-tax, pro-freedom movement; and an army of anti-government “give ’em hell” contingents, probably the largest of the three.

Newsmax magazine presents a more comprehensive picture of the vitality and reach of the growing Tea Party movement, including previously established anti-big government organizations, with varying emphasis on specific policies or programs. The Tea Party Patriots work at establishing PACs and backing political candidates. The 9-12 Project, inspired by Fox TV talk show host Glenn Beck, holds educational conventions and plans to introduce a 100-year plan for America on the National Mall in August 2010. The Tea Party Express sponsors bus trips across the nation to inform people how to be effective in establishing or reinforcing traditional American values in their communities. The Nationwide Tea Party Coalition hosts leadership conferences to identify and train prospective future leaders in the movement.
The largest of the groups now affiliated with the Tea Party phenomenon are FreedomWorks and Grassfire Nation. FreedomWorks, founded in 1984 and now led by former House Majority Leader Dick Armey, takes a pro-liberty, fiscally conservative stance, and co-sponsored the massive Tax Day Tea Party on April 15, 2010. Grassfire Nation, founded in 2001 by Steve Elliot, works with Republican precinct committeemen to return the Republican Party to its grassroots conservatives. Similar though smaller groups within the Tea Party movement include American Liberty Alliance, Tea Party Nation, and Smart Girl Politics.

So far the Tea Parties have remained cool toward the GOP, and understandably so. The problem for the GOP is to disconnect itself from the political legacy of the Democrats’ government-public employee machine. The Daily Bell sees the Scott Brown victory as opening a “rare, narrow chance” for the GOP to do just that, and to align itself with an awakened electorate “that understands its anger.”

The Republicans had a similar “rare, narrow chance” when they took control of Congress in the midterm election of 1994 under Newt Gingrich’s Contract With America. Much was accomplished, but slowly the virus of Washington fever sapped the vigor of reform. Finally George W. Bush, with his “compassionate conservatism,” abandoned a core principle of conservatism in trying to out-Democrat the Democrats by “crossing the aisle” through false bipartisanship, and by spending, spending, and spending.

Polls show that only about half of Tea Party people call themselves Republicans. That augurs well for the Tea Party process to continue to develop as an independent force. Whether or not the Republicans score a win in 2010 comparable to that of 1994, the Tea Parties will remain a strong antidote for any Republican who might be catching Washington fever. The Republican Tea Party contingent is in a strong position to ridicule the tendency of “moderate” Republicans to wander “across the aisle,” lured by a “spirit of bi-partisanship.” Republicans so inclined will be pressed to understand that accepting that invitation is to take the bait for disaster when dealing with revolutionary totalitarians.

If there is to be any future “crossing the aisle” it must be based on the principles of the Constitution and on the three core principles laid out on the website of the Tea Party Patriots: fiscal responsibility, constitutionally limited government, and free markets. Spectacles such as the McCain-Feingold Act, a result of Senator John McCain’s “bi-partisan agreement” across the aisle with Senator Russ Feingold, will not be tolerated. Among other provisions that Act regulates certain “electioneering communications” made by way of broadcast, cable, or satellite within 30 days of a primary and within 60 days of a general election. That is bald censorship, shameful and inexcusable. It is well to recall that the free speech clause was included in the First Amendment specifically and precisely to protect political speech as a vital necessity of the democratic process.

If future crossings of the aisle seem advisable, Ronald Reagan has set the pattern. The “One Way” arrow must be turned to point to a democratic right, not to a sociofascistic left. Reagan did deal with the enemy, our most dangerous enemy, the Soviet Union. He dealt wisely and firmly with the Evil Empire to avoid mutual nuclear annihilation, but only after achieving clear military superiority. It was the Soviets who “crossed the aisle” to meet Reagan, not the other way around. The Tea Parties are in position to help assure that any future crossings are in the Reagan mode.

In time, as what is happening to them seeps ever more deeply into their consciousness, the great majority of the American public (even some liberals), once informed, will become as outraged as the original Tea Party people are. The enemy’s final defeat will then be at hand. The truth that there could exist a regime as corrupt and monstrous as the one Barack Obama, Harry Reid, and Nancy Pelosi have created is hard to absorb. The American people do not want to believe their country is being stolen from under them. Most have been unaware of the long Civil War that finally installed such a misbegotten triumvirate to govern in Washington. Nevertheless Americans will insist on knowing the truth. It was the truth of the Civil War, building slowly and to a great extent unconsciously over the decades, that reached the ignition point of the firestorm the Tea Parties have now become.

The awakening of the Tea Parties by Rick Santelli’s “rant” occurred on February 19, 2009, only a month into the new President’s “honeymoon.” That timing indicates that the depth of the “transformative” change that Obama intends was even then visible and under way. A truth so incredible and ominous as this takes time to penetrate into the bones and marrow of a nation. A cup of tea helps to digest the brutal facts.

In a broader sense the Tea Parties and their associates represent an effort to pull at least the American segment of Western civilization back from the precipice of savagery. Civilization, says twentieth-century libertarian author Ayn Rand, is about working toward a society where individual privacy is recognized and supported. Every aspect of savage existence is public, ruled by the laws of the tribe. That is the condition the present Washington power structure, racing toward the fantasy of sociofascism, intends to impose. It does this by herding the nation into passionate multicultural tribes, separate from all else, anti-American, and easily controlled by those who “know” they know best. By contrast, Rand sees civilization as “the process of setting man free from men.”

The liberalism of which Lionel Trilling spoke in the 1950s, since pulverized and pressed into a totalitarian mold, rejects privacy and the individualism privacy implies. Privacy (for others) liberals detest to the depths of their being. Privacy provides a sanctuary their schemes cannot penetrate, an incubator for repudiation and rejection of their intent. Perverted liberalism aims, not to free man from men, but to subject man to men—their men and women. Ayn Rand’s understanding of privacy, written in her wildly popular 1943 novel The Fountainhead, is a timeless commentary on the timeless battle between liberty and tyranny.

The liberalism of yesterday, as it inexorably mutated from campus riots to Civil War and betrayal of America, has itself set the table for the Tea Party firestorm that now lowers upon it, massive and unrelenting. The tyranny liberals would substitute for the liberty they have betrayed must be drummed to the same graveyard of history where lie its twentieth-century predecessors.

To avoid that fate the Obama regime launches massive, vicious, and unrelenting cannonades of smears, calumny, and personal attacks to destroy the character of Tea Party people. Officials of the regime fabricate lies about Tea Party events and attempt to disrupt them. This is hardly the response of democratically elected officials respecting the constitutional right of their constituents to free speech. The Obama regime fears the people of this country. They are in a state of panic that Tea Party protests will expose them for the anti-American betrayers they are. Ironically, the drumbeat tactics the regime adopts to prevent the Tea Party from telling the truth simply affirm how alien to America, and how fascistic, their beliefs and programs are. It is the essence of any illicit regime to suppress, expunge, or annihilate its opposition. As Benito Mussolini said, “Everything within the state, nothing outside the state, nothing against the state.” The vendetta against the Tea Party underscores how much there is to fear about the nature of the Obama regime. These attacks reveal what we can expect if the truth about that regime is not exposed by forcing the “transparency” the candidate once promised but, as we see, never intended to implement.

The glorious mandate of the Tea Parties is to raise high once again the image of a vibrant, democratic, and exceptional republic; a model of hope for suppressed people everywhere, as it has always been, “from sea to shining sea.” The pattern of Venezuela or Cuba, oppressed and impoverished, does not fit the United States of America.

So, please pass the teapot and drink to the Party.
The battle cry is Liberty!
22. The Audacity of Liberty
Three Revolutions

There were three revolutions at the end of the eighteenth century, resulting in three distinct outcomes. The American Revolution of 1776 was a bloody uprising, yet limited in scope and duration by its single goal: independence from the tyrannical power of Great Britain. When that was achieved the fighting stopped, the troops went home, and the result was thirteen Colonies free to decide their own fate.

The French Revolution of 1789 also began as an effort to right the wrongs of a tyrannical power: France’s own monarchy. But the French uprising, having set no goal at which victory could be declared and the carnage stopped, became mesmerized by an idealistic vision of Liberty, Equality, Fraternity. The French Revolution disintegrated into fanaticism, hatred, butchery, and finally into the Napoleonic dictatorship. The result of the French Revolution was a prototype for the modern totalitarian state.

The third revolution of the late eighteenth century took place, not on the battlefield, but in the creative minds of men debating how the thirteen free Colonies of America should go forward. Out of their debates came novel ideas and new institutions of government: the American Constitution. The revolutionary credo of the Constitution, almost unthinkable in its time, was boldly set forth in the first three words of its first sentence: “We the people.” It was, “We the People of the United States” who did “ordain and establish this Constitution for the United States of America.”

The American Constitution is not the gift of one of Plato’s benevolent philosopher kings. Nor is it a promise of limited rights wrested from a reluctant King John, such as the Magna Carta in England in the year 1215, crucial though that document is in the growth of liberty. The legitimacy of the American Constitution is based squarely on the consent of the people whom it is to govern. That was the amazing and challenging statement thrown out to the world of the eighteenth century from the American constitutional convention at Philadelphia. The Constitution was adopted by the Philadelphia convention on September 17, 1787, ratified by the required nine States in 1788, and put into effect in 1789.

The revolutionary nature of the American Constitution was a shock to the settled Western world of kingdoms, princedoms, and royal families. A constitution made for ordinary people? Incredible. And to claim they have rights the government can’t alter or withdraw as their betters see fit? Anarchy! “We the People” indeed! Such audacity!!

The American Dream

Strangely enough, it worked. The promise of America is based not only on the Constitution, but also on the Declaration of Independence, adopted on July 4, 1776, which formalized the American break with England. There Thomas Jefferson boldly states that all men are created equal, and certifies that “they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

In the Declaration of Independence and the Constitution the United States of America laid before the eyes of common people of the world a vision of the American Dream. It is a dream built on the foundation and enabling spirit of natural law and human nature. That Dream has fired the imagination, drawn to these shores, and released the energy of ordinary people everywhere who were, as the Statue of Liberty beckons, “yearning to breathe free.”

Horace Greeley famously advised Americans of adventurous spirit, “Go west, young man, go west.” Like the “shot heard round the world” fired at Concord Bridge on April 19, 1775, which signaled the Revolutionary War to follow, Greeley’s summons was also heard round the world. Immigrants by the millions “went west” from their native lands in the Old World to find a better place in the New World of America. Later generations have come from every direction, from all over the world, to seek the same fulfillment. These are the dreamers who have come here to understand America, to be part of it, to live it, and to love it. How withered and pathetic, like emaciated corpses, those who hate America seem by contrast.

Yet the truth remains, unpleasant though it is, that to the self-styled elite in this country who mount the Civil War against America the American Dream is a nightmare. The rise of the common man, yearning for a better life for himself and his family, offends them because it displaces their preeminence as the directorate of social and political values. Those filled with hatred for America gather to lay siege to the liberating institutions that support the common man they despise. The commanders of the Civil War, augmented by the chronically discontented, attack and lay waste where they can. In much of the American establishment, in the U.N., and in many countries around the world, the commitment is not to democracy or freedom. It is not to release the energies of free men and women to create wealth, set the goals of their own lives, and become proud and self-sufficient beings. The most lethal weapon of the Civil War is to destroy the common man through seemingly compassionate social programs that kill the spirit of the individual and make of free citizens a nation of beggars dependent on government.
The Dream of “We the people” was never dreamt for these elites in the first place. Their kind had it all. America was built for those who want a shot at sharing some of it, and the opportunity to create in ever-greater abundance that which is to be shared. For those who feel the sting of elite compassion, and allow themselves to be herded into new corrals of victimhood, the American Dream is unlikely to reveal its sparkling reality.

The American Dream is not immortal. There is no law of nature that says it must exist forever. Attacks against this Dream that go unanswered are blood in the water for the circling Civil War battalions seeking to destroy and expropriate America. Weakness, cowardice, and conciliation excite in such an enemy only a more urgent appetite to strike. The angry purpose of the Civil War is to darken the glow of the American Dream and take command in the shadows. With that purpose exposed and threatened by the counter-revolution of Tea Party activists, and those they inspire, the final battle is joined.

My Country

 

What “my country” means to its inhabitants begins with its children. What they learn about their inheritance will eventually course through the whole of society and define its future.

Young minds barely beginning to form a sense of identity can be softened for later acceptance, not of American patriotism, but of a role as “citizens of the world.” Such “citizens” will never know the lost vision of the American Dream. They will never learn that a cohesive core of national identity, not some utterly undemocratic “transnational” institution, is essential to protect their freedom. “My country, ’tis of thee, sweet land of liberty, of thee I sing... from every mountainside let freedom ring” is a song no longer sung in America’s grade schools. In the public schools today students will be taught “diversity,” “multiculturalism,” and “transnationalism.” But they will hear no hint of the basis for genuine diversity, as practiced for centuries by many a proud cultural heritage within the unity of a proud nation. They will know nothing of the institutions needed to define and protect freedom.

Young minds properly “educated” can be robbed of democratic citizenship before they ever hear of such a concept. They do not know how profoundly different people who value the reward of independently guiding their own destinies are from those who turn themselves into government supplicants. They will have liberty stolen from them before they know what they have lost. Most children are already trapped in the indoctrination of public schools from K-12. The next tactic is to get them away from contamination by parental experience and authority even earlier. That requires new programs of “pre-school” holding tanks for most of the day.

Once the children are safely under state control the victorious battalions move on to eradicate the family. Hillary Clinton proclaims, “It takes a village to raise a child.” Sorry, parents won’t do. The idea of any child belonging to parents in a home must be proclaimed archaic and anti-social. The essential base in which a child can be loved, and can learn and grow, must be destroyed. The fact that the “village,” charming euphemism that it is, is in reality an intrusive, omnipresent, authoritarian government must be shrouded in a cynical concern for “the children.”

For the revolution of the sixties to cement the victory of its Civil War it must suck the juice of liberty out of the hearts and minds of the free men and women of America, if only one drop at a time. Under the tutelage of Civil War, America is becoming a nation beaten into submission by sensitivity, compassion, fear of offending, and the thousand and one other interdictions and prohibitions of the politically correct mantra. Those who wield power, now including the top levels of government, work to seduce a proud people toward the status of lost and wandering souls. The pattern is that of docile animals seeking each day their daily bread from the hands of the State, grateful for their subsistence, afraid to speak out.

Writer, economist, actor, and lawyer Ben Stein fears that freedom of speech is not just “going” in America: “It’s gone.” True free speech was designed for the debate of public issues. That concept, Stein observes in his American Spectator column “Ben Stein’s Diary,” has been perverted into licentious, corrupt, and debilitating “freedom.” Free “speech” now allows a child to see on a computer screen any extreme of violence and pornography that might be imagined. For the dictators of revolutionary political correctness this is a very useful weapon against children and adults alike. It directs attention away from their efforts to suppress and punish those who are not seduced by their global warming hysteria or the madness of government spending, but who speak instead of the horror that is building for future generations.

Past invitations to self-destruction issued by the radical left have been deflected against the moderation and common sense of the great middle-class of America. Now segments of that solid buffer of liberty have become debilitated through political correctitude, loss of a moral compass, or the bribery of government “assistance.”

In his book Who Are We? The Challenges to America’s National Identity, Samuel P. Huntington, Harvard professor of government, relates how powerful American elites have, since the 1960s, sought to “deconstruct” the American heritage. These elites strive to destroy the ideology of freedom, and to condemn traditional American culture to ruin. The insidious weapon of multiculturalism advertises equality but decries both individuality and patriotism. The multicultural snare is the basis of a new tribalism, and a major instrument of the rebel assault. The Civil War elites coerce Americans to divide themselves into sub-national identities in schools, the workplace, politics, and elsewhere. Huntington finds this device comparable to the divide-and-conquer tactics of former colonial powers.

These “post-national” pastors of destruction indoctrinate their tribal subjects to think of themselves no longer as Americans, but to embrace “transnationalism” or “democratic humanism.” That is, to give up their identity with an America they can see, taste, and feel for something that has no hard, real, or usable definition, no reality or base for identity and allegiance. These “post-national” clans and tribes are a telling signal of sociofascism well on the way. The Great Middle has been greatly weakened, and it is uncertain how many tough “country boys”— both men and women, tough in the sense of integrity, vigor, and honor—are left in the land to set it right again. But each day the Tea Parties advance, the count grows.

Military historian Victor Davis Hanson reminds us how close all human beings are to savagery “and how precious is their salvation through law, religion, science, and custom.”

Liberty is not a universal endowment. It is a rarity, almost a miracle, in human experience. It is a unique and precious opportunity that flourishes only in an identifiable political entity that recognizes and protects it. In ancient Greece it was the city state. Today it is the nation state. Freedom has survived, and is likely to survive, only where there are free and sovereign nations and corresponding national institutions designed to guarantee liberty.

Concern about pollution of the earth abounds, but the pollution that threatens this nation today lies not in stagnant rivers or dirty air. The American psyche has been contaminated from the sixties onward by an epidemic of lethal ideologies that decay thought and erode individual and family integrity. It is the poisonous refuse dumped into the environment of the mind that is the toxic and contagious weapon of the Civil War. It is when that contamination is allowed to multiply, and the mind to decay, that the elite commanders of the Civil War step forward to grasp the authoritarian opportunity in which “my country” will become their country.

Yet under the surface of rebel victories against America there was slowly accumulating the hot magma of a volcanic eruption.

 

A Moral Base

Democrat pollster Peter Hart identifies the “essential core values” of America as “family, faith, and decency.” Texas professor of philosophy J. Budziszewski echoes those core values when he insists there are basic and essential values “we can’t not know.” Hart’s core values of family, faith, and decency are a sort of earthly trinity that expresses what “we can’t not know.” The rebels seek to destroy those core values. They fear those values would reappear instinctively to the American mind should it be cleared of the counterfeit images of their revolution. The rebels of the Civil War know that if the family remains intact and healthy, and if the moral base of faith upon which the nation was founded withstands the unrelenting assaults against it, they cannot prevail. The deceitful promises of the revolution must be pressed insistently before the eyes and drummed incessantly into the ears of America to make the revolutionary charade of the rebels seem credible.

As to the “decency” that Hart includes in his core values, that issue is clear enough. There is no decency in the blind passion that drives this Civil War toward the goal of absolute power in America. The danger is that there will come a time when the deep instincts that support pollster Hart’s essential core values can no longer be sensed in the stench of a decaying social order. As Supreme Court Justice Antonin Scalia has noted, societies do not always “evolve” in positive ways, “sometimes they rot.”

The American Constitution and the Declaration of Independence embody the moral basis of the Judeo-Christian heritage. This includes a foundation of natural law, recognition of a basic human nature, and the need for enforcement of long tested standards of public conduct. And there may be a hard-wired basic moral structure in the human makeup to support that need. In an online survey, Harvard University researchers presented some 60,000 subjects with a variety of situations requiring a moral choice. One test assumes a building engineer who discovers toxic gas in a school’s forced-air heating system that would kill five children in a room if not diverted. By throwing a switch he can save the five, but would turn the gas into another room with a single student, who would be killed.

Former Wall Street Journal science writer Sharon Begley reports that the respondents elected to throw the switch without hesitation, and with no agonizing guilt about killing one child while saving five others. In similar situations, as in this one, the online survey finds that people make “very, very rapid judgments” about moral dilemmas. And there is “very little variation in what they consider permissible.” This, the study notes, is contrary to what it terms the “prevailing theory” that people live in a state of moral uncertainty and must consciously think through such questions. Perhaps there is more common sense morality around than there often seems to be.

The Harvard study suggests that at some basic level there is an innate intuitive recognition of good and evil. This is perhaps a reaffirmation of that long demeaned and overlooked element of nature, natural law. A nature that includes natural law would be a very different sort of nature from the one the anti-human Romantics or the current Holy Green Crusaders of the Civil War imagine. It would include a return to human nature. A natural law of morality may be an instrument of nature, an antidote for survival to counteract the poison of anarchy and tyranny. San Francisco longshoreman-philosopher Eric Hoffer says that the “crumbs” of every thought and act ever known or done by a human being are inherent in the makeup of every individual. He speaks a truth that is to many an unwelcome truth. That truth is that both good and evil lie in the nature of being human, and of every human being. Therein lies also the basis for knowing right and wrong, and for providing ideals and building institutions to identify and maintain right over wrong.

The Harvard study supports J. Budziszewski’s belief that it is the truth of things we “can’t not know” that is the basis of morality. That elemental knowledge makes the individual responsible to act accordingly, and makes society obligated to build institutions to reinforce individual responsibility. This is what was once commonly thought to be part of “the laws of nature and of nature’s God” as Thomas Jefferson phrased it. But societies such as the fascist horrors of the last century, and the fascist replicas of radical Islam and elsewhere today, demonstrate that an innate and natural moral structure can be overwhelmed. Morally responsible conduct cannot endure in the absence of a climate of values that recognizes what conduct is moral and what is not, what is civilized and what is not, and sets rules accordingly.

The British philosophical writer Theodore Dalrymple is not sanguine about the moral prospects of the West. In his book Not With a Bang But a Whimper: The Politics and Culture of Decline, Dalrymple observes that when a population becomes dependent on government for the needs, and even the whims, of life, it becomes “infantilized.” In this condition both the rewards for responsible behavior and the penalties for irresponsible behavior have vanished. What follows is a concept of individualism so extreme that each individual is free to invent or adopt his or her own private morality. This would allow so complete a privatization of morality that no code of conduct could be generally accepted. Dalrymple cautions that the only rule of behavior left would be that “you should do what you can get away with.” In such an environment there is no concept of right or wrong, good or bad, accomplishment or failure. And there is no peace or security.

It is exactly the moral climate, and the institutions that support the individual’s moral responsibility, at which the Civil War legions take dead aim. Too many Americans have been bludgeoned into being “non-judgmental” about the horror they see committed about them. Their conditioning to stand mute in the face of evil holds them silent, even if an inherent moral code might still whisper, “speak out,” in some inner corner of conscience. It may well be, as one observer has said, that Budziszewski “fills out the progression from the denial of natural law to the abolition of man in clear and easy-to-follow steps.” The death wish that seems to underlie what Pope John Paul II calls the anti-Western “culture of death” will then have been fulfilled.

The question is whether we choose to recognize or to reject the concept of morality; whether we agree that conduct, both moral and immoral, is something we have to deal with. Theodore Dalrymple, writing from his experience as a psychiatrist in a British prison, offers this sobering thought: “Men commit evil within the scope available to them.” In a fascist society where evil is institutionalized in the state the scope of available evil is enormously enlarged. Under fascism evil is expanded to implicate the whole of society, and the number of those caught in its net as willing perpetrators is vastly increased. Freedom and democracy depend upon the existence of morally responsible individuals as a base constituency. There must be some essential core of minds able to reason, to think clearly about, and to articulate the true conditions of a free existence.

Michael J. Behe, professor of biological science at Lehigh University and a senior fellow at the Discovery Institute, examines what it means to think. Behe sees the power to think as the basis of the human power to reason, which he terms “the greatest possible attribute of life.” So much so that the only greater talent would be “the ability to reason better.” Thought, Behe observes, is a prerequisite to understanding, an “immaterial ability,” perhaps even “something beyond nature.” His analysis also presents by implication the horror of being induced not to think, which is the aim of every totalitarian regime for its subjects.

Behe’s insights imply yet another basis for moral order. The capacity for thought and clear analysis is the truly endangered species in today’s warped environment of the mind; it is thinking itself that needs protection from extinction. The implications are clear should the Civil War’s intent to destroy the moral order based on thoughtful reason and human nature succeed. Neither the politically correct substitutes designed to fill the void of lost belief, nor cold science turned into dehumanizing rejection of the metaphysical, will suffice. Neither a politically correct regimen nor the icy realms of neuroscience pause to ask where humanity would be without the metaphysical qualities of thought, reason, morality, and all that follows.

Children of The Sixties

Yale computer scientist and author David Gelernter warns that anyone who thinks the student revolution of the sixties has run its course “should think again.” Gelernter asserts that the issues dividing America today are moral issues based on the manners, customs, and laws of American and Western experience. And perhaps, as American philosopher Russell Kirk says, at bottom they are religious issues as well. To accept those manners, customs, and laws is one choice. The other choice is a nation captivated by a mirage of perfection shimmering over the moral desert of revolution: the dark shroud of Civil War that denies the American Dream.

Gelernter terms the generation of the sixties the Cultural Revolution (CR) generation. Those revolutionists rebelled against their country, its history, its morality, and its culture. They knew the long and difficult road to the rule of law, to limitation of government power, to constitutional guarantees for citizens, and to democratic institutions. And they knew the amazing release of creative energy that flows from a free people. They rebelled anyway, and chose the shimmering ideal over the sometimes flawed reality.

The rebels of the sixties graduated from impudent intimidation of campus authority into their long revolutionary march through the institutions of American liberty. That generation, Gelernter points out, “is now in full flood and coming on strong.” The generation of the sixties set the model for its successors of the present generation. Once the sixties generation’s long march had captured command posts in the media, the universities, the schools, and throughout society, its veterans had no reason to refer to the history of the ideals and institutions they had conquered or co-opted. No one was left to teach their progeny anything about what had been rebelled against. The culture the sixties generation loathed was lost. Emptied of the history of its legacy, the new generation had nothing to oppose or to rebel against. The political philosophy of those now in charge of the country has been dragged far to the left, and their core legacy is now mainly a Marxian vision of perfection through power.

The young radicals of today have few mentors who do not, themselves, remain trapped in something very like the motto of the Free Speech Movement (FSM) at Berkeley decades ago: “Never trust anyone over thirty.” And so the model solidifies at that magic age (or even younger) in fads rather than verities, in “celebrities” rather than heroes. The legions of today’s Civil War are filled with refugees from civilization whose minds will never break 30 in their lifetimes, no matter how long their physical shells may endure. And they are claiming the pulpits, presses, TV screens, lecterns, classrooms, and offices of authority that will shape yet another new generation.

The rebels of the new generation, ignorant of the history of their own rebellion, are left to consolidate as their new absolute the only philosophy they know. That is the philosophy of revolution: the acquisition of unquestioned power. The new generation inherits, and is destined to continue, the assault of the original rebel generation against the capitalist, democratic, and moral systems of the country. The great irony is that the rebels of the new generation do not know they are rebels. As Gelernter puts it, “self-conscious leftism is replaced by unconscious leftism.” The new leftists of the present generation believe their ideas are “innocuous and mainstream—just like the New York Times.” And there has been a popular belief that, just like the New York Times, the entire country has been dragged leftward.

But that belief was shattered as the volcano bubbling beneath the victories of the Civil War erupted in the firestorm of the Tea Party process. What has been revealed is not a whole country dragged to the left. The true reality is that those who are essentially conservative believers in America are beginning to wake up. They are beginning to revisit their inheritance, and stay right where they are without being dragged anywhere. Rather, they look across a vast no man’s land at the radicals drifting ever farther toward the left cliff. To continue the power drive inherited from the sixties generation is the credo of the new Civil War generation. That is all they know. They would not understand the ancient Roman motto carved in stone over the entrance to the library at the University of Colorado in Boulder: “Who Knows Only His Own Generation Remains Always A Child.” They do not know that the United States of America is now governed by children—or that they are the children.

Legacy

The true nature of the revolution this nation now endures can no longer be denied. It is to make America a moral desert. It is to leach away the spirit and substance of humanity at the altar of a shallow yet passionate secular religion of governmental power. America is hostage to a creed whose base is no deeper or more profound than satiation of its lust to command and control other people. This is a truth that glittering oratory can no longer obscure. How does civil humanity relocate itself in such a scene? What guidance can the common sense and profound intuition of the Tea Parties offer to make our way out of this moral desert?

John Lukacs, a Hungarian-born historian and author of several dozen books, advances an interesting theory. He asserts that the universe is such as it seems to be because located at the center of it there exist “conscious and participant people who can see it, explore it, study it.” Lukacs asks whether there is meaning to anything, the universe included, if there is no conscious mind to perceive it, study it, and try to understand it. This world-view places humankind at the center of the universe. That would seem at first encounter to fit comfortably with the godless secular mentality of god-like certainty that guides the Civil War rebels.

But the human being Lukacs envisions is thoughtful, not aggressive; contemplative, not arrogant. That person would be more likely to add, with humility, that after his best efforts he cannot be sure how much of all that he finds about him he understands with any certainty. Lukacs offers the thought that his insistence on the centrality and uniqueness of human beings is not arrogance, but in fact humility. It is, he says, “a recognition of the inevitable limitations of mankind.” If Lukacs parts definitively with the arrogant certainty of revolution, he leaves the humane and inquiring spirit, shall we say the soul, wandering amongst unanswered questions.

The human seeker of truth will find much that is tested and true in science, the arts, and the humanities. Still unsatisfied, on some starry night he stands and gazes into what he calls infinity, and wonders how it all began, and where it is going. What is it that has created and set forth all that he can see, that he has learned, or has imagined? He calls to the depths of his knowledge and ability, and receives no answer. At last he speaks softly to himself: “I don’t know.” Then after a pause something wired deeply in the human spirit murmurs, “But I am going to keep on looking.” This sort of inquiry reflects the best spirit of the West, and is the basis for humility and toleration as well as determination to explore and know more.

Still, if humankind has limits the upper limits can nevertheless reach to astonishing heights. In his book Human Accomplishment Charles Murray recognizes both the limitation and the potential of humankind in the opening statement of his Introduction: “At irregular times and in scattered settings, human beings have achieved great things.” Not often, or everywhere, and not predictably. But sometimes. That is the humane and fertile interpretation of such a philosophy as Lukacs advances.

To contemplate the work of such giants as Michelangelo, Beethoven, Mozart, Shakespeare, Confucius, Sophocles, Aristotle, Newton, or Einstein is a daunting, possibly depressing experience. It is exhilarating at the same time. To visit the great works of mankind is to realize how small the contribution of one individual salted among the billions is likely to be. But even so it is to find joy in partaking of the same humanity that has soared so high. A free individual can assess these giants and say with justifiable pride of Beethoven’s Ninth Symphony or Shakespeare’s King Lear that a human being did that. Murray offers a vision of how civilization is made, absorbed, and transmitted.

Murray’s measure also applies to the evaluation of the common man, of common people everywhere. There is no way to predict in what “scattered settings” or at what “irregular times” ordinary people will rise up to create a better dream for themselves and their fellow beings. It was the insight of those who wrote the American Constitution to recognize that “We the people” contain within us enormous powers of creativity. It was their genius that created the institutions that allow all men and women to strive for a better and more beautiful existence for themselves and their families. It was in one amazing “scattered setting” that the political genius of the framers of the Constitution for the first time formed in words a society that allows the better attributes of each individual to be encouraged and released to exploit the potential that is there. Once in a while, at “irregular times,” from out of the masses left free to develop themselves, there does emerge true greatness, and much else that is beneficial as well.

In displaying the work of his few thousand, topped by only a few dozen, Murray affirms Lukacs’ view of how fragile civilization is. If our civilization is at the end of an age and in decline, as seems possible, such thinkers as Lukacs and Murray touch the keys to resurrection. The first key is a revived recognition and unapologetic reassertion of the sanctity, uniqueness, worth, and unknown potential of every human being. The second key is to form a society in which each individual is free to explore and to exploit that which is within him unfettered by oppressive authority. Finally, there is the affirmation that when we stand on that starry night and gaze into the unknown we face not only a physical unknown, but a spiritual enigma as well.

The spirit nags with questions about the origins and incredible complexity of life forms from the human on down. How did we get here? On what pattern were we formed? If the answer remains, “I don’t know,” we must remain free to ask further questions in the faith that an ordered universe will one day yield a few more of its secrets. Recall Einstein’s reply when asked how he could be sure the universe is an ordered affair that can be studied and understood: “Because God does not play dice.”

Against the creative power and potential of the American and Western experience there is now set the awful drumbeat of hatred against that same civilization. “The West revels in… a hatred of itself, which is strange and can only be considered pathological.” So wrote the German cardinal Joseph Ratzinger, shortly before he became Pope Benedict XVI. The Cardinal observed that the West no longer loves itself, its achievements, or its history. Rather, the West “now sees only what is deplorable and destructive, while it is no longer able to perceive what is great and pure.” The Cardinal’s generalization “the West” is as disturbing as the content of his remarks. The pervasiveness of the pathological hatred of which he speaks is all too true. It is this rot of self-hatred at the core of the West, identified by the man who is now Pope Benedict XVI, that threatens its survival. It is the pervasiveness of that rot in the West that his predecessor, Pope John Paul II, terms “the culture of death.”

The eighteenth-century Anglo-Irish philosopher and statesman Edmund Burke suggests that when radical departures from custom and experience are set before our eyes and pounded into our ears, we must ask questions. Does what is offered vex rather than soothe, corrupt rather than purify, debase rather than exalt, or barbarize rather than refine? Measured against that test Americans “can’t not know” that something terrible has happened to them and to their country in the decades following the uprising of the sixties. Most Americans do know, says David Gelernter, that a “catastrophic deterioration” in morals and values has occurred that threatens to destroy their children’s legacy of freedom, and to condemn them to misery.
The air we breathe has a different character, a changed odor, a cloying feel about it that it didn’t have before the rebellion of the sixties and the sanctification of the results in secular correctitude and Civil War. The Civil War militants are, by their doctrine, blinded to history and unconditioned by reality. That which is now offered and pressed upon us does not soothe, purify, exalt, or refine. Released from custom and restraint we are enticed downward on the ladder of civilization, ever closer to unleashing the beast that lurks in the depths of every human being. The beast is invited to indulge without limits the insatiable lust of sexual abandon; to dither over pollution of the earth while its mind is filled with rot; to allow science to extract from it its precious humanity; to yield up its individuality and dignity one small drop at a time until one day it is noticed that something is missing. Yet so many Americans remain afraid to speak what they know.

As noted in the previous chapter, The Daily Bell sees the Republicans as having a “rare, narrow chance” to disconnect themselves from the Faustian bargain of a Democratic course of action leading to disaster. It may well be that the American nation as a whole has the same “rare, narrow chance” to avoid descent into a revolutionary catastrophe on a global scale. The Obama administration is rushing the nation toward the status of a sociofascist “banana republic” nonentity, entangled in transnational commissions, boards, courts, and assemblies that would render America unable either to identify or to defend itself.

In a letter to his friend Thomas Jefferson, American statesman and diplomat Gouverneur Morris, stationed in France at the time, described the French Revolution as “a vast volcano. We feel it tremble, we hear it roar.” Today we feel the trembling of a different sort of volcano. The spontaneous uprising of the Tea Party millions roars its affirmation that there is vitality remaining in this country and this civilization. These millions of Americans who share the horror of what is now happening to them speak to those who are, as they themselves once were, not yet aware of the depth and reach of the revolutionary events taking place before their eyes. They are the next to share enlightenment about this new darkness; to understand that their country is being stolen from them. The hard core is perhaps untouchable. But how many, even of the twenty percent who identify themselves as liberals in national polls, really want to betray their country? The Tea Parties and those they inspire give hope that the alarm has sounded in time.

The Statue of Liberty, radiating a light seen round the world, a beacon of hope and freedom, is a terrible threat to the Civil War. She is a commanding repudiation of those who betray the liberty to which she beckons with “my lamp beside the golden door” held high in her right hand. She is a magnet attracting to her side all who “yearn to breathe free” in her great citadel of liberty. For the Civil War to prevail the promise of that magnificent Statue must be broken. Her light must be extinguished, her arm severed, her head crushed, her whole being cut into scrap, melted down, and cast into images of servitude. That feat the Civil War in its betrayal of America has not as yet accomplished; and the counter-revolution against the attempt is strong and rising.

The great Statue still presides over the legendary entrance to America, but the light of her torch flickers unevenly into the souls of those she guards. She must prevail, for if the Dream to which she beckons should be shattered, the legacy of her children may well be scratching at rubble for the lost pattern of civilization.
