Sunday, December 21, 2025

Our Athenian American democracy

America may still be a constitutional republic in name, but recently it has operated more as an unchecked Athenian-style democracy. Americans may appreciate the rich culture and the brilliant minds of classical Athens. But rarely do they learn that much of the money fueling the fifth-century-B.C. Athenian renaissance derived from tribute coerced from imperial subject states, and that Athenian democracy was inherently unstable and often quite self-destructive.

Democracy—the word itself means “people (demos) power (kratos)”—originated in late sixth-century B.C. Athens through the reforms of Cleisthenes. The Athenian popular leader transferred political power from the traditional tribal clans to the general Assembly of citizens. What followed, however, was a historic but insidious growth in power of the new citizen Assembly, which made, interpreted, and enforced laws—usually without many executive or judicial checks and balances.

America’s art of war

The Continental Army is often stereotyped as a ragtag group of irregulars. Thanks to the French, we are told, the army was sufficiently supplied, funded, and trained to defeat British professionals and win the war. Yet throughout the American Revolution, Patriots exhibited a singular ability to create their own capable military forces ex nihilo. And they ingeniously supplied and equipped local militias in addition to their eventual establishment of a Europeanized and professional army. That national legacy of military adaptation and innovation has characterized almost all subsequent American wars. Indeed, the spirited “army” that began the war at Lexington, Concord, and Bunker Hill in 1775 bore little resemblance to the multifaceted and lethal military that defeated the British at Cowpens and at Yorktown in 1781 to end the war.

Such radical evolution has also been characteristic of the later American military, which has often been faulted for entering a war unprepared, naive, overconfident, and poorly led—but never remained so for long. Despite democracy’s innate unease with military culture and standing armies, within a year of America’s entering wars, the nation’s assets of individualism, fluidity between classes, openness to experimentation and innovation, meritocracy, affinity for technological change, economic dynamism, and love of liberty have consistently resulted in rapid adaptation and eventual success over much more experienced enemies.

The American Revolution established the tradition of a flexible military that quickly evolves from a disorganized and confused peacetime militia to deadly wartime soldiers. For example, at the First Battle of Bull Run (First Manassas, July 21, 1861), the first major engagement of the Civil War, an imposing but disorganized and poorly led Union force of 35,000 raw recruits was routed and humiliated by a smaller but more spirited Confederate army. Colonel William Tecumseh Sherman, one of the few heroes that day on the losing Union side, and soon to be a general, left the flotsam and jetsam of Bull Run utterly depressed but still eager to bring about changes. He feared that the North’s armies were so disorganized, poorly equipped, and badly trained that they might never be able to invade and defeat the Confederacy, then about three-quarters the size of Europe.

Yet just over three years later, beginning in the autumn of 1864, General Sherman’s Army of the West accomplished just that, cutting a destructive swath through the heart of the Deep South from Georgia through the Carolinas before approaching Lee’s army in northern Virginia to help General Grant end the war.

The Macon Telegraph memorably described the terrifying approach of Sherman and his army through Georgia: “unsated still in his demoniac vengeance, he sweeps over the country like a simoon of destruction.” Later, on the grandstand overseeing the huge Union victory parade through Washington, D.C., in May 1865, the German ambassador remarked of Sherman’s army, as its war-torn veterans marched by: “An army like that could whip all Europe.”

When the United States declared war on Germany in April 1917, the American military was little more than a frontier constabulary of 120,000 irregulars. Yet by the armistice of November 1918, just nineteen months later, the new American Expeditionary Force had landed two million doughboys in France, with only some two hundred of them lost to enemy naval action. The U.S. economy went from essentially zero pre-war production of artillery shells to manufacturing over 50 million shells a year—the greatest per annum output of any belligerent in the entire war.

When war broke out again in Europe on September 1, 1939, the Depression-era U.S. Army was only some 170,000 soldiers strong—seventeenth in size of the world’s militaries, behind even tiny Portugal’s. By September 1945, the military had swelled to 12.2 million soldiers.

The U.S. wartime economy, recently battered by the Great Depression, was by late 1945 producing a greater gross domestic product than all of the other major belligerents put together. At war’s end, the American fleet, in both total tonnage and number of ships, was larger than the rest of the world’s major navies combined. Americans created the two most expensive, sophisticated, and deadly weapons systems of the war—the B-29 long-range heavy bomber and the Manhattan Project’s atomic bombs. And GIs with little prior military training proved themselves, in horrific battles on D-Day, at the Battle of the Bulge, and on Okinawa, among the most skilled and daring soldiers on either side.

The precedents of this extraordinary, near-frenzied American military resilience and recovery—which persisted, for most of the country’s history, without an institutionalized military culture—were established during the Revolutionary War. The first Americans quickly embraced military innovations, welcomed adaptation, and were humble enough to learn from more experienced European commanders.

They rarely saw unconventional forces as antithetical to traditional armies or as connoting lesser social status. Instead, they envisioned irregulars as complementary and ideally suited to the geography and local populations of Colonial America. The key was to draw on all manpower available without prior conventional restrictions. Washington, who believed victory would be achieved only when his orthodox infantry matched the caliber of the British army in set battles, nevertheless encouraged irregular and ad hoc militias, raiders, and guerrillas to work in tandem with his own Europeanized army.

General Nathanael Greene, perhaps second only to Washington in tactical and strategic insight, gained Washington’s confidence to form an irregular hit-and-run force of about a thousand infantry and eight hundred cavalry, to be supplemented by local militia. Greene himself avoided pitched battle with the better-equipped and larger British contingents in South Carolina. He wrote Washington, “I must make the most of a kind of partizan war.”

Washington was impressed, and quickly responded, “[I] approve of your plan for forming a flying army.” Greene’s “flying army,” soon working in tandem with Brigadier General Francis Marion (the “Swamp Fox”), operated in South Carolina, which saw more battles and engagements than any other theater of the war. Indeed, one in five American war deaths occurred in South Carolina, a state that the much-larger British forces could never completely subdue.

In the Northern theater, Ethan Allen’s Green Mountain Boys of Vermont similarly wore down British forces. They proved central in capturing Fort Ticonderoga at the critical junction of Lake Champlain and Lake George, during a now-famous surprise attack in the early-morning hours of May 10, 1775. The capture of the fort allowed General Washington to send Henry Knox to the stronghold to transfer its critical arsenal of cannon to Boston, where the captured artillery proved decisive in forcing the British to evacuate the city in 1776.

Traditional American generals such as Benedict Arnold, Horatio Gates, and Charles Lee all argued for the value of the militias and “special forces.” They sought to integrate into their regular armies these autonomous bands of infantry and cavalry, which brought their own food, owned their horses, and served without regular pay. The revolutionary impetus to use these assets was partly ingrained, given the successful prior use of militias and rangers in the earlier French and Indian War (1754–63), and partly tactical, drawing on colonials with specialized knowledge of terrain, local populations, and enemy intentions.

Later, John Adams famously noted that the keystones of the revolutionary America that had provided the new nation its stability and security were small towns, religious congregations, schools—and militias.

This early tradition of encouraging raiders to work in concert with regular forces also became a hallmark of later American militaries. In the Civil War—one of the rare major conflicts fought on American soil—both sides relied on guerrilla forces. Perhaps the most famous were, on the Union side, sometimes brutal units such as Richard R. Blazer’s “Blazer’s Scouts” and Samuel C. Means’s “Loudoun Rangers,” and, on the Confederate side, Elijah V. White’s “White’s Comanches” and “McNeill’s Rangers,” commanded by John Hanson McNeill. In Virginia, Colonel John Singleton Mosby (the “Gray Ghost”) and Turner Ashby for years tied down thousands of Union troops.

The mounted “Partisan Brigades” of “the devil” Nathan Bedford Forrest and John Hunt Morgan (sometimes called “the South’s Francis Marion”) operated across Tennessee and Virginia in conjunction with the Army of Tennessee throughout 1862 and 1863. Sherman felt that Forrest had caused more havoc to Union infantry operations than any other Confederate force.

In response, the Union fielded unorthodox forces of its own: Philip Sheridan’s devastating 1864 cavalry raids into the Shenandoah Valley of Virginia, and Sherman’s self-supporting infantry, detached from all logistical bases, with its notorious “bummers” in Georgia and the Carolinas. In the spirit of the Revolutionary War’s flying columns, they were mixtures of raiding and traditional marching, designed to war on Confederate infrastructure and property, to free slaves, and to force civilians to withhold support from Southern troops.

The use of insurgents, guerrillas, and special forces continued from World War II to Iraq and Afghanistan. The idea of an “irregular war” involving all imaginable sorts of forces, both unorthodox and conventional, sometimes behind enemy lines, targeting infrastructure and manufacturing as well as armed forces, was a direct legacy of the revolution. From the long-range penetration efforts against the Japanese in Burma by Merrill’s Marauders to the work of Green Berets organizing Hmong resistance during the Vietnam War, such special forces had their American origins in the Revolutionary War.

Just as importantly, the outmanned and outgunned American forces in the revolution were especially open to ad hoc technological innovation, keen on scientific breakthroughs, and reliant on novel military protocols—in ways unexpected by the wealthier and better established British military. While the extent of the colonials’ use of the somewhat clumsy and difficult-to-load Pennsylvania Long Rifle has been exaggerated, the novel rifled weapons nonetheless tripled the traditional hundred-yard range of the British “Brown Bess” smooth-bore musket. Soon American snipers learned to pick off British officers at the then-unheard-of distance of over three hundred yards. Such targeting of British commanders played a key role at the monumental American victory in 1777 at Saratoga, New York, a turning point that many historians cite as the catalyst that finally convinced the French to intervene on behalf of what they now saw might be the winning side.

Perhaps the most effective use of early rifles was by “Morgan’s Riflemen,” a militia led by Daniel Morgan. At the small but critical Battle of Cowpens (1781) in South Carolina, Morgan brilliantly combined a diverse force of regular troops, militia, and his rifled sharpshooters virtually to destroy the infamous “British Legion,” led by the feared and often reckless Banastre Tarleton. The crack British troops were led into a veritable American ambush of feigned retreats, multilayer defenses, sharpshooting snipers, and double envelopment.

Lacking the industrial base or scientific infrastructure of Britain, the Americans proved in many ways more innovative. General Washington was alarmed that his army suffered inordinate losses from smallpox epidemics over the first two years of the conflict. Nearly two decades before Edward Jenner’s use of safe and effective mass cowpox inoculations to prevent smallpox, Washington gambled by ordering thousands of soldiers of the Continental Army to be inoculated during the winter of 1777.

The Americans employed an early but dangerous method known as “variolation,” one that collected the pus from the lesions of infected smallpox patients and inserted the live effluvia into the skin of the healthy. The risky practice worked. Not only might the resulting immunity have saved the American army, but it also provided advantages over mostly uninoculated British adversaries. After Washington’s inoculation program, fewer than 2 percent of American soldiers were lost to subsequent smallpox epidemics.

Some entrepreneurial efforts to find new weapons and protocols were encouraged even when they proved far ahead of their time and impractical. Tradition has it that the world’s first military submarine, “The Turtle,” was designed and built in 1775 by the Yale graduate David Bushnell. In conjunction with the Continental command, Bushnell had been experimenting with ways to attach mines to harbored British warships, and the effort soon led him to explore submarine delivery systems. The result was the Turtle—a one-man, egglike capsule, built of oak with reinforced iron bands. The idea was to submerge the hand-powered ovoid shell, steer it near docked British ships, attach a fused mine, and then paddle away before the bomb went off. Allegedly the first submersible craft used in actual combat, the Turtle failed in its three attempts to place an explosive device near HMS Eagle, docked in New York Harbor.

Finally, the Continental Army proved eager to incorporate military knowledge, tactics, and experience from more experienced European militaries. Seeking to hone a European-type army, Washington welcomed in 1777 the help of the expatriate veteran Prussian officer Friedrich Wilhelm von Steuben. He had served as the aide-de-camp to Frederick the Great and was intimate with the modern Prussian army, then the most innovative and disciplined force in Europe. Von Steuben not only brought Prussian order and drill to the American army, but he also schooled it in the use of sequential fire by rank, the famous Prussian “oblique order” of rapidly marching diagonally across the battlefield to concentrate the mass of the army on a fixed spot on the flank of the enemy, and the tradition of positioning the supreme commander near the front so he could lead by example.

Von Steuben’s training manual Regulations for the Order and Discipline of the Troops of the United States (1779) was widely read by Washington’s officers, who adopted its disciplined training regimen and fire control—and it even served as a blueprint for U.S. Army handbooks up to the modern era.

In July of 1780, Jean-Baptiste Donatien de Vimeur, comte de Rochambeau, arrived in America with four thousand French troops—soon to be joined by the French fleet. But unlike some prior French commanders with far greater experience than Washington, the fifty-five-year-old Rochambeau, a veteran of the War of the Austrian Succession and the Seven Years’ War, collaborated with him as an equal. He correctly advised that the ultimate American objective was not recapturing New York but destroying the British army at Yorktown.

The better-known Marquis de Lafayette, Marie-Joseph Paul Yves Roch Gilbert du Motier, joined Washington’s staff, fought in a number of early battles, was wounded, spent the hard winter with the troops at Valley Forge, and returned to France to help raise money, arms, and French soldiers for Washington. He returned to fight in the last major battle of the war, the victory at Yorktown. Lafayette, like Rochambeau, got along well with Washington and the Americans, and like von Steuben he labored to make the American military comparable in numbers and skill to the British. Indeed, the later American institutionalization of separate infantry, cavalry, and artillery corps, the idea of a formal general staff, and the practice of breaking military divisions into regiments and battalions can be traced to French and Prussian advisors.

From the beginning of the U.S. military, foreigners found it easy to work with affable Americans, who were mostly devoid of class pretensions, titles, and hierarchies—a tradition that endures today. It is no surprise that the Americans in both World War I and II were seen as the most congenial of their allies, eager to learn from their more-experienced partners.

Indeed, this initial willingness to adopt, modify, and improve foreign military practices under allied consultation and instruction was enshrined in all later American armies. In the American Civil War, both sides had been schooled in Napoleonic practice and doctrines, which, albeit outdated, still formed the basis for their early training.

When the American Expeditionary Force arrived in France in 1917, after initial discord with the British and French over the preservation of American autonomy, it soon learned from its allies’ prior bitter experience that the machine gun and artillery, not well-trained American marksmen, were the key to victory in the trenches. The British also wisely informed the late-arriving Americans in 1942 that unescorted daylight bombing missions over Europe were recipes for disaster. The application of the British Merlin engine to the sluggish American P-51 fighter, and of the seventeen-pounder gun to the under-armed Sherman tank, made both weapons among the best in the war.

At the end of the Revolutionary War in 1783, the formal American army was trained to European standards. But it was also energized by indigenous minutemen, snipers, and special forces—all of which have parallels today. Contemporary American military mastery of machines and gadgetry also is a Revolutionary War inheritance. The French and European advisors certainly aided the Americans as part of their own self-interested efforts to weaken Britain. But they also did so in admiration of American idealism, openness, familiarity—and willingness to learn, consult, and collaborate.

Can the Dark Ages return?

Western civilization arose in 8th-century B.C. Greece. Some 1,500 city-states emerged from a murky, illiterate 400-year Dark Age. That chaos followed the utter collapse of the palatial culture of Mycenaean Greece.

But what reemerged were constitutional government, rationalism, liberty, freedom of expression, self-critique, and free markets – what we know now as the foundation of a unique Western civilization.

The Roman Republic inherited and enhanced the Greek model.

For a millennium, the Republic and subsequent Empire spread Western culture, eventually to be inseparable from Christianity.

From the Atlantic to the Persian Gulf and from the Rhine and Danube to the Sahara, there were a million square miles of safety, prosperity, progress, and science — until the collapse of the Western Roman Empire in the 5th century AD.

What followed was a second European Dark Age, roughly from 500 to 1000 AD.

Populations declined. Cities eroded. Roman roads, aqueducts, and laws crumbled.

In place of the old Roman provinces arose tribal chieftains and fiefdoms.

Whereas once Roman law had protected even rural people in remote areas, during the Dark Ages, walls and stone were the only means of keeping safe.

Finally, at the end of the 11th century, the old values and know-how of the complex world of Graeco-Roman civilization gradually reemerged.

The slow rebirth was later energized by the humanists and scientists of the Renaissance, Reformation, and eventually the 200-year European Enlightenment of the 17th and 18th centuries.

Contemporary Americans do not believe that our current civilization in the West could self-destruct a third time, followed by an impoverished and brutal Dark Age.

But what caused these prior returns to tribalism and loss of science, technology, and the rule of law?

Historians cite several causes of societal collapse — and today they are hauntingly familiar.

Like people, societies age. Complacency sets in.

The hard work and sacrifice that built the West also created wealth and leisure. Such affluence is taken for granted by later generations. What created success is eventually ignored — or even mocked.

Expenditures and consumption outpace income, production, and investment.

Child-rearing, traditional values, strong defense, love of country, religiosity, meritocracy, and empirical education fade away.

The middle class of autonomous citizens disappears. Society bifurcates between a few lords and many peasants.

Tribalism — the pre-civilizational bonds based on race, religion, or shared appearance — re-emerges.

National government fragments into regional and ethnic enclaves.

Borders disappear. Mass migrations are unchecked. The age-old bane of antisemitism reappears.

The currency inflates, losing both its value and the public’s confidence. General crassness in behavior, speech, dress, and ethics replaces prior norms.

Transportation, communications, and infrastructure all decline.

The end is near when the necessary medicine is seen as worse than the disease.

Such was life around 450 AD in Western Europe.

The contemporary West might raise similar red flags.

Fertility has dived well below 2.0 in almost every Western country.

Public debt is nearing unsustainable levels. The dollar and euro have lost much of their purchasing power.

It is more common in universities to damn than honor the gifts of the Western intellectual past.

Yet, the reading and analytical skills of average Westerners, and Americans in particular, steadily decline.

Can the general population even operate or comprehend the ever-more sophisticated machines and infrastructure that an elite group of engineers and scientists create?

The citizen loses confidence in an often corrupt elite, who will neither protect their nations’ borders nor spend sufficient money on collective defense.

The cures are scorned.

Do we dare address spiraling deficits, unsustainable debt, and corrupt bureaucracies and entitlements?

Even mention of reform is smeared as “greedy,” “racist,” “cruel,” or even “fascist” and “Nazi.”

In our times, relativism replaces absolute values in an eerie replay of the later Roman Empire.

Critical legal theory claims crimes are not really crimes.

Critical race theory postulates that all of society is guilty of insidious bias, demanding reparations in cash and preferences in admission and hiring.

Salad-bowl tribalism replaces assimilation, acculturation, and integration of the old melting pot.

For all of contemporary America’s greater wealth, leisure, and science, was it safer to walk in New York or take the subway in 1960 than it is now?

Are high school students better at math now or 70 years ago?

Were movies and television more entertaining and ennobling in 1940 or now?

Are nuclear, two-parent families the norm currently or in 1955?

We are blessed to live longer and healthier lives than ever — even as the larger society around us seems to teeter.

Yet the West historically is uniquely introspective and self-critical.

Reform and Renaissance historically are more common than descents back into the Dark Ages.

But the medicine for decline requires unity, honesty, courage, and action – virtues now in short supply on social media, amid popular culture, and among the political class.

Blocking Canada’s Silent Suicide From Creeping Into America

America’s neighbor to the north legalized euthanasia in 2016, and since then, more than 75,000 Canadians have participated in Canada’s Medical Assistance in Dying program. Canada also has no restrictions on abortion and only considers a baby a human after it passes through the birth canal. 

In 2021, Canada took its euthanasia laws a step further, allowing not only people facing a foreseeable death to end their lives, but anyone with a serious medical condition. And Canada is considering going even further in 2027, allowing those with mental illness to die by euthanasia.

Canada has become a “totalitarian wild west,” according to Liana Graham, who grew up in Canada and now works as a research assistant in domestic policy at The Heritage Foundation. 

Areas that Canada should regulate, such as abortion and physician-assisted suicide, it does not; instead, it has created stringent regulations around freedom of speech and religion, according to Graham, who says America should heed a warning from Canada.  

Eleven U.S. states and Washington, D.C., allow physician-assisted suicide, and Illinois might soon become the 12th. Canada has proven, Graham argues, that once a society begins to deny the value of all life, policy can quickly devolve into something that looks and sounds like it is straight out of a George Orwell novel.  

On this week’s episode of “Problematic Women,” Graham joins the show to discuss the ways America can keep itself from becoming Canada 2.0 and protect the value of life that was intrinsic to America’s founding. 

Also on today’s show, we wrap up the year by discussing President Donald Trump’s Wednesday night address to the nation. 

Wednesday, December 17, 2025

Silicon Valley’s moral bankruptcy

The state of California is unrecognizable now from how it appeared in the era of Governors Ronald Reagan and Pete Wilson, when it was a model of conservative governance, had an expanding population, and was the site of visionary infrastructure developments. Much of the state’s decline has been driven by the metamorphosis of its Democratic Party into a political organization focused on favoring the richest citizens, massive global corporations, and elite, prestigious universities. This strange left-wing concoction of politics, influence, and wealth is centered in the corridor of Silicon Valley, Stanford University, and San Francisco. And it is now driving the national Democratic Party, and with it the nation, steadily leftward. How, then, does it operate?

Four of America’s ten largest corporations are headquartered in Silicon Valley. Three of these are in the top five, including the world’s wealthiest company, Apple (number one, with $2.5 trillion in market capitalization), along with Alphabet (number three, with $2 trillion) and Meta (number five, with $1 trillion). By any classical definition, these companies operate as near monopolies. Alphabet, for example, controls about 84 percent of all global daily internet searches. Apple sells about half of America’s cell phones, and accounts for almost a quarter of the world’s purchases of mobile devices. Facebook has captured 65 percent of the world’s social-media market. Nvidia (number eight, with $550 billion) currently dominates 80 percent of the graphics-processing-unit market. Never has so much money been concentrated in such a small place so quickly.

Tech companies forged these dominant market positions in the usual way: by squeezing out and bankrupting smaller companies or, more commonly, by buying them out. Over a decade ago, the former Google CEO Eric Schmidt, in slightly exaggerated fashion, joked that Google, now known as Alphabet, bought one company a day. The company still owns well over two hundred major internet concerns, the vast majority of which were once start-ups or rivals. Apple has absorbed over forty large corporations. Meta (formerly Facebook) has swallowed over ninety companies.

The modern world of technology certainly has the means of effecting monopolies well beyond those of even the nineteenth century’s deregulated industrial cartels. Tech companies have created their own online stores that are the sole vendors for the applications that are often essential for consumers to access social-media sites and email servers. Consumers also need their platforms to buy services and products. It takes only a few of these gatekeeper corporations to strangle any online company they choose.

Big Tech’s monopolistic ethos is insidious. Entrepreneurs intentionally seek to found new tech companies for the sole purpose of selling out for billions of dollars to Apple, Meta, Alphabet, or Microsoft. Often they assume that their product will be eliminated simply for the purpose of removing competition.

The result is that in a matter of hours a few tech companies can demolish any targeted business or ideological competitor. An upstart like Parler, a conservative free-speech alternative to Twitter, was on a trajectory in early January 2021 to sign up twenty million new users. Yet it was virtually strangled in a single weekend by the coordination of Amazon, Apple, and Alphabet. They colluded, operating with the shared idea that the tiny Twitter rival did not appropriately censor user content in the post–January 6 climate. The posts did not suit the tastes of the left-wing Big Tech industry.

Accordingly, Alphabet kicked Parler off its Google Play Store. That exclusion made it impossible for millions of mobile-phone users to download the application. Apple banned Parler from its Apple App Store, even though it was already ranking as the number-one free application. Amazon cut Parler off entirely from its Amazon Web Services. As a result, Parler went in hours from a rising conservative answer to Big Tech’s social-media monopoly to a nonentity.

Silicon Valley has so far been able to avoid antitrust legislation by its now-well-known political two-step. When conservatives and Republicans are in power and considering antitrust enforcement, Big Tech boasts of its all-American, free-market success. It brags that, if given free rein, it will easily outpace our foreign competitors. It poses as the best modern capitalist representative of the tradition of can-do Yankee entrepreneurialism.

Yet when Democrats and leftists achieve control of the White House or Congress, tech lords use a different tactic for achieving exemption. With a wink and a nod, Silicon Valley reminds the kindred Left that its monopoly-driven riches are inordinately put in service of Democratic candidates and their left-wing causes. And indeed they are.

For example, of the seventeen U.S. tech companies valued at $100 billion or more, 98 percent of their aggregate donations are directed to Democrats. Dustin Moskovitz, a cofounder of Facebook and worth a reported $11 billion, gave Hillary Clinton $20 million in 2016 and Joe Biden $24 million in 2020. Karla Jurvetson, the former spouse of the tech mogul Steve Jurvetson (SpaceX, Tesla), sent some $27 million to Elizabeth Warren, Barack Obama, Hillary Clinton, and Joe Biden. Reid Hoffman (LinkedIn) pledged nearly $5 million to stop Donald Trump in 2020. Some fifteen Silicon Valley rich people sent more than $120 million to left-wing candidates between 2018 and 2020 alone.

Yet Silicon Valley, along with Big Tech in general, does not just monopolize the market and pour hundreds of millions of dollars into the Democratic Party. Its monopolistic control over social media also allows it to use shadow banning and account cancellation to prune away the online expression of millions of conservatives deemed unhelpful to the progressive project.

Once Facebook and Twitter kicked Donald Trump and many of his supporters off their social-media platforms (but not the Taliban or the Iranian theocracy), there was a stampede of other social-media entities such as Snapchat and TikTok to do likewise. Recently, Microsoft’s advertising subsidiary Xandr, under pressure, finally promised that it would cease blocking sites deemed conservative from earning their advertising revenues. Xandr had used a third-party leftist “disinformation” service, the “Global Disinformation Index”—a United Kingdom–based organization affiliated with two other U.S. nonprofit groups—to bankrupt conservative websites, in efforts to censor their supposedly improper expression.

Often social media’s partisan suppression of First Amendment rights serves at the beck and call of government agencies that are otherwise barred from violating the Bill of Rights. Recently, among the so-called Twitter Files, a trove of internal corporate communications released by Twitter’s new owner, Elon Musk, the FBI popped up as having hired Twitter for a sum of over $3 million to suppress unwanted sites and individuals supposedly spreading “disinformation.”

The FBI has served as a sort of social-media doorkeeper for additional censorious government agencies such as the Department of Defense, the Centers for Disease Control, and the CIA. They all wished to pass through to Twitter their respective enemies lists—those deemed worthy of censorship or banning. Even Congressman Adam Schiff (D-CA) sought to use Twitter to silence his media critics. In the case of the CIA, it disguised its improper role by adopting the euphemism OGA (“Other Government Agency”) in its dealings with Twitter. It is illegal for the agency to surveil domestically, much less suppress the expression of U.S. citizens, but of course those rules don’t apply when political enemies are the target.

The operative progressive vocabulary was not “censorship” and “suppression” but rather fighting “disinformation.” Translated, that meant accusing political opponents of either engaging with “the Russians” or simply peddling “conspiracy theories” in their ignorance. Take, for example, the recently revealed project known as Hamilton 68, whose board of advisors includes Stanford professors and fellows as well as Bill Kristol, John Podesta, and an array of former government officials. Hamilton 68 sought to smother 644 accounts purportedly tied “to Russian influence activities online.” Yet even that ruse proved too much for Twitter’s notorious auditors. In the words of Twitter’s former chief censor, Yoel Roth, the Hamilton 68 list was “bullshit.” In this incestuous relationship, the Washington bipartisan establishment, at best, goes easy on Silicon Valley, and, at worst, hires it as a third-party contractor to suppress individual citizens’ First Amendment rights.

The state actors also develop expectations that lucrative sinecures await them after government service. Indeed, Silicon Valley has devolved into a revolving-door landing pad for former government officials seeking a profitable retirement or breathing space between government appointments. Some twelve high-ranking FBI officials who worked closely with Twitter employees to supply the names and accounts of perceived political enemies of the Biden administration eventually found high-paying jobs with the social-media concern. The former FBI general counsel James Baker is now facing a storm of criticism and congressional inquiries about his agency’s role in seeding the mythical Steele dossier to the media and within government. In 2018, he simply retired and rotated into a role as head legal counsel at Twitter. Once there, he garnered a reported $8 million a year in total compensation. That was likely fifty times what he earned per annum at the FBI.

The array of liberal luminaries ending up in Silicon Valley is quite astonishing. Big Tech invests in them either on the surety that they will again be useful when they return to the administrative state, or as compensation for their past ideological service, or simply for their celebrity left-wing bona fides. Indeed, Silicon Valley does for Democratic grandees what General Dynamics, Lockheed Martin, Northrop Grumman, Raytheon, and other defense contractors do for retiring three- and four-star generals and admirals who join their lobbying teams or become board members, on the premise that their contacts in the Pentagon will help win lucrative arms contracts.

Ryan Metcalf, a former senior analyst in the Obama White House, on leaving government gravitated immediately to PayPal. Nick Shapiro, the deputy chief of staff at the CIA, landed at Airbnb in “crisis communications.” The top Obama advisor David Plouffe went to Uber. Lisa Jackson, the Environmental Protection Agency head, found her way to Apple. Attorney General Eric Holder also ended up working for Airbnb. The Facebook executive Sheryl Sandberg was the former chief of staff to Treasury Secretary Larry Summers. Michelle and Barack Obama in their last year in the White House negotiated a deal reportedly worth $100 million to work as “content creators” for the Los Gatos–based Netflix.

Yet the left-wing–tech fusion involves more than just freedom from regulation, huge political donations to Democratic candidates, and cushy sinecures. More disturbing are the gargantuan cash infusions from Silicon Valley into “nonprofits” and legal firms that seek to alter the way Americans vote. In a now-notorious 2021 essay in Time, Molly Ball boasted of the way in which Silicon Valley had sought to brand its efforts to alter the 2020 balloting process as being in the service of “election integrity.” In truth, Silicon Valley’s money and expertise ensured that nearly 70 percent of the balloting in many states did not occur on Election Day. Yet the ensuing flood of early and mail-in ballots somehow resulted in a reduction of the usual rejection rate for improper, incomplete, or illegal ballots.

Ball praised the effort as a salutary “cabal” and “conspiracy.” Indeed, she bragged that her essay was “the inside story of the conspiracy to save the 2020 election.” This noble complot was “unfolding behind the scenes” and “touched every aspect of the election.” It “got states to change voting systems and laws and helped secure hundreds of millions in public and private funding.”

Ball turned giddy when detailing how Democratic operatives “successfully pressured social media companies to take a harder line against disinformation and used data-driven strategies to fight viral smears.” Again, note that “disinformation” and “smears” are the usual euphemisms for incorrect expression deemed worthy of censorship.

Ball noted that one Laura Quinn, a progressive founder of something called Catalist and former deputy chief of staff for Al Gore, organized a “secret project,” which “she has never before publicly discussed.” It

tracked disinformation online and tried to figure out how to combat it. One component was tracking dangerous lies that might otherwise spread unnoticed. Researchers then provided information to campaigners or the media to track down the sources and expose them.

Ball gleefully details how leftist-funded groups smeared social-media expression as “disinformation” so as to curtail narratives they found unhelpful to their electoral agendas. For that end, Ball notes that activists like Quinn agreed that the “solution was to pressure platforms to enforce their rules, both by removing content or accounts that spread disinformation and by more aggressively policing it in the first place.”

And it worked. Silicon Valley indeed took a “harder line,” although one asymmetrically focused on conservatives. After a meeting of activists with Facebook’s Mark Zuckerberg and Twitter’s Jack Dorsey, one participant said it

took pushing, urging, conversations, brainstorming, all of that to get to a place where we ended up with more rigorous rules and enforcement. . . . It was a struggle, but we got to the point where they understood the problem. Was it enough? Probably not. Was it later than we wanted? Yes. But it was really important, given the level of official disinformation, that they had those rules in place and were tagging things and taking them down.

The most famous result of their “struggle” was to censor the truth about Hunter Biden’s laptop, on grounds that it was “Russian disinformation.” One poll taken after the election suggested such suppression may have swayed 2020 presidential voters. The New Jersey–based Technometrica Institute of Policy and Politics reported that nearly 80 percent of Americans who knew of the Hunter Biden laptop scandal prior to the 2020 presidential election believe that “truthful” coverage would have changed the outcome of the election.

As part of this effort to alter pre-election expression and the traditional way Americans vote, Zuckerberg injected over $400 million into key voting precincts in 2020. Ostensibly, his nonprofit recipients “just wanted to help” overworked registrars. In fact, they often coordinated their own work with left-wing poll workers paid for by groups that had received his enormous donations.

Silicon Valley has been successful in using its clout and money to mask its interference in American elections, its fusion with governmental agencies, and its suppression of First Amendment rights by adopting its own Orwellian vocabulary of “protecting election integrity,” combating “voter suppression,” and fighting “hate speech,” “misinformation,” and “disinformation.” These efforts seem more redolent of 1920s Chicago politics than the hipster image of techie Silicon Valley.

Yet nearby Stanford University helps to offer Big Tech a patina of academic credibility for its election meddling. It often does so with joint projects bearing utopian titles such as the “Stanford Internet Observatory” (directed by the former Facebook security head Alex Stamos), which is joined to the “Election Integrity Partnership,” supposedly to bring together “misinformation researchers.” And thus we come to the second leg of the powerful troika of Silicon Valley, Stanford, and San Francisco Democratic Party politics.

Stanford University often reminds the nation that it birthed Silicon Valley’s high-tech miracle. Such boasts are certainly credible. The legacy of combining cutting-edge university research with risk-taking entrepreneurship was the idea of the Stanford engineering professor, dean, and provost Frederick Terman (1900–82). His visions fueled the landmark careers of two notable students, William Hewlett and David Packard, and birthed university-sponsored and spin-off tech enterprises such as the Stanford Research Institute, the Stanford Office of Technology Licensing, and the Stanford Industrial Park.

That partnership continues today. Stanford graduates have established more start-up companies, raised more Silicon Valley capital investment, and led more tech companies than any other university’s graduates in the United States over the past fourteen years. A recent study of one hundred and fifty of Silicon Valley’s largest publicly traded corporations found that a fifth of all the chairmen of their boards of directors were Stanford alumni.

Until recently, Stanford contented itself with offering a steady stream of engineering and computer-science graduates and future financiers and corporate managers to the adjoining high-tech companies. The public face of this campus–corporate merger is not just the legions of current Stanford professors and administrators who enjoy a number of lucrative Silicon Valley board memberships. Grandees from neighboring tech companies, venture-capital firms, and investment companies also serve on Stanford’s own array of university boards and advisory committees.

Yet lately Stanford’s role has expanded from jump-starting and fueling the growth of Silicon Valley to adding a prestigious intellectual gilding to the raw money, power, and excess of a tech world that has grown to a market capitalization worth some $9 trillion. Within just a fifteen-mile radius of the campus hub are the spokes of major tech headquarters in Palo Alto, Menlo Park, Mountain View, Cupertino, and Sunnyvale.

The university’s complex and layered associations with Silicon Valley also goad the companies into promoting university-prompted commitments to woke social priorities. The left-wing campus culture that then permeates the tech companies masks the greatest concentration of wealth in the history of civilization. The osmosis, now characteristic of the Democratic Party in general, can appear bizarre. Tech billionaires in faded jeans, tie-dye T-shirts, and flip-flops stroll on University Avenue. They are surrounded by the Audis, BMWs, Mercedeses, Lexuses, and Teslas—a few with BLM stickers—that crowd Stanford’s student parking lots.

Yet in the last decade, the Stanford pipeline to Silicon Valley has been turning malodorous. In part, the unease is due to the precipitous decline of the university’s reputation after a series of scandals. Stanford’s long-serving president Marc Tessier-Lavigne—an accomplished neuroscientist, multimillionaire, and former executive vice president of the San Francisco–based bioengineering firm Genentech and cofounder of Denali Therapeutics, also of San Francisco—is reportedly under investigation for prior scholarly misconduct.

The accusations involve a group of papers that Tessier-Lavigne coauthored that were accompanied by supposedly enhanced and misleading illustrations. Or, as the left-wing Stanford Daily put it:

The European Molecular Biology Organization (EMBO) Journal was reviewing a paper co-authored by Tessier-Lavigne for alleged scientific misconduct. Science misconduct investigator Elisabeth Bik and other experts also identified “serious problems” in three other papers, including two where the president was the lead author.

Like the former Stanford president John L. Hennessy, who now serves as chairman of the board of Alphabet, Tessier-Lavigne is active in corporate governance, currently serving on the boards of Denali Therapeutics, Regeneron Pharmaceuticals, and Agios Pharmaceuticals. Before his presidential tenure, he was on the Pfizer board.

Stanford’s undergraduate admissions also continue to be a source of controversy after the 2019 scandal in which it was revealed that several wealthy families had paid millions of dollars to gain their children’s acceptance by having them recruited as “athletes” through the university’s sailing program. Stanford’s current website boasts that, with its reparatory admissions policy, its incoming freshman class of 2026 has only 22 percent American whites (who account for some 67–70 percent of the U.S. population). The makeup is 29 percent Asian American, 17 percent Hispanic, 7 percent black, 10 percent “multiracial” (likely Asian-white), and 13 percent “International.” The black admission rate of 7 percent is identical to that of Harvard, Yale, and Princeton, suggesting a consensus among these schools. Stanford’s Asian and Hispanic admission rates are roughly twice those of the schools in the Ivy League, probably reflecting the demographic differences between the two coasts.

Meanwhile, Stanford’s graduate engineering program, perhaps the best in the country, has the following demographic mix: 28.5 percent American white, 17.2 percent Asian American, 1.8 percent black, 7.8 percent Hispanic, 6.5 percent multiracial, and a whopping 38.2 percent international. The engineering graduate school supplies Silicon Valley with its talent on demand. And, so far, Stanford has not applied much of its woke diversity, equity, and inclusion quota formulas to its graduate programs in Silicon Valley–related fields. Tech moguls still rely on the university to supply them with qualified Ph.D.s, and, in turn, they want to send their kids to Stanford’s undergraduate school.

So Stanford, along with other schools, recently made the SAT and ACT optional for undergraduate admissions, giving the admissions office complete freedom to use its own subjective criteria—a freedom that it had not enjoyed since the 1950s. The university will not disclose how many of those who were accepted chose not to take the now-optional standardized tests as part of the application process. Yet Stanford strangely is less reticent in boasting that it now rejects 96 percent of all applicants. In the past it has publicized that it has rejected 60 to 70 percent of the 1 percent of SAT takers who received a perfect score and applied to Stanford.

A cynic might conclude that the university has good reason not to disclose how many admitted applicants who were willing to take and submit the SAT scored poorly. And it now attempts to square the circle of lax admittance standards and high rejection rates by bragging that even a perfect test score usually won’t get one into Stanford—at least for some students.

The basic truth is that Stanford uses the prestige of its research graduate schools to “monetize” its undergraduate program into a way to reward its powerful friends in corporate America, especially Silicon Valley, and politics, especially Democratic Party politics. Stanford’s engineering graduate school sends its product to Silicon Valley, Silicon Valley sends money to Democratic politicians who protect Stanford and Silicon Valley, and Stanford admits the children of tech moguls and politicians to its undergraduate school. And as a result, members of the white working classes, who may be superbly qualified but without such connections or money, are de facto barred from admission.

Silicon Valley’s trillions of dollars in capitalization may help explain Stanford’s huge endowment of $37 billion, the third-largest in the country (after Harvard and Yale). Yet such staggering sums in such a small radius have also contributed to an epidemic of financial and moral corruption that has brought disrepute to the university and its liaison with Silicon Valley.

Elizabeth Holmes—the charismatic and savvy founder and CEO of the now-bankrupt and disbanded Theranos—was recently sentenced to eleven years in prison for defrauding investors of hundreds of millions of dollars. Holmes applied for her first patent as a Stanford sophomore. And while she soon dropped out to found Theranos, Holmes remained a frequent visitor to campus.

For a time, that association helped fuel media puff pieces about the then-youngest female billionaire in the world. The twentysomething Holmes usually appeared on campus in a Steve Jobs–esque all-black getup, courting Stanford-affiliated luminaries. In the end, she successfully charmed the campus–corporate nexus into steering billions of dollars of investment into her mobile, miniaturized blood-testing device, the “Edison.” Holmes kept insisting, without any proof, that the Edison would revolutionize blood testing by using automated and miniaturized kits requiring only microscopic volumes of blood.

But Holmes proved no Wizard of Palo Alto. Instead, she and her co-conspirator and paramour, the Theranos COO Sunny Balwani, realized early on that their Edison was a bust and a fraud. Yet for years the two still engineered a complex con by altering data, suppressing incriminating internal reviews, and misleading investors. Before its utter collapse, Theranos had reached nearly $9 billion in capitalization and lured in some of the great investing families in the nation—the Waltons, the DeVoses, and the Murdochs. They ended up losing hundreds of millions of dollars. Balwani, like Holmes, is appealing his own long prison sentence.

How did the medically ignorant Holmes and Balwani pull off such a brazen scheme among the supposedly savviest investors on the planet? It surely helped that Holmes charmed her Stanford community, especially the renowned George Shultz of the Hoover Institution. The lauded statesman and economist brought to her board an all-star cast of Stanford and Hoover luminaries. None of them, however, had much experience in medicine or technology, and few had any in corporate governance. But it was likely this façade of Stanford legitimacy that explains why so much money was poured into such a patently suspect idea.

Sam Bankman-Fried easily outdid Holmes. He is the architect of what is likely the greatest financial scandal in U.S. history. His net worth peaked at $26 billion, before the collapse of his house-of-cards FTX cryptocurrency scam. Bankman-Fried grew up on the Stanford campus, the son of two noted Stanford Law professors, Joseph Bankman and Barbara Fried. Caroline Ellison—Bankman-Fried’s erstwhile partner, occasional girlfriend, and the CEO of FTX’s sister investment company Alameda—was a Stanford math major. She was also the scion of two professor parents, both at MIT, where Bankman-Fried did his undergraduate work. Ellison has now pleaded guilty to various counts of fraud. And in exchange for a lighter sentence, she is working with federal prosecutors as they build their case against Bankman-Fried. He is currently confined to his parents’ home on the Stanford campus, while on bail awaiting trial.

Bankman-Fried purportedly stayed out of jail by posting a $250 million bond. But, in fact, he and his family and friends put up very little money, although they were helped by two Stanfordites, the former law school dean Larry Kramer and Andreas Paepcke, a Stanford senior research scientist who signed a $500,000 guarantee. Somehow that was enough to satisfy the government’s quarter-billion-dollar bond.

Stanford Law School itself has recently suffered a series of public-relations embarrassments. In early 2023, an ethics complaint was filed against Professor Michele Dauber for posting a series of ad hominem attacks on Camille Vasquez, Johnny Depp’s recent attorney. The Stanford law professor tweeted that Vasquez was a “Pick Me Girl” for defending the actor against sexual-assault allegations. The ethics complaint further alleged that Dauber posted death threats against Depp by fantasizing about the actor’s murder and hoping that his corpse would be devoured by rats.

Another Stanford law professor, Pamela Karlan, in testimony before the House Judiciary Committee’s hearing on the first impeachment of President Trump, out of the blue gratuitously attacked the president’s thirteen-year-old son, Barron Trump: “The Constitution says there can be no titles of nobility, so while the president can name his son Barron, he can’t make him a baron.” In 2021, a graduating Stanford law student sent the entire law school student body a fake invitation, appearing as if it were sent from the school’s small conservative Federalist Society. It read in part:

The Stanford Federalist Society presents: The Originalist Case for Inciting Insurrection. . . . Riot information will be emailed the morning of the event. . . . Violent insurrection, also known as doing a coup, is a classical system of installing a government. . . . Although widely believed to conflict in every way with the rule of law, violent insurrection can be an effective approach to upholding the principle of limited government.

In mid-March, the Federalist Society invited Fifth Circuit Court of Appeals judge Kyle Duncan to speak at the law school (as discussed in “Notes & Comments” in The New Criterion of April 2023). The judge was not allowed to finish his lecture. Law-school students drowned him out. They flashed obscene placards in his face. Some gave their pseudo-radical game away when the self-important protestors mocked Duncan as supposedly lacking the qualifications to be admitted to their own Stanford Law School. And then, mission accomplished, they smugly stomped out.

When an exasperated Duncan called out for a university administrator to restore calm, the judge’s podium was instead arrogated by the associate dean for diversity, equity, and inclusion, Tirien Steinbach. She then gave her own pre-planned and scripted lecture, expressing empathy with the disrupters. Steinbach asked the startled judge whether it was even worth supporting his free-speech rights, given that he and his views were deemed abhorrent by the new absolutist Stanford community. Stranger still, one Stanford administrator urged any hurt Federalist Society member to contact Dean Steinbach for consolation. And when Dean Jenny S. Martinez finally offered Judge Duncan an apology, her class was disrupted by her own furious law students. Martinez wrote several apologies for the debacle, calling it inconsistent with the law school’s commitment to freedom of speech. But never once did she suggest that any of the rowdy heckling students would face consequences.

Despite these contretemps, the Stanford brand continues to work for those privileged enough to bear it. It did for Bankman-Fried and Ellison what it had done for the similarly young Holmes. It helped to lure investors into handing over millions of dollars to a near adolescent who had allegedly mastered the esoteric world of cryptocurrency but otherwise simply shuffled money around investor accounts, always desperate to draw in more new investment capital from the naive than was being withdrawn by the increasingly skeptical.

Bankman-Fried’s togs differed from those of Holmes. His look was more affected Stanford slob—cut-off jeans, flip-flop sandals, baggy T-shirt, and wild hair—and was intended to lend an air of Einsteinian mad genius to his otherwise age-old shell game. sbf, as he is sometimes known, currently faces a possible one hundred years in prison for multiple counts of campaign-finance violations, money laundering, and wire and commodities fraud.

How did Bankman-Fried, like Holmes, for so long elude scrutiny? He too hit all the right Stanford, tech, and political buttons. Holmes hosted a Hillary Clinton fundraiser, while sbf lavishly donated millions of dollars to Democratic candidates—and, more importantly, promised hundreds more millions to come while touting his left-wing credentials and Stanford-parents pedigree.

His mother, Barbara Fried, is a dark-money bundler of Silicon Valley millions and the founder of “Mind the Gap.” That effort promised the Silicon Valley rich both anonymity and an approved list of needy progressive candidates and causes. Her husband, Joseph Bankman, is a well-known progressive tax-law expert and a frequent consultant to Democratic lawmakers like Elizabeth Warren. In the end, both Stanford professors are facing federal investigations concerning purchases of property in the Bahamas, specifically their joint ownership of a $16.4 million home that was apparently transferred to their names by their son, his company, or both.

The trillions of dollars and hundreds of tech companies in Silicon Valley have long drawn Chinese-government-affiliated firms to the region, and thus inevitably to Stanford. In July 2020, a Stanford visiting neurology researcher named Chen Song was arrested for not disclosing that she had been an agent of China’s People’s Liberation Army. Song had managed to become a temporary faculty member at Stanford, tasked with engaging in military espionage. As far back as 2014, Stanford had come under pressure to shut down its Confucius Institute, funded indirectly by the Chinese Communist Party and allegedly monitoring Chinese student activities on campus.

Nonetheless, such laxity in campus security made little impression on the university faculty, given that in September 2021 some 177 professors and researchers signed a petition to the Department of Justice demanding it stop investigating potential Chinese spies at U.S. universities. A year earlier, Stanford had been investigated by the Department of Education for some $64 million in alleged Chinese-affiliated donations over a decade, all from previously unidentified and anonymous Chinese donors, most of them believed to be government-associated. Song eventually had all charges dropped, was reissued a passport, and silently returned to China.

Sometimes, however, the campus’s left-wing politics go too far, even for Stanford, causing as much scandal as those infusions of Chinese money did. Recently, the university quietly took down from a school-affiliated website the embarrassing “Elimination of Harmful Language Initiative” list of taboo vocabulary “compiled by a group of administrators working over some eighteen months.” In Orwellian fashion, they boasted of an effort to excise “harmful” words from campus usage:

The goal of the Elimination of Harmful Language Initiative is to eliminate many forms of harmful language, including racist, violent, and biased (e.g., disability bias, ethnic bias, ethnic slurs, gender bias, implicit bias, sexual bias) language in Stanford websites and code.

The list included words such as “immigrant,” “citizen,” and “American.” Yet Stanford acted to disassociate itself from the list only after The Wall Street Journal mocked the campus thought police. The embarrassing wsj exposé ended with a Parthian shot by implying such nonsense was the logical result of a bloated staff full of idle woke administrators:

We can’t imagine what’s next, except that it will surely involve more make-work for more administrators, whose proliferation has driven much of the rise in college tuition and student debt. For 16,937 students, Stanford lists 2,288 faculty and 15,750 administrative staff.

As a footnote to the vocabulary embarrassment, the Elimination of Harmful Language Initiative also included ways to offer Stanford snitches “financial rewards for finding/reporting” any who violated the provisions of the list. The language policing was known in Stanfordspeak as “The Protected Identity Harm (pih) Reporting” system.

The most egregious faculty and administrative embarrassment, however, involved Stanford’s response to the high-profile commentaries of three of its most renowned immunologists, epidemiologists, and public-health experts, Drs. Scott Atlas, Jay Bhattacharya, and John Ioannidis. Very early in the covid epidemic—whose onset resulted in government lockdowns, mandates, and quarantines—all three questioned the wisdom of the policies adopted by local officials and President Trump. They kept up that criticism when the administration changed but America’s covid policies stayed much the same.

The three experts cited various scientific data suggesting that the quarantines would not stop the pandemic. They argued that natural immunity was as effective as or superior to acquired protection via vaccination, and that those under eighteen were at little risk of serious infection but that young men under forty might be inordinately at risk for side effects from the mrna vaccinations. Medical therapies and isolation protocols, they therefore urged, should be focused primarily on the most vulnerable, those over sixty years old. They wrote that the vaccinations would not prove to offer absolute defense against either infection or infectiousness. And most importantly and controversially, the three were not shy in insisting that the government-shutdown reaction to covid would do far more damage, and eventually might kill far more people, than the covid-19 virus itself—through increased suicides, spousal and familial violence, drug and alcohol abuse, economic recession, and the deprivation of the nation’s youth of two critical years of schooling.

Their voices often resonated in the conservative and libertarian communities. President Trump eventually brought in Dr. Atlas to offer independent assessments of the lockdowns. He was often at odds with Drs. Anthony Fauci and Deborah Birx of the White House task force, who insisted on closing the schools and shutting down much of the economy, all while pushing mandatory vaccinations on the entire population.

Rather than being honored that the nation’s three most prominent researchers of the quarantines were Stanford-based, the university attacked them. A faculty senate resolution and a petition from ninety-eight members of the medical school faculty impugned the integrity of Atlas, most likely because he had become a presidential health advisor to Trump. Indeed, Stanford’s faculty senate went on record condemning Atlas for various tweets blasting state and federal lockdowns and the damage those restrictions caused without measurably stopping the spread or lethality of the virus.

University President Tessier-Lavigne, along with the provost and the dean of the medical school, criticized Atlas’s skepticism of then-current federal policies. Indeed, the medical-faculty petition alleged that Atlas had promoted “falsehoods and misrepresentations of science.” And the signatories accused Atlas of seeking to “undermine public health authorities and the credible science that guides effective public health policy.”

Only when Atlas threatened to sue the university faculty for defamation did the petitions and faculty senate resolutions cease. In retrospect, the consensus of the now “credible science” has found that the government’s response to the pandemic indeed did more damage—just as Atlas had warned—than the virus itself. As the three Stanford doctors had stated repeatedly, natural immunity did prove as effective as, or more effective than, the vaccinations in warding off serious covid illness and general infectiousness. And they were presciently aware that the national shutdown of schools and businesses would do historic damage that will resonate for decades, well beyond the toll of covid.

The final leg of California’s model triangle of power is political. More specifically, Bay Area progressive politicians both protect and benefit from the money, influence, and prestige of the Silicon Valley tech industry and its veneer of university high culture. Their emergence over the last three decades ended the hold on state politics that southern California had enjoyed in the eras of the conservative governors Ronald Reagan, George Deukmejian, and Pete Wilson, all from the Los Angeles area.

More astonishingly, by the end of 2022, no single American city had recently produced more powerful and influential leaders than San Francisco. All were fueled by the rise of the Bay Area–Silicon Valley tech and financial corridor, and in turn drove the hard-left trajectory of the national Democratic Party.

San Francisco left-wing paragons include the recent Speaker of the House Nancy Pelosi (D-CA) and Vice President Kamala Harris. The San Francisco–based Dianne Feinstein is the third-longest-serving senator and a former chairwoman of the Senate intelligence and judiciary committees. Gavin Newsom, the former San Francisco mayor and current California governor, is a likely 2024 Democratic presidential candidate. The former Bay Area resident Barbara Boxer was a U.S. senator and congressional representative for thirty-four years. Jerry Brown, a four-term California governor, was also a four-time Democratic presidential candidate. The San Franciscan Willie Brown, the mentor of Vice President Harris, was a thirty-year veteran and former speaker of the California assembly and a two-term mayor of San Francisco.

In other words, the Bay Area has birthed the second- and third-most-powerful officials in the U.S. government in Harris and Pelosi, and both of California’s senators during the last thirty years. If Newsom runs for president, he will be the third recent Bay Area politician to do so. These nationally known politicos have collectively served six gubernatorial terms, and Willie Brown, Dianne Feinstein, Gavin Newsom, and Jerry Brown spent an aggregate thirty-three years as mayors of San Francisco or Oakland. Many have Stanford ties. Feinstein is a Stanford graduate. Harris’s father was a longtime Stanford professor. Newsom’s father graduated from Stanford Law School, and Jerry Brown’s sister Kathleen was a Stanford undergraduate.

Yet their prominence offers a paradox of sorts, when one considers that some three hundred thousand residents flee their California annually. Meanwhile, San Francisco has become synonymous with the ongoing national epidemic of blue-city crime, homelessness, vagrancy, health concerns, vacant downtown office space, high taxes, and declining population.

Indeed, in the last half century, Bay Area politicians brought to the national scene their hard-left California politics on an array of issues such as climate change, immigration laws, identity politics, environmental regulation, restrictive zoning, the prosecution of violent crimes, bail laws, gun control, border security, gay marriage, abortion, and transgenderism. As such, they marked the dividing line between a former California of centrist and conservative governors and senators and a now-permanent left-wing state that presents itself as the Democratic model for a new American nation.

The marriage of San Francisco politics, corporate culture, and wokeness—the “Green New Deal,” dei (diversity, equity, and inclusion), and esg (environmental, social, and governance)—is perhaps best typified by the recent collapse of Silicon Valley Bank, the second-largest bank failure in American history. At its height, the bank invested billions of dollars in Silicon Valley technology and green-energy start-ups, all eager to capture easy contracts from the budget-busting “infrastructure” bills of the Obama and Biden administrations. Bank executives boasted of their diversity profiles, their green investments, and their contributions to left-wing groups and Democratic candidates and pacs—all meant to mask the reality that they were issuing subprime business loans to risky start-ups while their locked-in, long-term government bonds paid little interest in a period of 6–8 percent annual inflation. That time bomb went off when edgy depositors demanded higher returns even as clients grew fewer and were increasingly buffeted by stagflationary layoffs. A March run on the bank drained the institution of cash, while a forced sell-off of its depreciated low-yield government bonds locked in huge losses and further fueled the panic.

The old network of liberal San Francisco politicians and Silicon Valley moguls immediately lobbied the Biden administration to cover their huge losses, given that 96 percent of all svb depositor accounts vastly exceeded the Federal Deposit Insurance Corporation’s $250,000 insurance limit. Governor Newsom badgered the Biden administration the hardest, insisting that a massive multibillion-dollar bailout for the depositors was necessary. He omitted the fact that the bank had given $100,000 to his wife Jennifer Siebel Newsom’s California Partners Project. Nor did he disclose that the Newsoms had numerous personal accounts with the bank, and that his three wineries—cade, Odette, and PlumpJack—were clients of svb.

Like Newsom, most of these progressive politicians are multimillionaires, having raised enormous amounts of money from Silicon Valley sources, having spousal or personal ties to lucrative California businesses, and having profited from Chinese investments.

Indeed, California leads all states in Chinese investments, totaling well over $5 billion in the last twenty years. So it is no wonder that the former senator Boxer registered as an agent-lobbyist for a number of Chinese-government-controlled companies. Senator Feinstein was clueless that her chauffeur of some twenty years was a spy for the Chinese communist government, all while she served as chairman of the Senate intelligence committee. The Bay Area congressman Eric Swalwell was romantically involved with a Chinese spy called “Fang Fang” (Christine Fang), either before or during his tenure on the House Intelligence Committee.

Nancy Pelosi’s son has sizeable investments in Chinese-government-related companies. Gavin Newsom awarded a $1.4 billion contract to a Chinese consortium to provide the state with protective KN95 masks during the covid lockdowns. That Chinese group included byd, a company that has lavished money on Sacramento politicians, including $40,000 to Newsom’s own gubernatorial campaign. The former governor Jerry Brown in 2019 launched a joint Chinese-American climate-change think tank at the University of California, Berkeley—in partnership with China’s top climate official, Xie Zhenhua.

All of these politicians have treated China as a friendly partner to California and kept mum about vast Chinese Communist Party espionage operations in Silicon Valley. Nancy Pelosi may now grandly declare that “the era of self-regulation is over,” but she proved the chief impediment to bipartisan legislation that, inter alia, would have barred tech companies from giving preference to their own products in search results and from monopolizing markets.

The signature policies of the California Democratic Party establishment helped to welcome in millions of impoverished people from south of the border, to ensure trillions of dollars in Big Tech market capitalization and thousands of new tech employees using H-1B visas in Silicon Valley, and, over the last forty years, to drive out over ten million largely middle-class and conservative Californians who could no longer afford the high taxes and onerous regulations that paradoxically seemed to guarantee dismal schools, ossified infrastructure, high crime, and soaring home prices. That demographic trifecta—welcoming in the foreign poor, courting the rich, and ousting the middle class—explains in large part current-day California and the direction of the national Democratic Party itself.

The Democratic Party has adopted a hard-left progressive agenda, one far more radical than at any time in its history. It is now a party dominated by bicoastal globalized wealth. It is deeply embedded within and compromised by Chinese investment. It has become both the protector and beneficiary of monopolistic Big Tech. Much of its woke politics was inherited from its pet elite universities.

It owes a great deal of this metamorphosis to the current politics of California that were birthed in San Francisco and Silicon Valley, all burnished with the sheen of Stanford University. The Democrats now seek to make California politics the operating principles and ethos of the United States.

Leukophobia & other obsessions

 

The personal history of Dan-el Padilla Peralta is by design inseparable from his scholarly work. So a brief review of his biography is needed to make sense of his latest offering, Classicism and Other Phobias.1 Like his earlier book Undocumented: A Dominican Boy’s Odyssey from a Homeless Shelter to the Ivy League, the new work is autobiographical and self-referential. Sadly, it otherwise constitutes one of the most poorly organized, badly written, and racially obsessed books to appear recently from a university press.

Padilla may have ample reason to assume that the small circle of his readers—classicists especially—are familiar with his personal details, given the widely published profiles, features, and puff pieces about his past. Indeed, for more than twenty years the currently forty-two-year-old Princeton classicist has been the subject of dozens of laudatory stories focusing on his remarkable ascent from a young Caribbean immigrant to a stellar Ivy League classics scholar.

In Undocumented and in several other venues, Padilla has written extensively and dramatically about his personal journey to America from the Dominican Republic as a child with his middle-class parents. Both parents then overstayed their visas, a decision that rendered the four-year-old Padilla an undocumented alien; he was subsequently raised largely by his mother.

Padilla’s life changed radically at nine. Jeff Cowen, a young prizewinning photographer and scion of an old Manhattan investment-banking family, met him and discerned talent in the young Dan-el. Cowen soon saw to it that Padilla was admitted with a full scholarship to his own alma mater, the Collegiate School—a prestigious, nearly four-hundred-year-old Upper West Side K–12 private school where Padilla subsequently excelled in history and languages. The current $65,000-a-year tuition and five-to-one student–teacher ratio explain why it is one of the most exclusive schools in the country, with an exceptional record of sending its graduates to the Ivy League—as Padilla can well attest.

After graduating from Collegiate, Padilla was embraced by a number of elite supporters. Admission to Princeton followed, fully funded by the university. Upon receiving his undergraduate degree, Padilla was next awarded a two-year fellowship to Oxford, winning exemptions (with the help of influential sponsors) for his still-illegal status that enabled him to travel abroad. His Oxford sojourn (M.Phil.) led to a graduate fellowship at Stanford, where Padilla completed his 2014 Ph.D. dissertation on aspects of Roman religion in the early republic.

He next received another prestigious two-year postdoctoral appointment, to Columbia’s Society of Fellows. Upon its conclusion, he was quickly hired by his alma mater Princeton as a tenure-track assistant professor of classics. Padilla’s meteoric academic climb (from assistant to full professor in eight years) and media adulation clearly reflected his own talent. Yet his success was also facilitated by the magnanimity of well-connected and influential patrons and champions, and by the prevailing atmosphere of diversity and elite higher-educational outreach to the underprivileged.

His Dominican heritage and illegal status were not drawbacks to academic advancement—though he has since argued as much—but in fact proved unique advantages, at least in his adopted and insular world of elite academia. In truth, it would be difficult to find any young classics student in America who has been the beneficiary of comparable largesse and attention, resulting in more than two decades of top-flight private education—with a likely aggregate million dollars’ worth of free tuition, stipends, grants, and fellowships.

Initially, at least, Padilla seems to have appreciated such unusual opportunities and benefactions in a field where unemployed Ph.D.s are now the norm. He once expressed an overriding fascination with the wonders and beauties of classical antiquity. Early on, he reminded interviewers that his love of the ancient world had been first instilled as a small boy.

But at some point in his graduate education at Stanford, Padilla experienced a road-to-Damascus revelation. He rather abruptly embraced current academic orthodoxy, including a radical reinterpretation of his prior good fortune. At Stanford, Padilla soon proved a sharp critic of the West in general and of his newly adopted country in particular. He came to look back at the former generosity of his many patrons and institutions as the typical condescension of the white establishment toward the Other—as if they, not he, had cynically used his unusual background for virtue-signaling advantage.

After his makeover, Padilla became, in mock-heroic fashion, a supposed academic maverick on the barricades fighting the calcified and all-too-white classical establishment. Yet Padilla’s earlier expressions of gratitude to his multitude of benefactors and his demonstrated fascination with the complexity and power of classical literature were his true but fleeting unorthodoxies. In contrast, his graduate-school conversion embraced the stale postmodern academic status quo of the last half century.

Once the switch had been made, Padilla hit the accelerator on what he later called the “decolonization of my mind.” He now joined the trendy generation of careerist classicists, centered at Stanford, who call for the destruction of their own traditional discipline—complete with the dethronement of Latin and ancient Greek language study from classics at the undergraduate level, and perhaps from the field itself.

The logical, and indeed likely anticipated, results of his radical conversion were even more laurels, which came thick and fast to Padilla from the very establishment that he had assailed. Padilla has often been praised by his peers as one of the most impressive classical scholars of his time. One Princeton classicist was quoted early on in Padilla’s career as predicting that he “will be one of the best classicists to emerge in his generation.”

Perhaps someday. But for now, Padilla has authored only one book in his field of classics—a revision of his 2014 Stanford Ph.D. thesis, issued in 2020 by his alma mater and current employer, Princeton. He also published the aforementioned autobiography in 2015, which details his triumph over poverty and his academic success, but also his gradual transition to victimhood. The present short book, then, is his third. But it is not really an analysis of classics and its pathologies, as the title might imply. Instead, Classicism and Other Phobias is mostly a rehash of his earlier embittered personal essays, now pasted together with short excurses on Caribbean art and literature.

Yet for all his counterclassical posturing, Padilla still appears no maverick. Rather, he has become indistinguishable from the elite academics he attacks. In his biographical writings, Padilla meticulously cites every scholarship, every award, every fellowship, and every degree that he has won or earned. In the present book’s bibliography, he lists essays in online journals and unpublished lectures. He adopts the off-putting but revealing academic habit of citing a “forthcoming” work. And he then adds a novel sort of padded entry, for a work cited as “in progress.” This tic, if normalized, would ensure that academic bibliographies are longer than the texts themselves.

Classicism and Other Phobias suffers from three unfortunate characteristics of presentation and methodology, well aside from the repetitive and often wearisome content. First, it appears that Padilla can no longer write. His earlier autobiographical narratives about his own accomplishments were readable and occasionally engaging, perhaps to ensure a wider audience. But when he now offers scholarly analyses, his prose appears to be ossified in stale 1980s-style postmodernism—characterized by unfathomable syntax, bizarre grammar, and invented vocabulary.

A few random examples suffice (these could be multiplied on every page):

Most urgent of all, though, is responsibly confronting the prospect that the forms of classicisms most legible to those of us situated in the twenty-first-century predominantly White Euro-American academy are—because of their historical imbrications and present-day vortices—incommensurate with levelly human recognition of Indigenous sovereignty, no matter how aggressively the “Indigenous turn” in ancient and medieval studies is pursued.

The development and fine-tuning of theory in close dialogue with this structure’s stabilization has steered my research along several axes in recent years, primarily in connection with the historical mechanics and institutional forms of epistemicide.

On an honest and searching examination of the research protocols and disciplinary genealogies of Greco-Roman classics, it emerges that the field is suspended in space-times either adjacent to or ensconced firmly within plantocracy and its direct beneficiaries.

The knowledge-work of these communities of inquiry has to be recognized as making a legitimate claim to unique forms of interpretive power that derive their potency equally from identitarian situatedness and fluency in theorizing that situatedness.

A lexicon is needed to translate all the made-up slang and silly neologisms throughout the book that Padilla passes off as either erudition or hip theorizing, such as copropolitics, doulology, problem/atic, ’tude, and toolsing.

Second, Padilla poses as a brave apostate. Indeed, he has earned attention for calling for the destruction of his own present field of classics, mostly as a way to expunge from history, in neutron-bomb fashion, its alleged racist white pedigree and innate ethnic and gender prejudices. Yet Padilla cannot compose a single page without virtue-signaling his solidarity with obscure postmodern scholars. Often he does so in a chatty fashion that is extraneous to his narrative and thus serves only as an obsequious appeal to a perceived senior authority.

Granted, Padilla boasts that he relies on the race and gender of scholars, and not necessarily the validity of their analyses or their depth of scholarship, to determine whether they will be awarded his prized references or citations. He apparently considers his race-based footnotes and allusions as valuable academic capital to be doled out to the like-minded in order to further a shared anti-whiteness agenda:

The first proceeds from the recognition that citation is one currency of the academic realm, where capital regularly accrues to those in senior/permanent positions—predominantly White and male.

But the fact that Padilla wishes to exclude so-called white authorities from his citations and references does not reduce the sheer monotony of his name-dropping. The result is that the book reads as if written by a toadyish graduate student in the front row, nodding for his professors’ approval:

After all, to channel Shelley Haley’s channeling of Ann duCille, the Euro-American discipline of Greco-Roman classics has had a nasty habit of boxing the histories and practices of Black classicists into the taming spaces of Whiteness.

In principle it is certainly possible, as Tilman Bechthold-Hengelhaupt has detailed with reference to the work of Jula Wildberger and Andreas Dörpinghaus among others, to fashion critically self-distancing pedagogies that deny exemplary authority to ancient authors and their texts.

My recourse to the language and framework of overrepresentation is a homage to and application of a core concept in Sylvia Wynter’s writings, as sharpened in the recent work of Mathura Umachandran.

Either as complement to or in place of the scheme of the “open field” that Brooke Holmes and Constanze Güthenke have proposed in response to the polar tendencies of “hypercanonicity” and “hyperinclusivity,” one could opt instead to search for and elevate an “oppositional set of communities of inquiry,” for exposition of a counternarrative that proceeds from the understanding that Greco-Roman classics in its Euro-American iteration is bedeviled by the same absences and silences that mark other White-centering epistemic projects.

Third, Padilla cannot cite evidence or sustained arguments to support his thesis of the needed destruction of “White” classics. Instead, he mostly recycles his tedious anger and racial bitterness. His fixations on racial chauvinism and anti-white invective become obsessive-compulsive. Indeed, on average they are found more than once on every page of his slim book: within a mere 148 pages of text, his pejorative “White” appears seventy-three times, “Whiteness” twenty, and “Blackness” forty-seven; “Black” itself shows up in more than a hundred fifty places. Such race-based vituperation in nearly every paragraph should remind readers that abject racialism is exactly what is not needed in these times of rising cultural tensions.

Otherwise, Padilla’s targets are seemingly endless. At times, his pantheon of evil includes not only the entire white academic community that suppresses blackness, but also the enemies of environmentalism; the old settler colonialists who ruined North America and now in new manifestations oppress the Palestinians; the nativist, xenophobic immigration restrictionists; the clueless black accommodators who lack his savvy; and countless more.

Note that Padilla does not even claim to live in New Jersey or work at Princeton. Instead, he dwells in the mythical land of Lenapehoking, the ancestral home of indigenous peoples—at least before New Jersey, along with much of the eastern seaboard, was stolen and exploited by the sort of whites who originally funded the ancient grant-giving Collegiate School and the fellowship-laden Princeton, Stanford, and Columbia.

Padilla assures us: “I research, teach, and raise family in Lenapehoking. It is to these lands and their ancestral caretakers that I owe first and lasting thanks.” Really? Yet he nonetheless often transitions back to his academic self, as when he signs off the book’s formal acknowledgments with the byline “Princeton, NJ/American Academy in Rome, July 2024”—respectively, a state that illustrates his category of a “settler-colonialist structure” and a nation home to the archvillain of European colonialism, Christopher Columbus himself.

Within the first few pages of introductory material, Padilla already seems not to take his top-heavy, performance-art ideology too seriously, and so such paradoxes and hypocrisies are ubiquitous in what follows. We learn that whereas Padilla is properly proud of his fluency in Latin and Greek and the awards that it earned him, he can now afford to reinterpret such past praise as condescension—as a master’s pride in his accomplished pet.

So, when an elder classicist who was fond of Padilla brags about the latter’s non-white ancestry, Padilla bristles and finds therein proof of the sort of insulting head-patting that he has endured:

At the conference’s book exhibit, I ran into a retired White historian of ancient Mediterranean religions whose teaching and writing had mattered much to me. We fell into friendly memories. Then when one of his old friends approached, he made an introduction putting the old-boys network to work in the service of my advancement. This was the time-honored practice of an elder wielding his social capital to augment mine. But these transactions do not come without a cost, and there is always a tax to pay for upward conveyance within an academy that forever minoritizes folks like you. In this case, it was the handshaking introduction itself that assessed the tax. “Dan-el Padilla Peralta,” the retired historian said, enunciating the syllables of my name with relish, “how rare a name in our fields is that!”

And with that speech-act I was converted into the rarest of creatures.

The rarefied among us burn to escape. Indeed, when even the well-meaning Whites regularly demand from us the labor of circumnavigation, the only recourse is flight.

Not quite, Professor Padilla.

Padilla, after all, is now the Princeton full professor with clout, while the retired historian no longer counts for much in the snobbish hierarchy of the academy. If Padilla now urges “flight” from such grandees, it is certainly late in coming. He apparently missed a multitude of earlier opportunities to “flee” from “the well-meaning Whites” at his various Ivy League and elite billets, where they “minoritize folks like you”—and no doubt do so with all of their insultingly white scholarships, grants, and awards.

It may be, however, that this “well-meaning White” was not remarking so much on Padilla’s Hispanic Caribbean nomenclature. Names like “Peralta” or “Padilla” are not necessarily uncommon in today’s academy. Rather, the supposed fuddy-duddy may have sincerely found rare Padilla’s stylized and hyphenated first name, “Dan-el,” which is certainly infrequent anywhere. One wonders what would be more offensive to Padilla: the retired scholar’s earnestly “enunciating the syllables of my name with relish,” or his making no effort at all to reflect proper Spanish accentuation and intonation.

Elsewhere, Padilla manages to kill two academic birds with one stone when he contrives to remind the reader of his own brilliance while mocking those who were forced, however unwillingly, to recognize it:

I craved opportunities for showing off my skill at reproducing Latin and ancient Greek verb and noun forms, especially in front of those classmates and teachers whom I suspected of under- or misestimating me; and I wanted to bask forever and ever in the warm glow of report card comments that I was the pater familias of my class. . . . Nothing fired me up quite like the desire to vault right over my peers; like Suetonius’s Julius Caesar, I wanted to jump right on all their heads, and I was not deterred by their elitist derision—which, after all, was only another language to master.

Continuing at length, Padilla juxtaposes his earlier superiority over his peers in classical languages with his later liberation from their insidious patronizing:

[B]ut when the moment came to translate Latin or ancient Greek, I warmed to the heat of competing for authority: authority over texts, authority over language, authority over them. How I later unlearned the habits forged in this crucible, how I found kinship and strength in the move away from mastery and towards Julietta Singh’s vision for “unthinking mastery,” is a story for another day.

In fact, this moment supposedly “for another day” has become a repetitive theme of most of Padilla’s proliferating autobiographical diatribes. This current book itself, as noted, has very little to do with any sustained or coherent critique of classics. The phobias of the title, as we have seen, are all Padilla’s own, mostly racial, and not shared by many outside his small academic circle.

Chapter 1 is unfortunately incoherent. It appears to be a stream-of-consciousness opportunity for Padilla to unshackle himself from classical studies (“the overrepresented classicism of the plantation”). That way, he can take flight to Caribbean literature and culture (“a classicism of fugitivity and insurgence, grounded in the histories of the Caribbean”). What follows, however, offers very few “histories”; it is instead a meandering chat of a dozen or so pages about a few notable Caribbean “maroons” in the Spaniard Juan de Castellanos’s sixteenth-century epic poem Elegías de varones ilustres de Indias.

Castellanos did, in fact, write of notable “maroons,” such as the insurrectionary leader “Lemba.” From the earlier chapters, we might have assumed that Padilla would judge the veteran Spanish conquistador turned “settler-colonialist” priest as deserving as much ostracism from his new pantheon as his bête noire, the nineteenth-century Confederate, apologist for slavery, and veritable founder of Greek and Latin philology in America, Basil Gildersleeve. Absent Castellanos’s poem, however, Padilla would apparently have no sources to draw on for his heroic maroons, so he is forced to embrace this presumably suspect author.

Readers will have no idea what the intended theme of Chapter 2 is. It appears to be mostly another rehash of various vignettes from Padilla’s early life, whose common denominator is people and things that, however well-meaning, harmed him personally or at least offended his sensibilities.

The guilty prove a diverse bunch. They range from the Purdue University president Mitch Daniels (for “ill-considered remarks”) and the classicist Eric Adler (for claiming, in his book on the field’s contemporary struggles, that the culture wars were over by 2006), to the idea of the Western canon (he is left “chuckling” by those who ascribe his success to his “‘great books’ education”), the previously mentioned professor who labored too long over Padilla’s names, and that omnipresent elephant in the room, “Whiteness.”

The title of Chapter 3, “Let Me Clear My Throat,” is redundant, since throat-clearing has characterized the previous seventy-five pages. The rest of the chapter is a mishmash critique of the black intellectual and activist W. E. B. Du Bois. Padilla finds him guilty of trying to master, and then impress others with, the very Eurocentric norms and signs of learning that he failed to understand were also his shackles. Translated, this seems to mean that, unlike the unenlightened Du Bois, Padilla has proved savvy enough to use “Eurocentric” tools not merely to advocate for blackness but even to seek to destroy the edifices of “Whiteness.”

Padilla also finds Du Bois culpable of ignoring the original victims of European racism, the Native Americans. Obsessed as he was with slavery, Du Bois apparently was not fully up-to-date with trendy critiques of “settler colonialism” (a phrase Padilla uses no fewer than thirty-seven times in the book):

At the very least Du Bois’s case should be a warning to those who quest after the silver fleece of classics and classicism without taking the settler-colonialist preconditions of its existence and production seriously.

He ends his attack with a warning “not to trip where Du Bois tripped.” We learn for the umpteenth time that it was that old, insidious whiteness that foolishly prompted Padilla to excel in prep school, yearn for approbation, be proud of his tony education and degrees, and win fellowships and remuneration. Lest one believe that Padilla at this point might be an insincere grifter who lapped up beneficia in his ascendance and then, once he reached the acme of his academic quest, conveniently resituated his career within the emerging anti-white majority, he fires off a preemptive salvo: Dan-el Padilla Peralta will not be accused of “biting the hand that feeds you.”

The gentleman doth protest too much, methinks.

Chapter 4 centers on the second-generation Nigerian American artist Kehinde Wiley, best known for painting the official presidential portrait of Barack Obama. Wiley’s reputation as an artist rests on repainting the classical and biblical themes of the Old Masters with black men and women inserted in place of the white subjects. Wiley’s more recent notoriety—unmentioned by Padilla—arose from his remaking of a seventeenth-century painting by Giovanni Baglione, Judith and the Head of Holofernes. Wiley redid Baglione with a scowling black woman holding up the head of a white woman whom she had apparently just decapitated. As Wiley lightheartedly put it, “It’s sort of a play on the ‘kill whitey’ thing.”

Padilla does not elaborate much on the usefulness of Wiley for classics, but one can imagine how demonic whiteness might be cleansed from classics by the use of Wileyism. Could we learn to translate Homer into hip-hop? Or perhaps the more talented can redo the Elgin Marbles to show a Black Lives Matter march? Padilla ends his chapter with more of the impenetrable but now customary “end Whiteness” prose:

Hegemonic modes of White-centering classicism have got to go. Otherwise, they will continue to threaten the cultivation and elevation of other classicisms, by hogging the material and cognitive resources through which these can be most effectively and enduringly reproduced. A future where that hogging persists is no future for me.

This, of course, is nonsense. Elite campuses have created all sorts of new racial-studies programs, centers, and “safe spaces” and race-based admissions and fellowships, as well as de facto racially segregated theme houses and graduation ceremonies. In September 2020, the University of Chicago’s department of English, to take one example, announced that it planned to admit only black-studies Ph.D. candidates for its 2021 cycle, a decision that, it claimed, was inspired by the contemporaneous protests organized by Black Lives Matter and made in light of the field’s “complicated history” with regard to race.

A survey of yearly dissertation titles and university-press listings in classics reveals a plethora of “theory” titles focused on race. Until recently, admissions and hiring committees often demanded so-called diversity oaths that required applicants to document the degree of their prior commitments to diversity, equity, and inclusion in order to be considered seriously. Even Padilla seems to acknowledge that the idea of ongoing reparatory higher education is reaching critical mass, asking near the end of his booklet: “When (if ever) minoritized classicisms reach overrepresentation, will we know then the work is done?”

As for “hogging” resources, currently unemployed classicists from less-prestigious graduate programs might see Padilla himself as feeding amply and inordinately at the fellowship and award trough. Reading his boasts, they might pause to reflect, with Horace: Quid rides? Mutato nomine de te fabula narratur.

Padilla’s brief conclusion doubles down on his phobia of “Whiteness” by offering halfhearted replies to critics of his various visions for a new non-white classics. Padilla singles out the German scholar Jonas Grethlein for criticism. Grethlein had apparently dared to suggest, correctly, that American research and higher education in general are often “seen less as the production of knowledge than as an expression of identities through which oppressed minorities can emancipate themselves.”

So what are we to make of Padilla’s monotonous screed?

Unknowingly, Padilla has offered a morality tale about the current pathologies of academia itself. Modern higher education as practiced at the elite level possesses a uniquely destructive ability to transform even the most idealistic entering students. In the case of classics, they are quickly disabused of the notion that the ancient world could inspire a love of learning and a genuine appreciation of transcendent values. Perhaps Padilla himself once felt that the study of Greek and Roman literature was not bound by identities, but was rather a discipline devoted to exploring unchanging human nature across time and space—a pathway open to those of any sex, race, or class who wish to read the ancients.

Classical wisdom was not limited by or solely intended to appeal to a single culture. Nor was it a product of specific Greek or Roman tribal affinities and identities. Instead, the greatest thinkers of the ancient world sought to enlighten a common humanity, and thus could appeal to anyone of any background who came to appreciate such universal human tenets. This may explain why classics programs today exist in Japan and China, as well as in African and Latin American countries.

Again, Padilla at an early point in his academic life—a point of which he now seems almost ashamed—appeared to understand that universality. It drew him to a distant classical world—at least until the academic brotherhood apprised him that his initial instincts were parochial, reactionary, and reflective of his own false consciousness.

As a result, Padilla has become exactly what he once might have feared—an elitist, name-dropping pedant, navigating to the top of the heap, now by traditional philology, now by voguish theories of identity and oppression. The one constant, then and now, has been the rewards that he now trashes as obsessively as he once eagerly coveted them.

Indeed, for someone who is so proud of his classical-language fluency and his expertise in the tools of classical scholarship, Padilla shows little insight into what classics was and is really about. In Padilla’s worldview, the racist eighteenth- and nineteenth-century European settler colonialists—Germans, British, and Americans especially—created the damnable field of classics not to establish vital professional and foundational methods of learning about the classical past through archaeology, epigraphy, manuscript studies, numismatics, papyrology, prosopography, and historical and textual criticism, but merely to reinforce their own fragile sense of white superiority over colonial subjects and conquered indigenous peoples:

Ancient Greece and Rome come to be lashed to the hegemonic projections of North Atlantic liberalism as White-supremacist and settler-colonialist structures of exploitation increasingly capitalized on their ideological and semiotic potency.

Even if that were correct, the Greeks and Romans are no more responsible for what later and often less enlightened cultures choose to appropriate from them than Haiti is for Night of the Living Dead. And Homer and Cicero are no more responsible for the toxic Klansman who thought the adaptation of the ancient Greek kuklos added a veneer of refinement to his hatred than they are for the German socialists and Marxists who near the end of World War I dubbed themselves the “Spartacist League,” playacting as classical slaves oppressed by warmongering capitalist masters.

What Padilla further fails to understand is that classical scholarship’s fascination with the Greco-Roman world rests upon that subject’s singular self-criticism of its own standards and values. The tools of mockery that Padilla employs—caricature, cynicism, parody, sarcasm, and satire—all derive from classical roots, which is to say that they were invented by the very Greeks and Romans he dismisses. Many of the Western pathologies that Padilla cites—class privilege, the “establishment,” male dominance—were long ago objects of criticism more virulent and yet more sophisticated than Padilla’s adolescent rants.

Misogyny? Read the Antigone, Medea, and Lysistrata.

Slavery? “No man is born a slave,” wrote the fourth-century-B.C. polymath Alcidamas. Aristotle’s argument for natural slavery acknowledges a host of critics who felt otherwise. Slaves in drama from Aristophanes to Plautus often appear smarter than their masters.

The poor and the oppressed? From Solon to the Gracchi, there is plenty of classical admiration for the efforts of the underclass to get even with their exploiters.

Rather problematically for Padilla, the whitest people whom the Mediterranean Greeks and Romans met were often the most negatively stereotyped—whether the savage, milk-drinking, tree-worshiping Germani; the wild, tattooed, and red-haired Britons; the supposedly pathologically lying white-skinned Gauls; or the purportedly innately savage Thracians. In contrast, Homer names as the noblest of foreign peoples the black Ethiopians—a race Herodotus thought the tallest and handsomest.

Settler-colonialism? Recall what Tacitus had the Caledonian leader Calgacus say about how the historian’s fellow Romans make a desert and call it peace. For all the “settler colonialism” of Alexander the Great, his ideas of race might be better described as “assimilationist” or as a sort of proto–melting pot, accomplished by forced Persian–Macedonian mass marriages to pave the way for his dream of a brotherhood of mankind.

Homosexuality? Transgenderism? The ancients for the most part had no intrinsic prejudice against either—except perhaps in worrying about the need for marriage and fertility in a premodern society where nearly half of all newborns did not live beyond the age of five. Long before gender-affirming care for the transgendered, the complexities of sexual dysphoria were described in the poems of Catullus and in Petronius’s Satyricon.

Padilla condemns classics not for what the ancients thought but for what some later Europeans claimed they thought, although the tools of classical scholarship are the best means to enlighten audiences about just such a disconnect. But even here, Padilla must cherry-pick his villains to produce a false picture of right-wing classicists pushing their whiteness down the throats of the non-white through all sorts of exclusionary standards and norms. Such paranoias have little resemblance to the reality of the modern campus.

Of the thirty or so undergraduate and graduate-school classics professors whom I knew in the 1970s and 1980s, perhaps 90 percent were left-wing. Unlike Padilla, however, they felt their politics were incidental rather than essential to their professional lives. In Greek history, we were assigned the leftists G. E. M. de Ste. Croix and Moses Finley, but rarely a conservative such as Raphael Sealey or Donald Kagan—an imbalance perhaps often to our disadvantage, but never one occasioning much worry.

In truth, even well before the contemporary, left-dominated Western university, classics was every bit as much a home for leftists, socialists, globalists, and radicals—such as George Grote, Gilbert Murray, Alfred Zimmern, and George Cawkwell—as for the whiteness-glorifying bogeymen of Padilla’s diatribes.

As for Padilla’s obsession with burning down the current field of classics in order to destroy whiteness, the reader wonders about the point of such fixations. Padilla seems to think a declining, endangered discipline is almighty and at the center of the universe. Thus, in a ponderous Classics did it! fashion, he faults a now nearly impotent field for all sorts of cosmic sins. Classics caused Dominicans to ignore Santeria, the folk religion of blacks in the Spanish-speaking Caribbean. Classics sowed confusion about the real nature of the Haitian Revolution. Classics made us valorize written texts and ignore all the other “non-textualized” information from the past that was not so privileged. Classics played a prominent role in everything from the Confederacy to racial violence.

Still, Padilla hopes that classics can become—like black studies—an instrument for revolution. America is a free country, at least for a while longer, and he might do better to found his own department of Caribbean classics or black classicism while allowing the ancient discipline of classics simply to go its own way and, as he wagers, to die in solitary peace.

If Padilla is wise, and if he does not prove successful in incinerating his chosen field and those less fortunate struggling within it, he might avoid such embarrassments as this recent book and return to the classical scholarship of his dissertation, where he displayed incipient talent. Surely there was some moment as a classicist when he was not driven to write so unhappily and endlessly about himself and his myriad white oppressors.

A final irony: only in fleeting moments amid his racial fixations does Padilla touch upon elite academia and class (classicism, he thinks, is a “cog in the machinery” of capitalism). Even then, class awareness mostly serves as a defensive trope to preempt any notion that he is an elite hypocrite. Indeed, he blasts such criticism as a “pretend class analysis.”

Given that class might serve as a barometer of privilege well outside his binaries of racial victims and victimizers, an overly sensitive Padilla carefully guards his hard-won privileged turf from criticism that is

directed at those [like himself] who are actively involved in the labor of disciplinary critique, that slyly or not so slyly seizes on the relative economic privilege and institutional security of some of these critics as a pretext for dismissing their criticisms.

So, if anyone should state the obvious about the substantial privilege that Padilla himself enjoys but trashes in others, let him beware:

But this analysis, which by seeking to swap out race for class reveals itself as a class-reductionist project for enacting White innocence, fails for the simple reason that it cannot honestly engage in what Fred Moten, improvising on Fred Hampton, calls “the problematic of coalition.”

In the end, an embarrassed and sputtering Padilla is reduced to exclaiming:

And just think for a moment about what would happen if I were not credentialed in the ways that I am: how hard it is for Black people to be so much as heard if they do not come swaddled in White-affirming capital and prestige?

I doubt Padilla would appreciate an honest answer to his rhetorical question—for if he were teaching at Cal State Stanislaus, Princeton University Press would never have published this book, not because he is black, but because it would have been found an embarrassment by accomplished scholars of all races.

Padilla knows that academia in general, and classics in particular, is a pyramidal structure predicated not on race but on class. Full professors like Padilla enjoy compensation undreamed of by adjunct lecturers and itinerant part-timers. Padilla believes that the university descends from the slave plantation, but if that is the proper metaphor, then he is far more akin to the master than to the servile class.

In truth, any poor white, Hispanic, or Asian student, put into the strange world of elite Upper West Side schooling amid rich kids, would feel the same natural unease as did Padilla. But unlike Padilla, others of many races might weigh the downside of their current discomfort about aristocratic and plutocratic pretensions against the generosity of a free elite education and with it a fast track into the professional classes.

But then that circumspection would require a small shred of gratitude, perhaps an obsolete concept in Padilla’s racialized cosmos. It seems instead so much easier and more sophisticated to envision opportunities not as calling for gratitude, but as proof of the insincerity or manipulative nature of the gifts, which nonetheless will be eagerly accepted and profitably exploited.

This confused anger at past benefaction and Padilla’s own racialist obsessions are the only real messages of Classicism and Other Phobias.