
Sunday, December 7, 2025

Kleptocracy, Militarism, Colonialism: A Counterrecruiting Call for Students and Families

The United States has long framed itself as a beacon of democracy and upward mobility, yet students stepping onto college campuses in 2025 are inheriting a system that looks less like a healthy republic and more like a sophisticated kleptocracy entwined with militarism, colonial extraction, and digital exploitation. The entanglement of higher education with these forces has deep roots, but its modern shape is especially alarming for those considering military enlistment or ROTC programs as pathways to opportunity. 

The decision to publish on December 7th is deliberate. In 1941, Americans were engaged in a clearly defined struggle against fascism, a moral fight that demanded national sacrifice. The world in 2025 is far murkier. U.S. militarism now often serves corporate profit, global influence, and the security of allied autocracies rather than clear moral or defensive imperatives.

This is an article for students, future students, and the parents who want something better for their children. It is also a call to pause and critically examine the systems asking for young people’s allegiance and labor.

Higher education has become a lucrative extraction point for political and financial elites. Universities now operate as hybrid corporations, prioritizing endowment growth, real-estate expansion, donor influence, and federal cash flows over public service or student welfare. Tuition continues to rise as administrative bloat accelerates. Private equity quietly moves into student housing, online program management, education technology, and even institutional governance. The result is a funnel: taxpayers support institutions; institutions support billionaires; students carry the debt. Meanwhile, federal and state funds flow through universities with minimal oversight, especially through research partnerships with defense contractors and weapons manufacturers. What looks like innovation is often simply public money being laundered into private hands.

For decades, the U.S. military has relied on higher education to supply officers and legitimacy. ROTC programs sit comfortably on campuses while recruiters visit high schools and community colleges with promises of financial aid, job training, and escape from economic insecurity. But the military’s pitch obscures the broader structure. The United States spends more on its military than the next several nations combined, maintaining hundreds of foreign bases and intervening across the globe. American forces are involved, directly or indirectly, in conflicts ranging from Palestine to Venezuela to Ukraine, and through support of allies such as Saudi Arabia and the United Arab Emirates, often supplying weapons used in devastating campaigns. This is not national defense. It is a permanent war economy, one that treats young Americans as fuel.

At the same time, Russian cybercriminal networks have infiltrated U.S. institutions, targeting critical infrastructure, education networks, and private industry. Reports show that the U.S. government has frequently failed to hold these actors accountable and, in some cases, appears to prioritize intelligence or geopolitical advantage over domestic security, allowing cybercrime to flourish while ordinary Americans bear the consequences. This environment adds another layer of risk for students and families, showing how interconnected digital vulnerabilities are with global power games and domestic exploitation.

For those who enlist hoping to fund an education, the GI Bill frequently underdelivers. For-profit colleges disproportionately target veterans, consuming their benefits with low-quality, high-cost programs. Even public institutions have learned to treat veterans as revenue streams.

U.S. universities have always been entwined with colonial projects, from land-grant colleges built on seized Indigenous land to research that supported Cold War interventions and overseas resource extraction. Today these legacies persist in subtler forms. Study-abroad programs and global campuses often mirror corporate imperialism. Research partnerships with authoritarian regimes proceed when profitable. University police departments are increasingly stocked with military-grade equipment, and curricula frequently erase Indigenous, Black, and Global South perspectives unless students actively seek them out. The university presents itself as a space of liberation while quietly reaffirming colonial hierarchies, militarized enforcement of U.S. interests worldwide, and even complicity in digital threats.

For many young people, enlistment is not a choice—it is an economic survival strategy in a country that refuses to guarantee healthcare, housing, or affordable education. Yet the military’s promise of stability is fragile and often deceptive. Students and parents should understand that young Americans are being recruited for geopolitics, not opportunity. Wars in Ukraine, Palestine, and Venezuela, along with arms support to Saudi Arabia and the United Arab Emirates, rarely protect ordinary citizens—they protect corporations, elites, and global influence. A person’s body and future become government property. ROTC contracts and enlistments are binding in ways that most eighteen-year-olds do not fully understand, and penalties for leaving are severe. Trauma is a predictable outcome, not an anomaly. The military’s mental health crisis, suicide rates, and disability system failures are well documented. Education benefits are conditional and often disappointing. The idea that enlistment is a reliable pathway to college has long been more marketing than truth, especially in a higher-education landscape dominated by predatory schools. Young people deserve more than being used as leverage in someone else’s empire.

A non-militarized route to opportunity requires acknowledging how much talent, energy, and potential is lost to endless war, endless debt, and the growing digital threats that go unaddressed at the highest levels. It requires demanding that federal and state governments invest in free or affordable public higher education, universal healthcare, and stronger civilian service programs rather than military pipelines. Students can resist by refusing enlistment and ROTC recruitment pitches, advocating for demilitarized campuses, supporting labor unions, student governments, and anti-war coalitions, and demanding transparency about university ties to weapons manufacturers, foreign governments, and cybersecurity vulnerabilities. Parents can resist by rejecting the false choice presented to their children between military service and crippling debt, and by supporting movements pushing for tuition reform, debt cancellation, and public investment in youth.

It is possible to build a higher-education system that serves learning rather than empire, but it will not happen unless students and families refuse to feed the machinery that exploits them. America’s kleptocracy, militarism, colonial legacies, and complicity in global digital crime are deeply embedded in universities and the workforce pipelines that flow through them. Yet young people—and the people who care about them—still hold power in their decisions. Choosing not to enlist, not to sign an ROTC contract, and not to hand over your future to systems that see you as expendable is one form of reclaiming that power. Hope is limited but not lost.

Sources

  1. U.S. Department of Defense. Defense Budget Overview Fiscal Year 2025. 2024.

  2. Amnesty International. “Saudi Arabia and UAE Arms Transfers and Human Rights Violations.” 2024.

  3. Human Rights Watch. “Conflicts in Ukraine, Venezuela, and Palestine.” 2024.

  4. FBI and CISA reports on Russian cybercrime and critical infrastructure infiltration. 2023–2025.

  5. Cybersecurity & Infrastructure Security Agency (CISA). National Cybersecurity Annual Review. 2024.

Friday, December 5, 2025

The Ludwig Institute for Shared Economic Prosperity: Rethinking—and Challenging—America’s Economic Narrative

In a political moment defined by economic confusion, precarity, and widening inequality, the Ludwig Institute for Shared Economic Prosperity (LISEP) has positioned itself as one of the most forceful critics of how the U.S. government measures economic well-being. Founded in 2019 by Eugene “Gene” Ludwig—banking regulator, financier, and longtime critic of official labor statistics—the institute argues that the traditional indicators used by policymakers, economists, and the media no longer reflect the lived experience of most working and middle-class Americans.

LISEP’s core mission is straightforward: to replace or supplement conventional economic indicators with metrics that measure whether ordinary people can live decent, stable, self-supporting lives. In place of headline unemployment rates that understate underemployment and wage suppression, LISEP developed the True Rate of Unemployment (TRU). Instead of accepting the Consumer Price Index as an indicator of affordability, it created the True Living Cost (TLC). And to evaluate whether households can achieve a baseline level of dignity, the institute introduced its Minimal Quality of Life Index (MQL).

Taken together, these indicators paint a sobering picture. LISEP’s most recent TRU data suggests that nearly one in four Americans—far more than the official unemployment rate—remains functionally unemployed or trapped in low-wage, unstable work. Its analysis of living costs shows that basic necessities such as housing, childcare, food, healthcare, and digital access are rising at rates that far outpace reported inflation. Its income distribution research finds that the bottom 60% of households fall severely short of the after-tax income required to meet even minimal quality-of-life thresholds.

In a time when both parties often claim economic success—pointing to record stock markets, low headline unemployment, and steady GDP growth—LISEP argues that these triumphal narratives obscure the steady erosion of working-class security.

But LISEP’s work does more than diagnose hardship; it challenges the legitimacy of the economic story that the United States tells about itself. That is precisely why its metrics have garnered attention—and controversy.

Methodological Innovations and the Pushback They Attract

Economists, policymakers, labor advocates, and academics have responded to LISEP’s work with a mixture of praise and skepticism. Some see LISEP as filling a critical gap—offering metrics that better capture the realities of gig workers, part-time workers, workers with unpredictable hours, and families priced out of life’s essentials. Others argue that LISEP’s approach risks injecting subjectivity into economic measurement and complicating long-established statistical frameworks.

One major point of debate centers on LISEP’s definition of unemployment. Traditional unemployment statistics only count individuals actively seeking work. LISEP’s TRU metric, by contrast, includes the underemployed, part-time workers who want full-time jobs, and discouraged workers who have given up looking. Critics argue that combining these groups creates a metric that resembles a policy argument more than a neutral measurement. Supporters counter that ignoring these groups produces an artificially rosy portrait of economic health and undervalues persistent structural inequality.
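The arithmetic behind this debate can be made concrete with a toy calculation. The counts below are invented, and the grouping is a simplification of the idea described above, not LISEP's actual published methodology; it only illustrates how widening the definition of "unemployed" moves the rate.

```python
# Toy comparison of a headline unemployment rate with a TRU-style
# "functional unemployment" rate. All counts are hypothetical, and the
# grouping is a simplification of the concept, not LISEP's real formula.

labor_force = 1000            # hypothetical labor force
unemployed_seeking = 40       # jobless and actively looking (headline definition)
discouraged = 30              # gave up looking; excluded from the headline rate
part_time_want_full = 55      # part-timers who want full-time work
below_living_wage = 130       # full-time, but paid under a living-wage threshold

# Headline-style rate: only active job seekers count.
headline_rate = unemployed_seeking / labor_force

# TRU-style rate: anyone without full-time work at a living wage counts.
tru_rate = (unemployed_seeking + discouraged
            + part_time_want_full + below_living_wage) / labor_force

print(f"Headline rate:  {headline_rate:.1%}")  # prints 4.0%
print(f"TRU-style rate: {tru_rate:.1%}")       # prints 25.5%
```

With these made-up numbers, the same labor market reads as 4% unemployment under the headline definition and over 25% under the broader one, which is the crux of the dispute: nothing in the underlying data changes, only which groups are counted.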

LISEP’s True Living Cost and Minimal Quality of Life indices face a different critique: they define “necessities” more broadly than some economists are comfortable with. Including internet access, basic technology, early childhood education, and modern transportation standards is, according to LISEP, essential to functioning in the 21st-century economy. Critics contend that because these standards go beyond subsistence, the metrics risk shifting from measuring need to measuring aspiration. The institute responds that “subsistence” is not an acceptable measure of human dignity in a wealthy nation.

Other scholars raise questions about transparency. While LISEP publishes summaries and explanations of its methodologies, some economists argue that its approaches would require broader independent replication and peer review to become standard tools. Yet others note that the Bureau of Labor Statistics itself has long used imperfect methods that were never designed to measure well-being—only labor market participation.

Where supporters and skeptics agree is on one point: LISEP has forced a deeply needed conversation about what economic dignity means in the United States today.

Why LISEP Matters for Higher Education and Public Policy

For institutions of higher learning—especially those that produce the economists, policymakers, and journalists who shape public discourse—LISEP’s challenge to economic orthodoxy is a call to scrutiny and humility. Universities continue to rely on traditional metrics in research, teaching, and policy labs, even when these metrics fail to capture the economic and social pressures facing students and their families.

Students at community colleges, regional publics, and underfunded institutions live the realities LISEP describes: multiple jobs, unpredictable hours, rising food and housing insecurity, and persistent underemployment after graduation. Yet their struggles are too often minimized by conventional indicators that suggest a thriving labor market.

If academia takes LISEP’s work seriously, it could shift research priorities, reshape debates on student debt, influence regional economic development strategies, guide labor-market forecasting, and elevate the experiences of the most economically vulnerable students.

For policymakers, LISEP’s metrics offer a different foundation for assessing whether economic growth is reaching ordinary people. They provide tools for evaluating whether wages are livable, whether childcare is accessible, whether housing is affordable, and whether the economy produces stable, family-supporting jobs. If adopted or even partially embraced, LISEP’s indicators could inform legislation on minimum wage, labor protections, social services, tax reform, cost-of-living adjustments, and more.

The institute’s broader message is simple: the United States cannot address inequality if it continues to celebrate misleading statistics.

A New Economic Narrative

Whether LISEP becomes a permanent influence or a dissenting voice will depend on how policymakers, journalists, and academic economists respond. If its metrics remain on the margins, they will serve as a moral indictment of traditional measures that ignore the reality of economic insecurity. If they are adopted, they could trigger a profound reevaluation of American economic policy—one grounded not in aggregate success but in shared prosperity.

LISEP insists that a healthy economy is not one that grows on paper but one that allows ordinary people to live decently. That premise alone places the institute on the front lines of the battle over how the United States understands its own economic health.

Sources


Ludwig Institute for Shared Economic Prosperity, “True Rate of Unemployment (TRU),” 2025, lisep.org.
Ludwig Institute for Shared Economic Prosperity, “True Living Cost (TLC),” 2025, lisep.org.
Ludwig Institute for Shared Economic Prosperity, “Shared Economic Prosperity (SEP) Measure,” 2025, lisep.org.
PR Newswire, “Majority of Americans Can’t Achieve a Minimal Quality of Life, According to New Ludwig Institute Research,” May 12, 2025.
Ludwig Institute for Shared Economic Prosperity, “Wage Inequality Grows With Low-Income Workers Losing Ground,” Press Release, April 16, 2025.

Thursday, December 4, 2025

Therapists Can’t Fix What Society Broke (Steven Mintz)

[Editor's note:  This article first appeared at Steven Mintz's substack.]

What the Classical Social Theorists Knew about the Price We Pay for Progress—and We’ve Forgotten

On a recent flight, a small child in the row behind me shrieked with piercing intensity. The passenger beside me leaned over and whispered, with assurance, “He’s autistic.”

Neither of us knew the child. What we had was a familiar modern reflex: reaching immediately for a diagnostic label.

Yet the scene likely had simpler explanations. Any parent knows toddlers often melt down. They have immature nervous systems, poor emotional regulation, and lack the linguistic tools to express their discomfort.

Air travel makes this exponentially worse: altitude pressure that feels like a drill behind the eardrum, bright lights, crowding, disorientation, loss of routine, confinement in an airplane seat, and helpless parents who cannot walk, rock, or soothe as they ordinarily would.

In such a setting, a screaming child isn’t a clinical puzzle. The child is simply a human being overwhelmed by an environment that their developmental stage is not equipped to handle.

But what struck me wasn’t the child’s distress—it was my fellow passenger’s interpretive leap. We now default to pathology. Behaviors that earlier generations would have recognized as overtiredness, frustration, temperament, or physiological misery are now reframed as sensory processing issues, spectrum behaviors, and emotional dysregulation.

A century ago, William James or Émile Durkheim would have been baffled by our eagerness to see ordinary distress as a clinical symptom. They assumed a different relationship between individuals and their environments. They looked first to situational explanations, developmental stages, social settings, and institutional pressures—not to internal pathology.

The classical social theorists were exquisitely attuned to context. They understood that behavior is produced not just by minds but by milieus; not only by individual traits but by social expectations, institutional routines, physical environments, and cultural frames.

They would have asked: What was the situation? What were the constraints? What was the child’s developmental stage? What stresses shaped the parents’ responses? Why do modern societies interpret certain behaviors this way?

Those are the questions we increasingly fail to ask.

The Classroom Mirror

I see this reflex every semester. Many students arrive with formal diagnoses—ADHD, social anxiety, depression, autism spectrum traits—and often understand these labels as central to their identity.

I don’t doubt these conditions are real for many. But far more often than we acknowledge, their struggles stem less from an intrinsic disorder than from a structural mismatch between who they are and the environments we place them in.

Large lecture halls; nonstop digital distraction; relentless assessment; pressure to perform perfectly; overcrowded advising systems; erosion of in-person community; feeling constantly watched and perpetually behind—these aren’t symptoms of personal pathology. They’re central to how colleges are currently designed. They generate anxiety, cognitive overload, disconnection, and inadequacy in perfectly healthy young adults.

Yet in a culture where we no longer know how to talk about situational or structural problems, students understandably look inward. What earlier generations might have described as exhaustion, loneliness, discouragement, confusion, or developmental turbulence is now interpreted as a disorder to be treated.

We diagnose individuals when the real problem lies in the systems, structures, and expectations surrounding them. Classical social theorists understood something we’ve forgotten—that human beings cannot be separated from the worlds they inhabit, and what looks like personal failure is often the predictable result of social arrangements, institutional pressures, and cultural transformations.

Many problems we treat as individual psychology are, in fact, social. What feels personal is often produced by institutions, expectations, and culture.

The Lost Questions

There’s a paradox at the heart of contemporary social analysis. We have more data than ever—surveys tracking happiness, studies measuring loneliness, algorithms predicting behavior, and neuroscience mapping the brain. We can quantify anxiety rates, document declining social trust, and measure screen time to the second.

Yet for all this empirical precision, we seem less able than earlier generations to explain why wealthy, free, technologically advanced societies produce so much unhappiness, alienation, and despair.

Classical social thinkers—from roughly the 1880s through the 1950s—understood something we’ve forgotten. They grasped that modernity wasn’t simply adding new goods (wealth, freedom, and technology) to human life while leaving fundamentals unchanged. It was dissolving the very frameworks, rituals, and structures that had given life meaning, connection, and purpose.

Modernity was a package deal, and the price of its benefits was the loss of much that made life livable.

Contemporary social science has largely abandoned this tragic sensibility. We analyze discrete variables—income inequality, screen time, political polarization—without attending to deeper structural transformations that generate these symptoms.

We prescribe technical fixes—better mental health services, regulated social media, and reformed institutions—without recognizing that problems run deeper than any policy intervention can reach.

The classical thinkers knew better. They understood that modernity’s discontents weren’t bugs to be fixed but features of the system itself.

What the Classics Saw

A core insight runs through the writings of Weber, Durkheim, Simmel, Tönnies, Polanyi, and others: modern life systematically dissolves the dense webs of meaning, obligation, and continuity that structured pre-modern existence. This dissolution wasn’t avoidable—it was the necessary condition for everything modernity promised.

Tönnies on the Shift from Community to Society

Ferdinand Tönnies’s distinction between Gemeinschaft (community) and Gesellschaft (society) captures what changed. Gemeinschaft described life organized around kinship, locality, tradition, and unreflective bonds that made people part of something larger than themselves. You didn’t choose your village, extended family, place in the social order, or obligations to neighbors. These were given, woven into existence’s fabric.

Gesellschaft described modern life organized around contract, choice, rational calculation, and instrumental relationships. You choose your career, residence, and associations. Relationships are voluntary, revocable, and organized around mutual benefit rather than organic solidarity. This brought enormous gains in freedom and opportunity. But it also meant nothing was given, everything was optional, all relationships were contingent rather than fixed.

The real loss wasn’t some sentimental yearning for village life. It was the disappearance of what Robert Nisbet called “intermediate institutions”—the extended families, congregations, civil associations, unions, and community networks that once connected individuals to one another and gave daily life structure, support, and meaning.

Church, guild, neighborhood, extended family, and craft tradition weren’t just social organizations but ontological anchors. They provided identity, purpose, standards of excellence, and narratives connecting past to future. When they dissolved or became voluntary lifestyle choices rather than unchosen obligations, something irreplaceable was lost.

Durkheim on Anomie

Émile Durkheim argued that people need moral frameworks—not in the sense of strict rules or puritanism, but shared expectations that help us decide what goals are reasonable and what counts as “enough.” Without those external standards, our desires have no limits; we keep wanting more without knowing why or to what end.

This breakdown of guiding norms is what Durkheim meant by anomie. It’s not just chaos or “normlessness.” It’s the collapse of the social structures that tell us how to measure success, how to live a meaningful life, and where to direct our ambitions. When those frameworks erode, people feel unmoored—driven by endless wants but with no sense of direction or satisfaction.

In the pre-modern world, Durkheim argued, people lived inside thick webs of meaning that helped them understand who they were, what counted as a good life, and when enough was enough. These frameworks came from many places: religious teachings about one’s duties, craft traditions that defined good work, sumptuary rules that kept status competition in check, seasonal rhythms that shaped time, and life-cycle rituals that marked major transitions.

These systems could certainly be restrictive, but most people experienced them as simply the way life worked—structures that offered direction, limits, and shared expectations.

Modernity dismantled many of these frameworks in the name of individual freedom and social mobility. Suddenly, people could aspire to anything and reinvent themselves entirely. But with old limits gone, desires multiplied. If you can always become more, achieve more, accumulate more, how do you ever know when you’ve done enough? What tells you that you are successful, secure, or “on track”?

The result wasn’t pure liberation. It was a new kind of burden: wanting without an obvious endpoint, striving without clear measures, comparing yourself endlessly to others with no shared standard to anchor the process.

This helps explain why so many people today feel anxious despite rising living standards. Wealth can meet basic needs, but it also fuels comparison—and modern life has stripped away many of the boundaries that once contained those comparisons. In achievement-driven cultures, where people set their own goals and judge themselves against constantly shifting internal standards, nothing ever feels sufficient.

Weber’s Iron Cage

Max Weber’s concept of rationalization captured another major shift in modern life: institutions stopped being guided by tradition, shared judgment, or moral purpose and instead became organized around efficiency, calculation, and technical control.

Decisions that once involved human judgment increasingly followed rules, metrics, and procedures. This made institutions more predictable and effective—but also more rigid and impersonal.

Modern life came to be shaped by what Weber called instrumental rationality: finding the most efficient means to a given end. Bureaucracies, markets, legal systems, and scientific institutions operate this way. The result was extraordinary productivity and administrative capacity. But it also stripped institutions of meaning and moral depth.

Weber called this disenchantment. The world no longer appeared as a moral or spiritual order. It became a set of problems to manage, resources to optimize, and processes to streamline.

His metaphor of the iron cage captured the paradox: we built rational systems to serve human needs, but those systems now constrain us. Bureaucratic procedures, market incentives, and technological imperatives keep operating even when they undermine human flourishing. Individuals become replaceable “human resources,” valued for their functions rather than their purposes.

Simmel on Metropolitan Life

Georg Simmel’s 1903 essay “The Metropolis and Mental Life” reads uncannily like a diagnosis of smartphone culture. Simmel argued that modern city life bombards people with constant sensory and social stimuli. To cope, the urban mind develops a protective numbness—a “blasé attitude”—marked by detachment, indifference, and a shrinking capacity to feel surprise or deep emotion.

Urbanites, he wrote, become more calculating because their social world is crowded with brief, superficial interactions. When you have to navigate countless encounters each day, you evaluate people quickly, in instrumental terms. The result is a thinning of relationships: less depth, less intimacy, fewer truly authentic exchanges. The emotional and cognitive energy required for rich connection is already spent fending off overstimulation.

If you swap “metropolis” for “social media,” Simmel’s analysis becomes even more resonant. The endless feed, the pressure to maintain hundreds of shallow ties, the constant performance of the self, the transformation of attention and emotion into metrics—these conditions supercharge the very defenses Simmel described. We become numb to protect ourselves, then wonder why so little feels meaningful anymore.

Polanyi’s Great Transformation

Karl Polanyi’s The Great Transformation (1944) argued that the 19th century’s most radical innovation wasn’t the market—markets had existed for millennia—but the idea of a market society, where land, labor, and money themselves became commodities. This meant pulling these “fictitious commodities” out of the social relationships that once governed them and treating them instead as items to be priced, traded, and regulated entirely by the market.

The result dissolved an older emphasis on reciprocity and the notion of a moral economy. Labor became a commodity to be bought and sold rather than a social relationship with obligations on both sides. Land became real estate to be traded rather than patrimony connecting generations. Social relationships became transactions rather than obligations. This created enormous wealth and flexibility. It also destroyed the social fabric that had made life meaningful.

Polanyi’s key insight was that markets must be politically created and enforced. The “free market” required aggressive state intervention to break up common lands, abolish traditional rights, force people into wage labor, and override local customs limiting commodification. And once created, markets generated such social upheaval that societies repeatedly tried to protect themselves through counter-movements: labor unions, social insurance, land reform, and financial regulation.

Contemporary debates about the gig economy, social safety nets, and the commodification of previously non-market domains (education, healthcare, relationships) still work through Polanyi’s problematic. We keep discovering that some things don’t work well as pure commodities—they need embedding in social relationships and moral frameworks. But market society’s logic keeps pushing toward total commodification.

The Anthropological View

Classical anthropologists—Malinowski, Benedict, LĂ©vi-Strauss—understood that pre-modern societies weren’t simply primitive versions of modern ones, but operated according to different logics. They were organized around ritual, symbol, myth, and kinship rather than instrumental rationality and individual choice.

Rituals weren’t quaint customs but mechanisms for managing life’s fundamental transitions and uncertainties. Birth, maturity, marriage, death—each required ritual marking to integrate individual experience into collective meaning. Seasonal cycles, agricultural rhythms, and religious calendars organized time as qualitatively different moments rather than homogeneous units to be optimized.

Modernity systematically dissolved these meaning-making structures. We still have transitions, but we lack rituals adequate to mark them. We have time, but it’s homogeneous—Monday differs from Sunday only in what we’re scheduled to do. We have choices, but we lack the frameworks that once made choices meaningful rather than arbitrary.

Selfhood as Social

George Herbert Mead, Charles Cooley, and Erving Goffman understood that selfhood isn’t individual but social—it emerges from interaction, from taking on roles, from seeing ourselves through others’ eyes. The self is fundamentally dialogical, constituted through relationships rather than prior to them.

This matters because modernity’s hyperindividualism misunderstands how selfhood actually works. We imagine autonomous individuals choosing identities from an infinite menu. But selves require stable social mirrors—enduring relationships and communities that reflect us back to ourselves consistently over time. When social life becomes fluid, optional, and temporary, selfhood itself becomes unstable and fragmented.

Goffman argued that everyday life works much like a stage. We are all performers who must read cues, manage impressions, maintain face, negotiate interactions, and avoid embarrassment. And this requires constant emotional and cognitive effort.

However, this work becomes exponentially harder when social roles are unclear, when we move among many different audiences (family, coworkers, online strangers), and when norms shift rapidly.

No wonder anxiety is epidemic. We’re constantly performing for audiences whose expectations we can’t know, managing impressions across incompatible contexts, lacking the stable roles that once made social interaction navigable.

Even though thinkers like Durkheim, Weber, Simmel, Polanyi, and Goffman sometimes overstated the contrast between “traditional” and “modern” life, their core insights remain indispensable. They identified pressures built into modern society—pressures we still feel every day.

Why We Forgot

If these thinkers diagnosed our condition so accurately, why did their insights fade from view?

1. Disciplinary tunnel vision: The classic theorists read widely—history, philosophy, psychology, anthropology—and tried to make sense of society as a whole. Today’s social sciences reward narrow specialization. We have far fewer attempts to pull the pieces together into a coherent picture of how modern life works.

2. The dominance of individual-based explanations: Much contemporary research, especially in economics and psychology, explains social problems as the sum of individual choices. That approach misses what the classics understood: that social structures—institutions, norms, incentives—shape what individuals can see, desire, or do. You can’t explain burnout, loneliness, or inequality only by analyzing individuals.

3. Faith in technical fixes: Durkheim and Weber believed modernity involved tragic tradeoffs: more freedom but less stability, more efficiency but less meaning. It’s easier to believe that social problems just need better policy, better design, better apps. The classics remind us that some tensions aren’t solvable; they’re intrinsic parts of the modern condition.

4. The retreat from big-picture thinking: After the 1960s, large theoretical systems fell out of fashion—often for good reasons. But the pendulum swung too far. We became wary of ambitious accounts of how society works. The result: many brilliant micro-studies but fewer frameworks to make sense of the whole.

What We Might Relearn

Returning to classical social theory is about recovering a way of thinking that contemporary social science has largely abandoned: structural, historical, synthetic, and attuned to modern life’s trade-offs and tragic dimensions.

We need to follow their example and:

Understand problems as structural, not individual: The therapeutic turn treats unhappiness, anxiety, and alienation as individual psychological problems requiring individual solutions—therapy, medication, mindfulness. The classics understood these as social problems rooted in structural transformations. When Durkheim analyzed suicide, he showed it had social rates that varied systematically. Suicide was individual, but its causes were social. Similarly today: anxiety and depression have individual manifestations, but their epidemic proportions reflect structural conditions.

Recognize trade-offs: The classics saw that you couldn’t have individualism without anomie, rationalization without disenchantment, urban sophistication without blasĂ© indifference. Contemporary discourse often assumes we can have everything—complete individual freedom and strong communities, endless innovation and cultural continuity. The classics suggest we can’t.

Recover a sense of history: The classic thinkers understood something we often forget: modern life is not just “human nature with gadgets.” It’s the result of specific historical changes that dissolved older ways of organizing family life, work, religion, politics, and even the self.

Attend to what can’t be quantified: The classics understood that the most important social realities—meaning, purpose, moral order, authentic community—resist quantification. This doesn’t mean they’re not real, just that they can’t be captured by the metrics contemporary social science favors.

Think about institutions as meaning-making structures: Modern social science often analyzes institutions in narrowly functional terms—schools educate, markets allocate, courts resolve disputes. The classic social theorists saw something deeper: institutions don’t just serve individuals; they form them. They shape our expectations, our aspirations, and even our sense of who we are. They teach us what to value, how to behave, and what kinds of lives are possible.

Making Sense of Our Moment

The classical social thinkers help explain phenomena contemporary frameworks struggle with:

Why Wealth Doesn’t Bring Happiness: Economics assumes that more resources mean more satisfaction. But the classic thinkers saw something different: when moral limits collapse and wants become endless, no amount of wealth brings peace.

Why Freedom Feels Like a Burden: We tend to imagine freedom as pure gain—more choice, more autonomy, more control. The classics remind us that freedom without structure is exhausting. When every commitment is optional, when identities must be invented rather than inherited, and when nothing outside us provides guidance, choice stops feeling liberating and starts feeling overwhelming.

Why Community Keeps Falling Apart: Modern policies try to “build community” through programs, initiatives, and apps. The classics understood that real community doesn’t come from design. It comes from shared obligations, common rituals, unchosen relationships, and continuity over time.

Why Technology Makes Things Worse, Not Better: We keep expecting technology to fix loneliness or rebuild connection. But when technology is built on market incentives and the logic of efficiency, it amplifies the very problems we hope it will solve.

Why Institutions Keep Failing Us: Everywhere we look, institutions feel brittle, ineffective, or hollow. Our reflex is to demand better rules, stronger incentives, more oversight. But the classics point to a deeper issue: institutions designed mainly for efficiency and productivity can’t also provide identity, purpose, or belonging.

Living in Modernity’s Ruins

The classical social theorists don’t give us easy fixes because they knew that none exist. They understood that we cannot slip back into pre-modern forms of community, cannot simply unwind the rationalization that organizes modern life, and cannot restore the thick, taken-for-granted social structures that modernity dissolved.

But what they can give us is clarity: clarity about what has been lost, about why our deepest problems endure despite extraordinary technical progress, and about which tensions are woven into the very fabric of modern life rather than amenable to policy tinkering or therapeutic intervention.

This might seem pessimistic, but there is a kind of liberation in it. If we stop expecting technical fixes to repair what are really cultural contradictions, we may finally learn to cultivate more realistic expectations—and more sustainable forms of flourishing.

And this is where a different kind of hope enters. While we cannot reenchant the world by wishing away modernity’s disenchantment, we can reenchant it through the things that only human beings can make: through art and music, through literature and ritual, through acts of creativity and meaning-making, through humanistic inquiry that deepens understanding, through scientific investigation that expands wonder, and through social scientific insight that clarifies the forces shaping our lives.

These are not substitutes for the old frameworks; they are the means of creating new ones.

The classical social thinkers help us see our moment with uncommon clarity because they stood close enough to modernity’s birth to witness both what was gained and what was lost. They watched the great transformation unfold and grasped its full scope in ways that are hard for us, living inside it, to perceive.

Recapturing their wisdom will require us to recover their tragic sensibility, their structural understanding, and their recognition that modernity’s benefits and costs come bound together.

We are richer, freer, healthier, and longer-lived than any previous generation. We are also more anxious, more isolated, more unmoored, and less certain of what makes life meaningful. The classics saw that these aren’t contradictions but two sides of the same coin.

Understanding this won’t magically make us happy. But it might help us confront our condition honestly—and perhaps learn to reenchant a disenchanted world in the only ways that remain open to us: through imagination, creativity, inquiry, and the hard-earned clarity of seeing things as they really are.

Steven Mintz

Professor of History, The University of Texas at Austin