How Our Individualism Has Trapped Us In A Welfare State

America’s enchantment with individualism is so thoroughly ingrained that it has become almost invisible, except in our massive, socialistic welfare state.
By Heather Smith

At what point does a society cease to be a society? Somewhere there is a line between an organized group of people sharing a common culture and a group of disconnected people living in geographical proximity to one another. Whether American society has crossed that line of dissolution is debatable, but we are at the least heading intently toward that oxymoronic state of existence: individualized society.

We are obsessed with the idea of how connected we are, but social media is no substitute for real society, and our thousand electronic connections only dazzle our minds enough that we cannot recognize how paltry our actual human connections have become. The process of individualization that began centuries ago is now increasing at an exponential rate, fueled partly through technological realities, but even more so through the weighty accumulation of ideological shifts.

Cultural individualization has obviously contributed to our many dismembered families, but it has also cornered us into a welfare state mentality from which we cannot escape unless we replace our concept of a society of individuals with something more ordered and interconnected.

A Brief History of Individualism

In my ninth-grade government and economics course, I memorized the definition that “the family is the basic unit of society.” Even 20 years ago, the reality of that definition was tenuous. Family as a social unit in America has been measurably in decline for more than half a century. It is not the deterioration of the family, though, that has led to the concept of the individual as the basic social unit, but vice versa.

Throughout millennia of human history, societies were structured from the family upward. Exile, which hardly seems a threatening penalty today, was the epitome of punishment in antiquity because it cut the individual off from family and thus from society, including all means of financial and physical well-being. The much-maligned feudal system of the medieval period was not intended to keep power in the hands of a few, but rather to provide protection for all by ensuring each had a secure place within the social order.

With the exception of monastic life, outside of marriage and family there was no way to gain the material items necessary for daily existence. At every stage of life, each individual was connected to family in some way—as a child, a spouse, or a parent. In fact, up until the eighteenth century or so, the family was not only the basic unit of society but more importantly its basic model. The state was the benevolent “parent” of all, deriving authority not by mutual agreement but by transcendent decree.

The ideological shift toward individualization was first clearly articulated during the Enlightenment. Distilling truth down to only what could be derived by reason, Enlightenment thinkers effectively eschewed all realities except the self. The philosophy of the social contract became the predominant political theory not only in Europe but also in the nascent United States, which declared its independence precisely on the basis of government being legitimate only through “the consent of the governed.”

The practical push toward individualism was driven by the Industrial Revolution. Centralized work in factories relied on people acting as individual workers, not members of a family. Machines equalized women and children with men as useful laborers, and rather than family providing security as the place physical needs could be met, it became a place of uncertainty where a man’s monetary income was increasingly strained to provide for his entire family unless he was willing to send some of them into the factories or into others’ homes to work as servants or tutors.

By the mid-nineteenth century, transcendentalists such as Emerson and Thoreau turned from rationalism but continued to extol the self-sufficiency of the individual. The twentieth and twenty-first centuries have dutifully followed the path they blazed, separating the individual from society, then family, and now even the self, as we question whether we have any inherent identity apart from our transitory desires and feelings.

Family as an Oddity, Not a Norm

These intentional and unintentional shifts have brought us to the twenty-first century with the assumption people should navigate life from the perspective of the individual, not the family. The preeminent philosophy of our day is the starry-eyed Disney moral “Be yourself!” What does this vacuous imperative even mean? Can you actually avoid being yourself?

If we assume dastardly forces such as family and societal responsibilities can sway you to be something other than yourself, how can you be sure you have filtered out all these pernicious influences to distill your true essence? Unless you are content to “be yourself” in isolation, you end up performing a precarious social balancing act in which individualism turns a blind eye to its own impossibility in order to become the basis of our cultural norms. We live alone together.

Parents do not want to “impose” their beliefs on their children, thereby harming their distinct personalities, so they raise their children without religion or discipline. Birth control and abortion become “health rights” because men and women want the pleasure of sex without any inconvenient consequences to restrict their individual plans. Retirement communities and nursing homes conveniently remove the dependent elderly so their affluent children and grandchildren can continue their individual lives without nuisance.

Perhaps most telling is the rise of singlehood as a way of life. More than a quarter of American dwellings now house only a single inhabitant. We have gone from assuming that the natural life course after adolescence is marriage (living alone being a concession for rare circumstances) to assuming just the opposite. Young people who marry right out of college are viewed with skepticism, and those who marry before completing college are assumed to be acting under the influence either of immature infatuation or oppressive religious compulsion. If a person does not live on his or her own for at least a few years of young adulthood, society furrows its brow in worried concern.

This expectation of independent living comes with promises of greater personal fulfillment, financial stability, and relational wisdom, all without denying the possibility of marriage—whenever it becomes convenient. But not excluding the possibility of marriage and planning a route that will culminate in marriage are entirely different undertakings, as many 20- and 30-somethings are discovering. Having blithely gone along the successful career path, they wake up one morning and realize in a panic that somewhere in that foggy land of independent life, they passed the point at which they should have found a spouse and married. For others, the comfortable inertia of singlehood makes marriage seem unappealing or unnecessary, and they prefer to continue hooking up, hanging out, or maybe buying a puppy together instead of a ring.

Valentine’s Day illuminates how American culture is not only becoming more individualized but proud of the shift. Whereas February movie offerings generally include at least some ostensibly romantic flick, the only remotely romance-themed movie playing in theaters this year was “How to Be Single.” Dairy Queen actually commissioned a survey on relationship statuses and partnered with a clinical psychologist before creating the Singles Blizzard as their 2016 Valentine’s treat to celebrate the singles who now outnumber their married counterparts and, presumably, will appreciate the shift toward a “more modern definition of the day that celebrates all kinds of love, including love of self.” If the push for a self-loving version of Valentine’s Day doesn’t pan out, we can all (individually) promote Singles Awareness Day, as social media helpfully reminded me.

Our Highly Individualized Society

Individualism affects not only the family and interpersonal relationships but the entire American culture. The force of technology in our broader societal individualization is, of course, on constant display. We take selfies and post them along with the fleeting emotional commentary of our lives on Facebook, where we are superficially connected to hundreds of people, most of whom we rarely speak to in person. We compose jibes of under 140 characters and tweet them out to a world we assume is anxiously waiting to hear our reactionary quips.

Or if we have more words and time for reflection, we compose earnest blog posts and send them into the leveling field of cyberspace where every author can always get published. We work from home, do the assignments for our online education at our own pace, and check in with a cyber church when we feel a personal need for religious stimulus. When we have to go shopping at a real store, we can mostly avoid human contact with the help of the self-checkout.

Beyond the palpable isolation of technology, however, America’s enchantment with individualism is so thoroughly ingrained that it has become almost invisible. Consider the prevalence of mandatory insurance. Although the recent mandating of health insurance has met resistance, few Americans question the legitimacy of states requiring auto insurance or lenders demanding homeowners’ insurance.

Yet such property insurance did not become widespread in America until the 1940s and ’50s, and only became extensively mandatory around the 1970s. While these mandates might seem necessary and innocuous measures to maintain a stable society, it is telling that they replace the responsibility of a community to care for its members (think Amish barn-raisings) with individual responsibility to recover from any potential disaster.

It is also not difficult to trace the high cost of health care in part to the rise of individualism. Until the eighteenth century, hospitals were almost exclusively run by religious orders as charities, and not until the early twentieth century did for-profit and government hospitals begin to proliferate. For centuries it had been assumed that the religious community should take care of its own and the destitute, but as individualism eroded the cohesion of church and family, health care became a personal concern, bolstered by the introduction of health insurance plans in the 1920s and the expectation of employer-paid health insurance that became established during World War II.

The Woeful Isolation of the Welfare State

Liberals and conservatives alike have fallen prey to the idolization of the individual. The Left lobbies for social compassion toward minorities, immigrants, the poor. The Right clamors for individual rights to gun ownership, property, free speech. Both hit some insightful notes with their impassioned pleas, but both have fundamentally missed the forest for the trees. Society is not made up of isolated individuals but of human beings in relation to one another.

The family and its extensions—church, school, workplace—used to be the means by which the underprivileged were served and essential rights preserved, but as society has fragmented into total individualization, these institutions have lost their cohesion, and social responsibility has ironically been pushed further away from individuals. On their own, the members of society lack the capacity to support or protect one another, and since they no longer form cohesive local groups, the only social institution left to shoulder the protection of the individual is the increasingly unwieldy government.

Even the proposed socialism of some leftists is only sham cohesion. Rather than bringing people together in real social connection, it merely pools their money so they can avoid actual human compassion. The onus of caring for the needy is removed to the state, to which each citizen contributes not as part of a family, church, or other social institution, but merely as an individual who can pay his dues to be relieved of the duty of loving his neighbor. Those on the receiving end of government assistance are even more isolated, since the financial handouts give them no human network of support, only a check with which they are expected to care for themselves independently.

Let’s Reconnect with Each Other

Living together in isolation is not a sustainable social model. So long as we continue to think of the individual as the basic unit of society, our progression toward the disenchanted welfare state will continue, even while no amount of socialized government intervention will provide the human cohesion we need.

Indeed, government is incapable of buttressing our crumbling human connections. That task must start with rebuilding individuals into families and families into society. Like every great undertaking, the process will be slow and require sacrifice, but the recompense will be not only a healthy and sustainable society, but also, paradoxically, a stronger sense of our individual identity as we reconnect with other human beings.

Heather Smith is an advocate of classical Lutheran education and holds a BA in Elementary Education and an MA in English. Her writing may also be found at www.sisterdaughtermotherwife.com.

Copyright © 2017 The Federalist, a wholly independent division of FDRLST Media, All Rights Reserved.