
Reality Made Me Do It

Martha Bayles


A mass murderer uses a body camera to record the panic and death of his victims as seen through his own eyes, then posts the result online, where it goes viral as a thrilling upgrade on the first-person-shooter type of video game.

A burly self-described diaosi (“loser”) in China turns himself into a live-streaming superstar called Big Li, and wins the “love” of several million followers and riches in the form of “tips” from wealthy fans—until he loses it all in the final round of a “tournament” and becomes chronically depressed.

A dysfunctional mother and her thirteen-year-old daughter appear on a reality talk show, and when the daughter behaves outrageously toward the host and audience, she gains overnight celebrity and is rewarded with a stay at a therapeutic “ranch” for troubled teens (paid for by the host), a lucrative contract with a rap record label, a chance to start her own cosmetics business, and, eventually, her own reality show.

The current president of the United States dispenses with most of the two-way channels of communication used by his predecessors, including press conferences, adversarial interviews, and meetings with the leaders of Congress and various federal agencies, and opts for the one-way bullhorn of a nonstop Twitter feed.

These are just a few dramatic instances of what has become the dominant mode of expression in the digital age: the attention-seeking self-portrait displayed on social media. These self-portraits come in two modes: the upbeat, polished self-advertisement and the downbeat, abrasive self-exposure. The former has its uses, just as advertising does. But it also brings unbearable pressure, especially on children and adolescents, to force one’s unformed, immature, imperfect self into a painfully artificial mold of perfect happiness and success.

The latter is an understandable response to that pressure—to push back with a complete, often darkly comic, revelation of one’s every misery, failure, and flaw, because these are at least real. This dynamic has led Instagram and other social media platforms to offer two different accounts: one for the image of perfection users feel compelled to project publicly, and another for all the imperfections users want to share with close friends and others whom they trust. In an ironic twist, the first type of account is called “real” and the second “fake.” According to one digital native I know, the terms are reversed as a form of protective coloration, because “the real account is obviously the ‘fake’ account, and the fake account is obviously the ‘real’ account.”

This curious dynamic did not start with social media, because social media grew out of a cultural milieu already marked by a decline in decency, propriety, and civility. The causes of that decline can be traced back as far as one likes, but suffice it to say they include the political and social upheavals of the 1960s and 1970s and a half century of dramatic technological, economic, and regulatory change in American broadcast media, especially television.

Around the turn of the millennium, these causes converged in a particular genre of TV entertainment, the reality show, which quickly became a leading incubator of negative self-exposure. It is impossible to prove a counterfactual, but without reality TV, it seems unlikely that so many people would equate “being real” and “telling it like it is” with spilling ugly secrets, flaunting rank egotism, attacking personal morality and social norms, and exuding contempt for the opinions and sensibilities of others. This cultural turn is dismaying enough, but as this kind of behavior comes to define what is honest, authentic, and true, it becomes more difficult for free and democratic societies to push back against the looming threat of a full-fledged surveillance state, a digital Panopticon.

Not Your Grandfather’s Media

The American system of electronic media is unique in many ways, but perhaps the most salient is its tradition of private ownership. Unlike their counterparts in every other nation, the American broadcast media have never been owned or operated by the state. Yet contrary to common belief, this does not mean that the broadcast media have always enjoyed the same First Amendment protection as newspapers and other print media. On the contrary, concerns about their greater reach and power have typically led to government controls.

In the 1920s, when radio first appeared, it was a private-sector free-for-all, not unlike the Internet fifteen years ago. But by 1933 the courts had defined the broadcast frequency spectrum as a scarce public good, allowing Congress to establish a new agency, the Federal Communications Commission (FCC), to grant licenses only to those broadcasters willing to abide by its regulations—including content rules prohibiting obscenity, indecency, and profanity. When television came along in the 1950s, it followed suit, because it was dominated by the same major networks that owned and operated the majority of radio stations.

Because the FCC could not engage in prior restraint, a form of censorship judged unconstitutional in 1931, the agency exerted control by responding to complaints received after prohibited material had aired. Those responses included fining the offending stations or on occasion pulling their licenses. To fend off such actions, the networks set up departments of “standards and practices” to monitor their content. But this system has applied only to content on free-to-air broadcasting, not to content on cable, satellite, and online services to which listeners and viewers subscribe.

The last fifty years have also seen a push toward greater freedom in all aspects of media, coming from two distinct partisan camps: one from the liberationist left, another from the libertarian right. Indeed, while disagreeing on nearly everything else, these polarized camps have effectively conspired to remove every constraint on radio and television, from content to licensing to ownership.

The left made the first move. In the early 1970s, the popular comedian George Carlin took aim at the FCC’s decency standards with his famous “seven dirty words” monologue. In 1973, a complaint was brought before the FCC by a father who, while driving with his young son, had heard Carlin’s monologue on WBAI, a Pacifica Foundation radio station in New York City. The case was only partly resolved in 1978, when the Supreme Court upheld the FCC’s right to prohibit profanity but left open the question of whether this included the “occasional” or “fleeting” use of an “expletive.”

While the left lost that particular battle, it won the war. It is still illegal to use profanity on NBC, CBS, ABC, Fox, or PBS. But such limited probity has hardly held back the forces challenging restraint. This is partly because the moment Ronald Reagan was elected president, the right took aim at the FCC’s rules on licensing and ownership.

Regarding licensing, the FCC under President Reagan made the process far less demanding. The old system required a broadcaster seeking a license to fill out a lengthy form and submit evidence that it was not only upholding the content rules but also providing enough news and information to “serve community needs”; the new system required, in some cases, merely filling out a postcard. The justification for this and other changes was that, given the growing number of broadcast and subscription outlets, the best way to serve community needs was through competition.

And this might have happened if the FCC had continued to limit the number of media outlets a single company could own in a given market. But it did the opposite. Beginning in the 1980s and at an accelerating pace amid the free-market euphoria of the 1990s, the FCC relaxed ownership limits to the point of encouraging near monopolies. From a long-accepted limit of one AM and one FM station per market, the Telecommunications Act of 1996 allowed a single broadcaster to own eight stations per major market and five per smaller market, with no cap on the total nationwide.

It seems contradictory to argue that free-market competition will serve community needs, then allow the nation’s media markets to be dominated by a few giant companies. But that is what the FCC did over three successive administrations, two Republican and one Democratic. And Americans on both sides of the culture war looked through their rose-colored glasses and predicted greater freedom, diversity, and excellence. No one seemed to notice the contradiction, because in the 1980s and 1990s, any effort to free media from the heavy hand of government was politically popular.

This is not to suggest that there was no competition throughout this period. There was plenty. But in an age of galloping consolidation, there was less competition between companies than between technologies. And since the traditional controls on content were increasingly confined to the older generation of media, that competition quickly became a race to the bottom.

From Scripted to Unscripted

The major TV networks began losing audience share to cable channels and the videocassette recorder during the 1970s. The extent of that decline was confirmed by new audience-measurement techniques adopted in the early 1980s. Advertisers began to pull back, and the production studios found themselves running huge deficits for their “scripted” programs: up to $100,000 per episode of a half-hour sitcom, $300,000 per episode of an hourlong drama. To survive, the studios hired nonunion crews and filmed in less expensive locations. But when these measures failed to cover costs, the key to short-term success became “unscripted” fare that was cheaper to produce and easier to adjust to the momentary whims of a distracted viewership.

The first unscripted success was the daytime talk show. The format dates to the 1930s, when radio networks featured highbrow interviews with authors, scholars, and public officials. The same was true of state-owned networks overseas, the BBC being a prime example. When television arrived in the 1950s, the talk shows on America’s commercial channels became more middlebrow, with late-night hosts such as Jack Paar and Johnny Carson combining serious interviews with live performances and celebrity chat. Not until the 1970s did the talk show have a strong national presence on daytime television, previously the domain of the soap opera. As noted above, this change was driven less by viewer preference than by the need to cut costs.

Taken nationwide in 1970, Phil Donahue’s syndicated show appealed directly to soap opera fans by focusing on the concerns of a largely female audience. Donahue’s big innovation—stepping off the stage and circulating among the audience to solicit opinions from individual fans—had long been a staple of revivalist preaching. But it worked no less well in this secular context. In 1986 the talk show was taken to a new level by another syndicated host, Oprah Winfrey. In those early days Winfrey assumed the role of the nineteenth-century “sob sister” reporter, offering noncelebrity guests a chance to share their sad, traumatic, sometimes sensationalistic tales—a formula that quickly pushed The Oprah Winfrey Show past Donahue in the ratings.

But the 1990s brought a different tone, with “trash TV” hosts such as Ricki Lake, Jenny Jones, Geraldo Rivera, Maury Povich, and Jerry Springer competing for guests whose lives were so blighted by poverty, ignorance, and every conceivable human failing, from incest to obesity, pedophilia to bestiality, that they were easily prodded into shedding what little dignity they still possessed. For a while, Winfrey’s show drifted in the same direction, but in the mid-1990s she announced that she was “tired of the crud” and vowed to set a better example with “Change Your Life TV.”

The 1990s also brought a more capacious genre of unscripted entertainment: the reality show. Here, too, there were precedents—Queen for a Day, What’s My Line?, Candid Camera—going back to a time when there were clear standards regarding the exposure of ordinary people’s lives on television. The same standards applied to the first generation of network-produced reality shows: America’s Funniest Home Videos (ABC), Rescue 911 (CBS), Cops (Fox), and the runaway hit Survivor (CBS). These network programs were amusing, sometimes embarrassing, but rarely degrading.

The degradation started when the cable channels got into the act. Their content unregulated by the FCC, those channels opened the floodgates, with MTV leading the way. Its popular series The Real World focused on the ups and downs of a group of young people living together. At first the show was faulted for its juvenile tone but praised for encouraging tolerance of racial and political differences. But then it went downhill. During the 2010 season, one participant accused another of stealing his toothbrush and using it to scrub the toilet bowl. The police were called, and when one officer was asked whether he thought the incident had actually occurred, he replied, “This is a reality show, so who’s to say?”

MTV followed this act with Girls Gone Wild, a show consisting almost entirely of video clips of young women baring their breasts and/or simulating sex acts for the camera. Taking up the challenge to the best of their content-restricted ability, the networks responded with cleaner but equally amoral fare such as Temptation Island (Fox), in which attractive singles tried to lure married individuals into committing adultery. Two rival shows, Bachelor Pad (ABC) and Joe Millionaire (Fox), urged female contestants to humiliate themselves in the hope of catching a wealthy husband.

One of the more successful network shows is Big Brother (CBS), which recently began its twentieth season. Assembling a group of contestants in a house filled with surveillance cameras and cut off from the outside world, the program invites the audience to play the voyeur while the group votes every week to evict some housemates and advance others. The audience also votes, via text message, until the last housemate standing wins the grand prize of $500,000. Cable has followed suit with “docu-soaps” surveilling known celebrities like rock singer Ozzy Osbourne (The Osbournes, MTV) and creating new ones in shows like Keeping Up with the Kardashians (E!), Real Housewives of Orange County (Bravo), and Jersey Shore (MTV).

In all of these trash-based programs, the road to fame is paved with eager, over-the-top displays of materialism, vulgarity, and selfishness. One measure of the desperate exhibitionism involved is the willingness of the participants to sign contracts in which they agree not to sue the producers for subjecting them to “public ridicule, humiliation or condemnation.” Some programs require participants to forgo legal action even if the producers air “factual and/or fictional” information about them that is “personal, private, intimate, surprising, defamatory, disparaging, embarrassing or unfavorable.”

The current occupant of the White House is frequently described as a businessman or real-estate tycoon, but a more accurate description is “trash-TV host.” For the eleven years leading up to his election to the presidency, Donald Trump was immersed in reality television. His original program, The Apprentice, began as a contest between would-be entrepreneurs for a salaried position in his company. But as each episode concluded with the contestants assembled in the “boardroom” to be judged by Trump and one or two of his offspring, it quickly devolved into trash TV, with Trump pitting the contestants against each other and crushing the losers with his signature line, “You’re fired!” The nastier the behavior in the boardroom, the greater Trump’s evident pleasure. As noted in a review of the show on the Internet Movie Database (IMDb), the contestants were “expected to do everything to keep them on the show (that means lying, trash-talking, backstabbing, etc.)…. In earlier seasons, at least some of the contestants had a bit of integrity. Now it seems like the contestants would kill their own mother.… I can’t see why anyone with common sense would want to work for [Trump]. He just likes to trash people.”

In mass entertainment media, trashing people is bad taste. In the media that serve as America’s political forum, it is something far worse: a rupture of the barrier between civil discourse and cultural degradation. There’s an old adage about a vat of wine standing next to a vat of sewage. Add a cup of wine to the sewage, and it is still sewage. Add a cup of sewage to the wine, and it becomes sewage.

“Good and bad coin cannot circulate together,” wrote the sixteenth-century English financier Thomas Gresham. His meaning, later formulated as Gresham’s Law, was that when a currency combines coins made of silver or gold and coins made of baser metals, the bad will soon drive out the good. Something similar happened in the 1990s and 2000s, when the new technology of satellite television and the opening of new post–Cold War markets led to a surge in the export of American entertainment products to the rest of the world. A few of those exports held real value for people; many more did not.

Among the valuable exports was the talent-based reality show, in which a group of would-be singers, dancers, chefs, or other creative amateurs compete for the approval of a panel of judges, as well as votes from the viewing audience. This type of show can be rough on the losers, but it is not degrading, because win or lose, the participants are competing on the basis of natural gifts, hard work, and skill. In America’s unregulated market, the good coin of these shows tends to be driven out by the bad coin of the trash-based kind. The same thing happens in Russia and China, but not because those markets are unregulated. On the contrary, it happens because both of these authoritarian regimes recognize the power of reality TV to shape not just the subjective reality of the viewer but also the political reality of the country. In Russia, the cynicism fostered by trash- based reality TV has proved useful to the Putin regime. In China, the aspiration fueled by the talent-based alternative has been perceived as a threat.

To begin with Russia: Amid the freedom and chaos of the 1990s, an array of newly launched private-sector channels tried everything from responsible news and public affairs to outrageous programs modeled on Western, especially American, entertainment. But as Vladimir Putin began to consolidate power in the 2000s, one of his first moves was to encourage the production of trash-based reality shows. He did this for three reasons. The first was money: This type of lowest-common-denominator program attracts a large audience, which translates into advertising revenue. The second was image: Western TV shows were seen as cool and hip, and the last thing the new leader of Russia wanted was to come off as a dull, drab, Soviet-style official. The third reason is less obvious but more important: Trash-based reality TV has a deleterious effect on civil society. By encouraging ordinary people to behave crassly and selfishly for the cameras, and inviting the audience to laugh at them and feel superior, trash-based shows encourage cynicism, suspicion, and hostility—hardly the foundations of a truly civil society that supports hope, trust, and cooperation.

Modeled on Big Brother, the 2001 show Za steklom (Behind the Glass) placed six young contestants in an apartment equipped with twenty-six surveillance cameras. Unlike their Western counterparts, who generally kept their clothes on, the participants in Za steklom frequently stripped and engaged in sexual foreplay. When one couple finally had intercourse on camera, the newspaper Komsomolskaya Pravda—formerly the high-minded, heavily didactic official paper of the Communist Union of Youth—ran the headline “Max and Margo finally did it!” Subsequent series went beyond sex to screaming arguments and physical blows. For example, a 2011 show called Mama v zakone (Mother-in-Law) featured violent, vulgar wrestling bouts between two older women, one sixty-five years old and the other fifty-two, over disputes between their children, who were married to each other.

At best, civil society in early post-Soviet Russia was a fragile new growth, pushing up through the ruins of the old regime amid crime, corruption, and social disorder. It needed sunshine and nutrients, in the form of independent associations, social trust, and open political debate. But those delicate tendrils were instead treated with a toxic form of degrading entertainment on a mass scale. If you doubt the cynicism behind such programming, consider the words of Valery Komissarov, a Putin loyalist who created some of the worst reality shows of the 2000s: “When people ask me, ‘Why do you pick so many idiots?,’ I know that I have done my job correctly.”

The story is different in China, where the communist regime did not collapse but instead became the handmaiden of state-directed capitalism. While news and film in China have always been kept under tight wraps (recall Lenin’s remark that “of all the arts the most important for us is the cinema”), television until recently enjoyed rather wide latitude. According to a government official I interviewed in Beijing, station managers in the late 1990s and early 2000s were urged to make television as profitable as possible, and not to worry overmuch about the content, because TV entertainment was “a frivolous matter, just singing and dancing and reality shows.” But not all reality fare proved so anodyne.

In 2004, the regional channel Hunan Satellite TV created a singing-contest reality show called Super Girl, which proved such a hit that it was broadcast nationally the following year. The show sparked a national craze, complete with public demonstrations in favor of various contestants, which China Daily optimistically described as “a euphoria of voting that is a testament to a society opening up.” The season finale attracted 400 million viewers (the second-largest national audience in the history of Chinese television) and eight million text-message votes. When the winner proved to be a twenty-one-year-old music student from Sichuan named Li Yuchun, who defied convention by dressing in jeans and loose shirts and performing songs ordinarily sung by men, the Internet exploded with comments about this being “a glimpse of the future Chinese civil society.”

Even democracy activist and Nobel Peace Prize winner Liu Xiaobo weighed in, praising Super Girl for being “democratic and equal” and using a “selection system based upon the expert judges, the citizen judges and the viewers’ votes [that] contains the spirit of pragmatic politics.” Liu followed this comment with a caveat to the effect that he was probably “over-rating” a mere TV show in the spirit of “wishful thinking,” and that Super Girl was best seen as “a River of Forgetting that lets people release their dissatisfactions through the joy of entertainment.” But this caveat came too late. In the words of the official I interviewed, the Super Girl phenomenon “prompted an urgent debate in the higher circles of government.” Reluctant to cancel it outright, the authorities ordered paid Internet trolls and the state media to wage a propaganda campaign against “unhealthy” talent contests. Then Super Girl was canceled, just in time for the 2008 Beijing Olympics. In 2009 it was revived, but without the democratic trappings.

Slouching Toward the Digital Panopticon

In their introduction to an edited volume published in 2009, media researchers Susan Murray and Laurie Ouellette warned that reality TV might be teaching people to accept, even welcome, being spied upon:

More and more programs rely upon the willingness of “ordinary” people to live their lives in front of television cameras. We, as audience members, witness this openness to surveillance, normalize it, and, in turn, open ourselves up to such a possibility.… We are also encouraged to participate in self-surveillance. Part of what reality TV teaches us in the early years of the new millennium is that in order to be good citizens we must allow ourselves to be watched as we watch ourselves and those around us, and then modify our conduct and behavior accordingly. (emphasis added)
Today, these words conjure a nightmare vision of a future Panopticon that not only watches each of us but also crunches terabytes of information to produce a detailed profile for every one of us, which may then be deployed by the rich and powerful to control every move we make.

For the citizens of democracies, the rich and powerful include the tech giants: Google, Facebook, Amazon, and their many subsidiaries. As described by Shoshana Zuboff, a noted author on the socioeconomic consequences of the digital revolution, the huge stock of data accumulated by these titans of “surveillance capitalism” amounts to “a concentration of knowledge unlike anything ever seen in human history,” and confers upon its owners “the ability to actually shape and modify our behavior.” Most European and American critics frame this danger in terms of “privacy.” Zuboff’s language is stronger. She warns that this “new axis of social inequality” will diminish “our capacity to be autonomous and self-determining”—that is, it will stymie “human agency.”

In Xi Jinping’s China, the same danger is framed as a goal: “to allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step,” a highly placed Communist Party official tells The Economist. When fully operational, China’s much-vaunted “social credit system” will not only track the behaviors, actions, words, and opinions of 1.4 billion people, but also swiftly and efficiently reward or punish those deemed correct or incorrect.

Is the whole world slouching toward a Panopticon of digitally enabled surveillance and control? In the United States and other liberal democracies, the technical and organizational infrastructure for such a regime is not yet in place. Instead of one all-seeing Big Brother watching billions of prisoners in their transparent cells, we have hundreds of millions of Internet users watching, and performing for, each other. But as we are beginning to realize, our performances are also being watched by the tech giants, eager to profit from the terabytes of data we provide—and, if the critics of surveillance capitalism are right, to control our choices and ultimately our thoughts.

These tech giants are also intensely competitive, and to judge by their behavior in China, some of their leaders would gladly sell their democratic birthright for a chance to gain a monopoly in a major market. If democratic governments do not resist this tendency but rather decide that their own power needs shoring up through the digital magic provided by a favored corporate ally, then a variation on the Panopticon might well be built. At that point, all it would take to turn on the switch would be a broad and deep consensus about the type of human souls we want the system to engineer. And at that dismal point, humanity’s only hope might be the lack of such a consensus.

Read in the Hedgehog Review.
