“Well, that’s a really important thing to investigate.” While Naomi Wolf’s intellectual side failed her last week, her public side did not. That first line was her measured response when a BBC interviewer pointed out — on live radio — that cursory research had disproven a major thesis in her new book, Outrages: Sex, Censorship, and the Criminalization of Love (she misinterpreted a Victorian legal term, “death recorded,” to mean execution — the term actually meant the person was pardoned). Hearing this go down, journalists like me theorized how we would react in similar circumstances (defenestration) and decried the lack of fact-checkers in publishing (fact: Authors often have to pay for their own). The mistake did, however, ironically, offer one corrective: It turned Wolf from cerebral superhero into mere mortal. No longer was she an otherworldly intellect who could suddenly complete her Ph.D. — abandoned at Oxford when she was a Rhodes Scholar in the mid-’80s, Outrages is a reworking of her second, successful, attempt — while juggling columns for outlets like The Guardian, a speaking circuit, an institute for ethical leadership, and her own site, DailyClout, not to mention a new marriage. Something had to give, and it was the Victorians.
Once, the public intellectual had the deserved reputation of a scholarly individual who steered the public discourse: I always think of Oscar Wilde, the perfect dinner wit who could riff on any subject on command and always had the presence of mind to come up with an immortal line like, “One can survive everything nowadays except death.” The public intellectual now has no time for dinner. Wolf, for instance, parlayed the success of her 1991 book The Beauty Myth into an intellectual career that has spanned three decades, multiple books, and a couple of political advisory jobs, in which time her supposed expertise has spread far beyond third-wave feminism. She has become a symbol of intellectual rigor that spans everything from vaginas to dictatorships — a sort of lifestyle brand for the brain. Other thought leaders like her include Jordan Peterson, Fareed Zakaria, and Jill Abramson. Their minds have hijacked the public trust, each one acting as the pinnacle of intellect, an individual example of brilliance to cut through all the dullness, before sacrificing the very rigor that put them there in order to maintain the illusion floated by the media, by them, even by us. The public intellectual once meant public action, a voice from the outside shifting the inside, but then it became personal, populated by self-serving insiders. The public intellectual thus became an extension — rather than an indictment — of the American Dream, the idea that one person, on their own, can achieve anything, including being the smartest person in the room as well as the richest.
* * *
I accuse the Age of Enlightenment of being indirectly responsible for 12 Rules for Life. The increasingly literate population of the 18th century was primed to live up to the era’s ultimate aspiration: an increasingly informed public. This was a time of debates, public lectures, and publications and fame for the academics behind them. Ralph Waldo Emerson, for one. In his celebrated “The American Scholar” speech from 1837, Emerson provided a framework for an American cultural identity — distinct from Europe’s — which was composed of a multifaceted intellect (the One Man theory). “The scholar is that man who must take up into himself all the ability of the time, all the contributions of the past, all the hopes of the future,” he said. “In yourself slumbers the whole of Reason; it is for you to know all, it is for you to dare all.” While Emerson argued that the intellectual was bound to action, the “public intellectual” really arrived at the end of the 19th century, when French novelist Émile Zola publicly accused the French military of antisemitism over the Dreyfus Affair in an open letter published in L’Aurore newspaper in 1898. With “J’Accuse…!,” the social commentary Zola spread through his naturalist novels was transformed into a direct appeal to the public: Observational wisdom became intellectual action. “I have but one passion: to enlighten those who have been kept in the dark, in the name of humanity which has suffered so much and is entitled to happiness,” he wrote. “My fiery protest is simply the cry of my very soul.”
The public intellectual thenceforth became the individual who used scholarship for social justice. But only briefly. After the Second World War, universities opened up to serve those who had served America, which led to a boost in educated citizens and a captive audience for philosophers and other scholars. By the end of the ’60s, television commanded our attention further with learned debates on The Dick Cavett Show — where autodidact James Baldwin famously dressed down Yale philosopher Paul Weiss — and Firing Line with William F. Buckley Jr. (also famously destroyed by Baldwin), which would go on to host academics like Camille Paglia in the ’90s. But Culture Trip editor Michael Barron dates the “splintering of televised American intellectualism” to a 1968 debate between Gore Vidal — “I want to make 200 million people change their minds,” the “writer-hero” once said — and Buckley, which devolved into playground insults. A decade later, the public intellectual reached its celebrity peak, with Susan Sontag introducing the branded brain in People magazine (“I’m a book junkie. … I buy special editions like other women shop for designer originals at Saks.”).
As television lost patience with Vidal’s verbose bravado, he was replaced with more telegenic — angrier, stupider, more right-wing — white men like Bill O’Reilly, who did not clarify nuance but blustered over the issues of the day; the public intellectual was now all public, no intellect. Which is to say, the celebrity pushed out the scholar, but it was on its way out anyway. By the ’80s, the communal philosophical and political conversations of the post-war era slunk back to the confines of academia, which became increasingly professionalized, specialized, and insular, producing experts with less general and public-facing knowledge. “Anyone who engages in public debate as a scholar is at risk of being labelled not a serious scholar, someone who is diverting their attention and resources away from research and publicly seeking personal aggrandizement,” one professor told University Affairs in 2014. “It discourages people from participating at a time when public issues are more complicated and ethically fraught, more requiring of diverse voices than ever before.” Diversity rarely got past the ivy, with the towering brilliance of trespassers like Baldwin and Zora Neale Hurston, among other marginalized writers, limited by their circumstances. “The white audience does not seek out black public intellectuals to challenge their worldview,” wrote Mychal Denzel Smith in Harper’s last year, “instead they are meant to serve as tour guides through a foreign experience that the white audience wishes to keep at a comfortable distance.”
Speaking of white audiences … here’s where I mention the intellectual dark web even though I would rather not. It’s the place — online, outside the academy, in pseudo-intellectual “free thought” mag Quillette — where reactionary “intellectuals” flash their advanced degrees while claiming their views are too edgy for the schools that graduated them. These are your Petersons, your Sam Harrises, your Ben Shapiros, the white (non)thinkers, usually men, tied in some vague way to academia, which they use to validate their anti-intellectualism while passing their feelings off as philosophy and, worse, as (mis)guides for the misguided. Last month, a hyped debate between psychology professor Peterson and philosopher Slavoj Žižek had the former spending his opening remarks stumbling around Marxism, having only just read The Communist Manifesto for the first time since high school. As Andray Domise wrote in Maclean’s, “The good professor hadn’t done his homework.” But neither have his fans.
But it’s not just the conservative public intellectuals who are slacking off. Earlier this year, Jill Abramson, the former executive editor of The New York Times, published Merchants of Truth: The Business of News and the Fight for Facts. She was the foremost mind on journalism in the Trump era for roughly two seconds before being accused of plagiarizing parts of her book. Her response revealed that the authorship wasn’t exactly hers alone — a fact that came to light only when she shifted the blame for her mistakes onto others. “I did have fact-checking, I did have assistants in research, and in some cases, the drafting of parts of the book,” she told NPR. “I certainly did spend money. But maybe it wasn’t enough.” Abramson’s explanation implied a tradition in which, if you are smart enough to be rich enough, you can pay to uphold your intellectual reputation, no matter how artificial it may be.
That certainly wasn’t the first time a public intellectual overrepresented their abilities. CNN host Fareed Zakaria, a specialist in foreign policy with a Ph.D. from Harvard — a marker of intelligence that can almost stand in for actual acumen these days — has been accused multiple times of plagiarism, despite “stripping down” his extensive workload (books, speeches, columns, tweets). Yet he continues to host his own show and to write a column for The Washington Post in the midst of a growing number of unemployed journalists and dwindling number of outlets. Which is part of the problem. “What happens in the media is the cult of personality,” said Charles R. Eisendrath, director of the Livingston Awards and Knight-Wallace Fellowship, in the Times. “As long as it’s cheaper to brand individual personalities than to build staff and bolster their brand, they will do it.” Which is why Wolf, and even Abramson, are unlikely to be gone for good.
To be honest, we want them around. Media output hasn’t contracted along with the industry, so it’s easier to follow an individual than a sprawling media site, just like it’s easier to consult a YouTube beauty influencer than it is to browse an entire Sephora. With public intellectuals concealing the amount of work required of them, the pressure to live up to the myth we are all helping to maintain only increases, since the rest of us have given up on trying to keep pace with these superstars. They think better than we ever could, so why should we bother? Except that, like the human beings they are, they’re cutting corners and making errors and no longer have room to think the way they did when they first got noticed. It takes significant strength of character in this economy of nonstop (and precarious) work to bow out, but Ta-Nehisi Coates did when he stepped down last year from his columnist gig at The Atlantic, where he had worked long before he started writing books and comics. “I became the public face of the magazine in many ways and I don’t really want to be that,” he told The Washington Post. “I want to be a writer. I’m not a symbol of what The Atlantic wants to do or whatever.”
* * *
Of course a public intellectual saw this coming. In a 1968 discussion between Norman Mailer and Marshall McLuhan on identity in the technology age (which explains the rise in STEM-based public intellectuals), the latter said, “When you give people too much information, they resort to pattern recognition.” The individuals who have since become symbols of thought — from the right (Christina Hoff Sommers) to the left (Roxane Gay) — are overrepresented in the media, contravening the original definition of their role as outsiders who spur public action against the insiders. In a capitalist system that promotes branded individualism at the expense of collective action, the public intellectual becomes a myth of impossible aspiration that not even the intellectuals themselves can live up to, which is the point — to keep selling a dream that is easier to buy than to engage in reality. But an increasingly intelligent public is gaining ground.
The “Public Intellectual” entry in Urban Dictionary defines it as, “A professor who spends too much time on Twitter,” citing Peterson as an example. Ha? The entry is by OrinKerr, who may or may not be (I am leaning toward the former) a legal scholar who writes for the conservative Volokh Conspiracy blog. His bad joke is facetious, but not entirely inaccurate — there’s a shift afoot, from the traditional individual public intellectual toward a collective model. That includes online activists and writers like Mikki Kendall, who regularly leads discussions about feminism and race on Twitter; Bill McKibben, who cofounded 350.org, an online community of climate change activists; and YouTubers like Natalie Wynn, whose ContraPoints video essays respond to real questions from alt-right men. In both models, complex thought does not reside solely with the individual, but engages the community. This is a reversion to one of the early definitions of public intellectualism by philosopher Antonio Gramsci. “The traditional and vulgarized type of the intellectual is given by the man of letters, the philosopher, the artist,” he wrote in his Prison Notebooks — first published in 1971. “The mode of being of the new intellectual can no longer consist in eloquence, which is an exterior and momentary mover of feelings and passions, but in active participation in practical life, as constructor, organizer, ‘permanent persuader’ and not just a simple orator.” It doesn’t matter if you’re the smartest person in the room, as long as you can make it move.
* * *
Soraya Roberts is a culture columnist at Longreads.