Originally published in Liberties Journal, Winter 2023, Volume 3, Number 2
Michael Ignatieff’s latest book is On Consolation: Finding Solace in Dark Times.
I have been a college teacher for some of the happiest years of my life. When I tell people what I do for a living, what I really do, I say I teach people to think for themselves. It’s still a wonderful way to make a living, but over time I have begun wondering whether I have been fooling myself. I could just be teaching them to think like me, or how to package the conventional wisdoms that they have scraped off the Internet. I find myself wondering, therefore, what it really means to think for yourself, what it means for teachers, for students, and for a society that tells itself it is free.
Thinking for yourself has never been easy, but the question of whether it is still possible at all is of some moment. The key ideals of liberal democracy—moral independence and intellectual autonomy—depend on it, and my students will not have much experience of either if they end up living in a culture where all of their political and cultural opinions must express tribal allegiance to one of two partisan alternatives; where they live in communities so segregated by education, class, and race that they never encounter a challenge to their tribe’s received ideas; or where the wells of information are so polluted that pretty well everything they read is “fake news”.
Thinking for yourself need not require every thought to be yours and yours alone. Originality is not the goal, but the autonomous and authentic choice of your deepest convictions certainly is, and you are unlikely to make authentic choices of belief unless you can learn to wrestle free from the call of the tribe and the peddlers of disinformation and reach that moment of stillness when you actually can ascertain what you think.
The contradiction that teachers face in their classrooms—are we teaching them how to think or how to think like us?—plays out across our whole society. From grade school through graduate school, in our corporate training centers, in our government departments, our enormous educational apparatus is engaged in training us in the ways of thinking appropriate to academic disciplines, corporate cultures, and administrative systems, but with the ultimate objective, so they proclaim, of independent thought. This contradiction—the simultaneous pursuit of conformity and independence—is also at the core of our anxiety about innovation. Our societies say that they prize “thinking outside the box”, whatever the box may be. Our political authorities tell us that the solution to every problem we face—secular stagnation, climate change, geostrategic chaos—depends on “innovation”, which in turn depends, finally, on somebody, somewhere thinking beyond the clichés and the conventions that inundate us daily and keep us locked in a state of busy mental stagnation.
Our culture tells us we either innovate or we die, but economists such as Robert Gordon point out that the capitalist economy of the twenty-first century has nothing on the horizon to rival the stochastic lurch of innovation in the Edison and Ford eras at the beginning of the twentieth century. The energy transition, when it finally comes, may yet unlock a new stochastic lurch, but so far progress is slow. Despite these warning signs of stagnation—even decadence, as some conservatives argue—our culture clings to the signs of innovation that we can identify, even to minor upgrades to the software on our phones, because such hope as we have for our future depends on faith that the onward and upward path of new ideas continues. But this hope soon collides directly with the unprecedented pressures for conformity that are generated by those same institutions of innovation. The Pharaonic size of the corporations that control the digital economy raises not just a problem of monopoly or distributive justice, but also an epistemological quandary: are Google, Microsoft, and Facebook, these all-devouring machines, actually places where people can think for themselves? I refer not only to their customers but also to their employees. Is it likely that their well-compensated and disciplined workers can think against the grain of the corporate logic that they serve every day? If not, we may all be on the long slide into secular decline. An institution of innovation may be a contradiction in terms.
The same question haunts the universities in which I have spent some of my life teaching. Every one of these institutions is devoted to what it calls “academic freedom”, but how much freedom do these institutions, their disciplines, their promotion and recruitment processes, their incentives, their intellectual fashions, actually make possible for the individuals inside them? Are they hives of orthodoxy or heterodoxy? Does the competitive search to publish, to earn grants, to make an impression on the public, create a genuine climate of intellectual liberty or just enclose everyone, willingly or not, in an organized, pious, censorious, progressive conformity? That is not a trick question.
The academic fields in which I work—history and international relations—demand new thinking because the world that they study, and for which they train aspiring practitioners of diplomacy and finance and politics, is changing fast, yet it is unclear that any of us have the measure of the moment. To use the metaphor that everybody is using, we are observing a shift in the tectonic plates, a fracturing of what used to be called “the liberal world order.” Not a day passes without someone offering a new grand narrative to replace the stories that we told ourselves to make sense of the end of the Cold War. But each new claimant fades away, defeated by the complexity of the task; and in some cases, it may be the affirmation of the old wisdom that is a sign of intellectual independence. We are living in a dark cloud of crises—climate change, war, global inequality, economic turbulence, social intolerance—and nothing matters more than that we think freshly and clearly.
Recently, as the gloom has deepened further, our rounding up of the usual suspects—the incompetence of our leaders, the savage swamp that is the internet, the frantic polarization of politics—has given way to a kind of chronic epistemological unease, as if we have begun to realize that we have trapped ourselves inside frames of thought that point us away from reality and not towards it. The trouble is that too many people find comfort and safety in those frames and resist the instability—another term for open-mindedness—that results from questioning them.
But sometimes there are exceptions to the retreat into platitudes. Let me give an example. I sit in on the regular team calls of a company that advises major business clients on geostrategic risk. Everyone on the call is impressively well-educated and articulate and plugged into formidable information networks. Someone on the call has just returned from a highly regarded conference of experts. Others have just had confidential discussions with a government minister or a well-placed adviser. The calls review the world’s unending torrent of bad news—Ukraine, China-Taiwan, the climate change inferno in the Mediterranean and California, the overturning of Roe v. Wade—and all this calamitous news is parsed by knowledgeable principals and then packaged for the clients and “monetized” for the company.
The calls are wonderfully informative, as well as scary. What is scary is the fear that this superbly informed group might be unable to see what may be staring us all in the face. On one such recent call, the CEO of the firm, after listening to an hour’s review of the US-China standoff on Taiwan in the wake of Nancy Pelosi’s visit, asked his team the right question: with all this privileged information, weren’t they just recycling what all the other shapers of elite opinion—The New York Times, Project Syndicate, The Economist, Le Monde, the Frankfurter Allgemeine Zeitung, the CIA, the State Department, the West Wing, and competing risk assessment firms—were also thinking? What are we missing? he asked. What are we not seeing?
These are the right questions and not just for his firm. Advanced opinion tends to run in herds, exactly like unadvanced opinion. It is increasingly useless. Epistemological panic—the queasy feeling that our time has jumped the tracks—has created a hectic search for a new master narrative, a new single explanation, a new key to all mythologies. (How else explain the success of Yuval Noah Harari?) This frantic search to give our time a shape and a meaning makes me wonder whether the new technologies that so accelerate the exchange of information are in fact having a spectacularly ironic effect: instead of incubating new ideas, they are coagulating conformity. As a result, might our information elites just be sleepwalking their way towards the abyss, like our ancestors in 1914? But even this analogy to the sleepwalkers of 1914 is a little shopworn, another sign that I am no less trapped in my paradigms than anyone else.
From the classroom to the engineering lab, to the national security conclave, to the geostrategic risk business, our future depends on our capacity to think anew. While thinking in a group can help, groupthink is the enemy of integrity and innovation. A truly new thought begins in a single insurgent mind.
Do we live in societies where such people are allowed to flourish? Do any of us actually still know what it means to think for ourselves? Or are we all just recycling opinion on our devices? It would not be good, at this late stage in the life of liberal democracies, to conclude that we live, despite our vaunted freedoms, in a state of unfreedom. The paradox—Tocqueville and Mill warned us—is that free institutions do not create free minds. It is always possible that our institutional freedoms might still leave us chained in a state of epistemological unfreedom. So how, beyond re-reading Tocqueville on the tyranny of the majority and Mill on liberty, do we think about the obstacles to free thinking in our own time?
For a start, we are confused about what thinking is. To think is not to process information. We have impoverished our understanding of thinking by analogizing it to what our machines do. What we do is not processing. It is not computation. It is not data analysis. It is a distinctively, incorrigibly human activity that is a complex combination of conscious and unconscious, rational and intuitive, logical and emotional reflection. It is so complex that neither neurologists nor philosophers have found a way to model it, and the engineers of artificial intelligence are still struggling to replicate some of the simplest forms of pattern recognition that human cognition does so effortlessly. We must beware that in our attempt to make computers think like us we do not end up thinking like them.
Nor is thinking a matter of expressing yourself or having opinions. It is not about turning on the fountain of our personality. It is an exercise in finding reasons to persuade yourself and others that something is true, or at least plausibly true. Thinking has truth as its goal and its organizing discipline. Bullshitters, as Harry Frankfurt told us, are precisely those who do not think: they simply express what comes into their minds, without concerning themselves with whether what they are saying has any relation to reality. Every university is, or should be, properly worried that its classrooms are training generation after generation of accredited bullshitters.
Nor is thinking about expressing your identity. It is not an intellectual achievement to parrot the truisms of our tribes. Indeed, thinking is about emancipating yourself from identity, insofar as it throws into question prior and unexamined ideas. Identity is a warm bath, but thinking is a discipline. Many important lines of thought are austerely impersonal, purged of the origins of those who thought them up. Think of mathematics, physics, and logic. These are great systems of thought made possible by thinkers who in the laboratory and the library discarded their identities altogether and lived, for years at a time, in the impersonal realm of signs and symbols. It doesn’t matter that perfect objectivity is impossible; imperfect objectivity, the progressive correction of biases, is certainly possible. Even in the social sciences, it is basic to our methods that we try to leave our personal biases, experiences, and histories behind, or in some way to bracket them, so as to see what the data tells us. In this work, thinking means leaving ourselves behind.
In deep ways race, ethnicity, class, and gender do structure our thoughts, but they do not have to imprison them, and we should not celebrate the intellectual limitations that they impose upon us. Thinking is communicative action, and the fact that we can communicate to each other across the limits imposed by our particularities is proof that we are capable not only of thinking for ourselves, but of thinking about how other people, very different from us, might see the world. Thinking is therefore reflective about thought: we need to grasp how these factors of biology and sociology condition our understanding, so that we may, for the purpose of analyzing a subject and understanding it truthfully, neutralize those conditions. Lots of other animals engage in conceptual operations, but we are the only animals who have thoughts about our thoughts. Thinking, properly speaking, turns back upon itself. It is the attempt to understand how we end up thinking the way we do, and what we can do to transcend the boundaries set up by our identities.
One of the most important modern ideas about thinking is the recognition of the intellectual advantage conferred upon us all by social and racial diversity. The best way to think through and beyond your own identity is to be in communication with other identities, different from your own, and to realize, as you interact, how their identities shape what they think. In realizing this, you realize something about yourself. Yet thinking is emphatically not about reproducing yourself, over and over, and confirming your starting point, as a male, female, white, black, rich, or poor person. That is not thinking. It is what Freud called repetition compulsion.
Thinking can take us to a lonely place, where skill, technique, experience, even willpower, are sometimes of no avail, where, to make progress at all, we have to step away and wait until a new path suggests itself. We can get better at thinking, to be sure, but there is always going to be someone who thinks better than we do. This is another reason why we are not like machines. They all compute the same way, while each of us is differently endowed. It is when we see others thinking, and so often thinking more clearly, that we discover, once again, just how difficult it is, and also just how much it defines us, to think for ourselves.
Not all cultures and periods of history have made this goal the basis of their society. Chinese mandarin education, under the emperors, was formidably egalitarian and proved to be a crucial avenue of mobility and elite renewal for a traditional society, but thinking for yourself was not its goal. Its purpose was to train bureaucrats in the routines and the skills required to administer an empire. The Greek city-states of the ancient world, the Roman imperial regimes, the medieval Christian states, and the Islamic empires, though they were all the settings of intellectual breakthroughs, all feared what would happen if women, or slaves, or the poor were allowed to think for themselves. They understood, quite rightly, that adopting such a principle—recognizing the essentially egalitarian and meritocratic nature of genuine thought—would be playing with fire, likely to blow apart systems of rank order and privilege. Perhaps only Western culture, and only since the Protestant Reformation, has embarked on the dangerous experiment of believing that everyone, literally everyone, is entitled to think for themselves, and entitled, furthermore, to that once exclusive privilege of male elites, the right to an examined life.
To manage this revolutionary adventure, Western authorities took education away from the church in the eighteenth century and made a massive bet on secular learning, first by private charity, then by the state. As an English government minister said in the 1870s, introducing an education bill, “We must educate our masters.” The hope was that education would canalize the combustible thoughts of the newly enfranchised (male) masses into sober conformity. Education ever since has sought to manage the contradiction between what a free society preaches about the right to think for yourself and the lessons it needs to instill to maintain bourgeois stability.
All of us have lived through that contradiction, first as pupils in primary school. Schools do teach you how to think and they do it (or they used to do it within living memory) by dint of repetition, boring exercises, lines written in a notebook, raps on your knuckles when your penmanship wobbles. Later, in college, you begin learning a discipline, and the experience shapes your thoughts ever after. No originality then, no thoughts you can call your own, before first learning how to think from someone else. Your thoughts do remain in that magnetic field of your primary and secondary education for your entire life. Looking back, once education is over, we can see just how halting and foolish our first steps into knowledge truly were. I still have a battered paperback of The Birth of Tragedy on my shelf, the same edition I first read fifty years ago in college, and I squirm when I re-read my marginal notations. Thinking for yourself is a humbling encounter with your own foolishness.
One thing you learn, as you grow, is that you don’t have to think everything through to first principles; you don’t have to master everything. We cannot rely on ourselves for our understanding of astrophysics and medieval art and macroeconomics; we must rely on authorities. But we can arrive at strict standards for what counts as authority and choose to trust people who are in a position to know. A lot of what teachers do, when we try to get people to think for themselves, is to mark out, as best we can, the terrain of knowledge where relatively secure answers have been found and can be trusted.
Learning to trust a discipline and the knowledge it has certified as true is the starting point for any venture into thinking for yourself. Most of what you know you take off the peg, like a suit or a shirt you buy in a store, or the furniture you get from IKEA. College is like an IKEA store: assembly is not included. In the lecture hall, the concepts are delivered boxed and flat-packed. You take them back to your room, and with the Allen wrenches that the professors have given you, you put together the furniture to fill the rooms of your mind. No surprise, then, if your mind starts looking like everyone else’s.
This creates what Harold Bloom famously called the anxiety of influence, the sense of being derivative, the worry that you are not speaking your own truth but being spoken by the various discourses that you have assembled for yourself. There is even the anxiety that there is nothing left for you to say. “Born Originals,” the eighteenth-century poet Edward Young wrote, “how comes it to pass that we die Copies?” You want to think for yourself because you want to be somebody. Thinking is how you create, over time, the feeling that you exist as a distinct individual.
But originality, again, is not the objective of thinking. Bloom was writing about poets, and more generally about artists, for whom originality matters deeply, but the point of thinking is to say something true, not something new. Originality is not the only marker of individuality. Novelty cannot vouch for the truth of a proposition. And the obsession with originality eventually leads us to admire sheer contrarianism—a contentless and purely theatrical position, determined by no higher ambition than to say what is not being said, and thereby to draw attention to oneself. Sometimes a consensus is right and sometimes it is wrong; if you have thought the subject through and arrived at your own reasons to concur with others about it, then you have not surrendered your intellectual integrity; but a contrarian is just a reverse weathervane. The point is not to think originally but to think critically.
You free yourself from the anxiety of influence, first, by learning a discipline and then by casting a skeptical eye on the discourses that it seeks to make canonical. In my undergraduate and graduate education, spanning the 1960s and 1970s, discourses blew regularly through the campus, sweeping all right-thinking people into their clouds of abstraction. First it was Marxism, then structuralism, then post-structuralism, then discourse analysis. I forget the rest of them. They created fashions in thought and fashionable thinkers, lots of them French or German. Cliques arose in which the confident deployment of a certain jargon was the test of entry. Fifty years later many of these discourses seem comic or irrelevant, but at the time they were exciting and predominant, a mighty source of academic power and legitimation, and their successors today are no different, with the same unhappy consequences for freedom of thought.
In evaluating the currently fashionable strictures on what can and can’t be said on campus, I find myself reaching back to my graduate education and remembering how the ascendant discourses liberated me only to suffocate me later. As a trainee historian, I was enthralled by the young “humanist” Marx of 1843-1844, and the Marxist historians of the British school, Eric Hobsbawm and E. P. Thompson. For a time, these brilliant Communists and ex-Communists were my maîtres à penser, even though I never ceased being a stone-age Cold War liberal. All apprentices need masters. Then there comes a moment, essential to any free and mature intellectual life, when apprentices need to break free.
Intellectual independence always begins with a small act of rebellion. In the middle of reading a book, or during an experiment, or pondering the numbers spewing out onto your computer screen, you suddenly have a suspicion that something isn’t right, though you are not sure what. This is the decisive moment, when you have a choice to ignore this flickering unease or to begin to wonder why. Once this epistemological disquiet takes possession of you, the question becomes what you are prepared to put at risk to resolve it. There is always an element of oedipal struggle in any commitment to think for yourself—a moment when you break with a supervisor, an intellectual idol, a close friend. If the revolt succeeds in giving you your freedom, it leaves behind a lifelong watchfulness, a skepticism about authoritative academic (and non-academic) rhetoric and the prestige of maîtres à penser. At the end of this journey, you may end up feeling, like those radical Protestant sectarians of the sixteenth century, that you have earned the right to sola Scriptura, to read and interpret the holy scriptures for yourself, without the mediation of dogmatic authority.
Here is where the individualism of modern liberal culture, derided equally on the right and the left, is so critical for the creation of free thought. For there must be a moment when a thought fails to convince you even when it appears to convince everyone else, or when you are alone with a thought that doesn’t seem to be occurring to anyone else. The communitarians would have you question this experience of independent reflection and wonder nervously whether heresy or apostasy is on the horizon. For some communitarians, other minds are more real than their own. But this is a fiction: like the specificity of your body, the specificity of your mind is evidence of your individuation.
Yet it is always a struggle. Someone else is always quicker, cleverer, has read the book that you did not, seems to grasp things that you fumble for. So thinking for yourself soon becomes a competition to think as quickly and as smartly as the others. Competing with others, you discover, is how you become like them, not how you distinguish yourself from them. Catching up is not an intellectual activity. So you begin to separate from the pack, from the people you eat with and go to the movies with. You choose solitude and study. You set yourself the task of mastering a book, a theory, a disciplinary approach, anything that will give you the sense that you have mastered what is necessary for serious thought about a subject, and so can rely, not completely but sufficiently, upon your own resources. When you have gained mastery, at least to your satisfaction, this gives you the confidence to go on, and you need confidence, because everyone around you seems to be ahead of you. You discover that you have your own pace, your mind’s own tempo. As for those canonical books on the shelves, their authors remain in another league: austere, elegant, relentless, eternal; and thinking for yourself means respect for the canon, for the very few eternal works that set the standard for thought itself. But if the canon teaches anything, it is that you must not become enslaved to it, or to anything else, merely because it is venerable.
What you think about hoovers up the entirety of your waking and sleeping life, the eternal and canonical as well as the transitory and foolish. You work on the promiscuous stream of your own consciousness—memories, jokes, phrases, shameful incidents that will not go away, epiphanies that had better not go away, the babble of social media, the unending tide of information that sluices in and out of your machines, the data you collect and that collects you, the books you read and annotate. None of this stream counts as thought itself, and none of the processing of information counts as thinking. Actual thinking comes later, when the moment arrives to put order in the house.
As I have aged, my own mind, originally furnished by IKEA, has gradually become more like those photographs of Picasso’s studio: palettes, brushes, sketchbooks, dusty photos, abandoned sculptures, paint tubes, knives, African heads, all lying around in an order that only he could understand. So thinking for me became an exercise in trying to remember where I put something in the studio of my mind, where I heard something, read something, since suddenly I seemed to understand what use to put it to. There must be orderly minds whose upstairs rooms are like an office, with filing cabinets, and books and manuals arranged in alphabetical order, affording instantaneous ease of access. Good luck to them. But thinking emerges from chaotic minds as well as orderly ones. What both have in common is tenacity, going on and on, when everyone else has turned out the light and turned in.
When you have found an idea that is properly your own, you come back to yourself with a feeling of surprised discovery. You have revealed yourself, for the pleasure and satisfaction of it, surely, but also because you think it might make a difference, change somebody else’s mind, “contribute to the literature”, and so on. Many of the motives that draw you to an act of thought are extrinsic—like trying to impress, to get tenure, to make your mark; and extrinsic motives will lead your thought into certain grooves to meet certain expectations. Thinking guided only by extrinsic motives runs the risk of being captured by the institutions, the authorities, and the celebrities you want to impress.
Yet there is also a kind of thinking for yourself where you feel compelled to do it, intrinsically, for its own sake. The experience is of a thought taking possession of you. This is certainly the most perplexing, frightening, and pleasurable kind. For it feels like the idea is thinking you, thinking itself into being, inside your head. You go to sleep thinking about it; you dream about it; when you wake, you see some new element you want to explore; everything is murky, struggling to rise into consciousness, to get itself fixed in language, so that you can begin to understand what the idea is trying to say. The unconscious aspect of thinking is uncanny, and it suggests that part of what makes thinking for yourself so difficult is that it is never an entirely rational or conscious process, even when it prides itself on being so, but instead a thick and murky and subliminal experience in which reason is never all there is: highly emotional, highly unstable, impossible to predict, difficult to control, and finally very hard to put down on the page.
Thinking a new thought can feel like a moment of self-expression, but it is actually a moment of self-distanciation in which, having searched your mind and memories for materials, you suddenly see that you can create something new from the materials that you had taken for granted. Thinking can feel like a surprise, in which you awake to what was in you, until then unsuspected and unrevealed.
Thinking for yourself is not solipsism, a perfectly isolated and self-confirming activity that occurs only inside your head. It is social: you work within force fields of thoughts and ideas created by others, and good thinkers are often, though not always, generous ones, quick to acknowledge others and their dependence upon them. Since dependence creates anxiety, a worry about being derivative, the only way out of this worry is to gratefully acknowledge your dependence and then make sure to take the thought further than where you found it. From dependence on the thoughts and insights of others, you gain confidence in your own way of thinking.
With that confidence comes a responsibility to the field in which you work, a responsibility to your own reputation as a serious person who doesn’t play around with the facts. Thinking for yourself means taking responsibility for your thoughts when they enter public discourse. Fools are people who say the first thing that pops into their heads. (That is the deep problem with the spontaneity of social media; a person’s first thoughts are never his best thoughts.) Even bigger fools claim an originality that they do not deserve. The biggest fools of all think that everything they think is their own invention. Self-congratulation is never the hallmark of a real thinker. Instead, thinking people have a morality of knowledge: they accept the obligation to back up what they say with evidence, facts, footnotes, citations, and acknowledgement of where they got their ideas from; with a rigor of argument and an analytical care about concepts.
In our thinking about thinking, we rarely pay enough attention to its ethical frames of responsibility to self, to method and discipline, to truth itself. Nor do we pay enough attention to its psychic requirements, to the kind of character that is needed to be a good thinker. The qualities that make a good thinker are not the same as those that make a good citizen. Civility and politeness are all very well in society at large. Good manners are a lovely attribute, but in intellectual life the instinct to be nice and to be liked can be fatal. Rigorous thinking must be followed wherever it leads. The unpleasantness of controversy is inevitable. Not everyone can be right. This is yet another instance where the ethos that liberal democracy encourages—civility, deliberation, compromise—is in radical contradiction with the intransigence proper to scientific and humanistic discovery. In politics, in ordinary life, compromise is good, even necessary. In reasoning and research, by contrast, you must be a fighter, tenacious, persistent, stubborn in your insistence that someone’s argument is wrong and yours is closer to the truth that you are both pursuing.
Thinking for yourself also means honesty in defeat. It is hard for competitors in intellectual fields to admit when they are wrong, when the data fails to prove their point, when they cannot find a way to answer someone’s devastating rebuttal. Thinking for yourself demands a difficult kind of honesty, a willingness to concede defeat when the evidence is against you, and enough resilience to resume the search for the answer that still eludes you. And these conditions for honest and public disputation are not only traits of thought but also traits of character. Liars, even brilliant ones, are useless to thinking.
Liberal societies need this kind of character and this way of being, but it works against many liberal values: compromise, perfect reasonableness, the search for social accord, the desire to be liked and thought well of. There is something intransigently solitary about thinking for yourself, and a liberal society should want to encourage a temperament that is strong enough to endure solitude, contradiction, unpopularity, standing up for yourself against the tide of opinion. As Tocqueville and Mill, our greatest psychologists of liberal society, saw long ago, a society like ours that lives by and for progress and innovation cannot survive without protecting and fostering this type of intransigently individualistic character.
If thinking for yourself is the goal of your life, then it pays to maintain a certain distance from the institutions in which you work and live. Distance implies wariness about received opinion, about fashions, about the recurring tides of certainty and urgency that course through the places where we work and soon have us all facing the same way, thinking the same thing. The larger point is about liberal society: if thinking for yourself is your goal, do not go looking for the warm bath of belonging or the certitude of faith. Do not expect a free society to provide these for you. Belonging is not the fondest dream of a serious intellectual. She dreams of other satisfactions first.
Liberal society works best—it is most productive as well as most free—when its members all feel a certain alienation, a certain dividedness about the objectives and the values of the institutions inside which they work. Alienation is another term for distance and detachment; it need not be melodramatic or wounding. It is simply one of the epistemological conditions for a working mind, and for the pleasures of its work. Objectivity is a variety of alienation. Who would trust the views of a thinker who is not to some degree alienated—that is, detached—from her subject? One of the very first scientific societies in the world, the Royal Society, founded in 1660 by the likes of Christopher Wren and Robert Boyle, took as its motto: Nullius in verba. Take nobody’s word for it. Imagine: the greatest scientists of their time set up an organization to promote scientific thinking, and their working rule was not even to take each other’s word for the truth. That is an ideal of truth that is also an ideal about institutions: to give us a place not where we belong and imbibe the pieties of a tribe, but where we take nobody’s word for it.
In case you were wondering, I doubt that I have myself measured up to these standards. I have lived in an orthogonal relation to the liberal institutions that have made my life possible: in academic life but not of it, in the commentariat but not of it, and so on, but whether this has enabled a genuine independence of mind, I cannot say, and in my own reckoning I come up short. So my strictures here are not an exercise in coyly concealed self-congratulation, so much as an effort to describe an ideal and to recommit myself to it in the classes that I teach.
The larger question, beyond the type of intransigent individual that we want a liberal society to nurture and to protect, is how to manage the contradiction, at the heart of our institutions, between teaching people how to think and teaching them to think for themselves. This is an old contradiction, of course, but we need to understand how new conditions in the twenty-first century have sharpened it. Briefly, the answer is this: no society before ours has ever been so dependent on knowledge for its economic progress and cultural self-understanding. An entire literature—“post-industrial”, “knowledge is power”, “the information society”, “symbolic analysts”—has established this epochal difference. In such an order, the creators of knowledge possess enormous power; and we can measure it not least by the extraordinary financial returns that patented inventions, best-selling books, movies and TV shows, apps, and other knowledge products earn for those who devise them.
Having monetized the products of thinking in hugely valuable forms of intellectual property, corporations, governments, universities, and some especially successful individuals face the dilemma of how to institutionalize their creativity. The owners of ideas, whether they are individuals or corporations, have powerful new algorithms at their disposal, enabling them to canalize innovative ideas into opportunities for profit, market share, influence, and power. The issue hanging over all of this is whether new technologies for discovering and aggregating preferences in ideas help or hinder the impulse—thinking for yourself—that drives the entirety of our society’s capacity to innovate. The new technologies have accelerated the concentration of economic and intellectual power in a handful of monstrously large corporations that run the Internet and thereby set the standards for university culture and learning worldwide.
If you put these two factors together—new technologies to aggregate and identify preferences in ideas and an oligopoly of institutions that certify what profitable and useful knowledge is, together with their licensed authority to teach people how to think—what do we have? A society that teaches us how and what to think, or a society that teaches us to think for ourselves? Those who think that “surveillance capitalism” is the root of the problem, or who prefer to blame social media, or who wish to force-march us back to their old verities—all these people have their answers ready. But the question is more than a sociological or political one. It is, rather, a personal and moral one. It may be more useful to conclude that this is a question—whether thinking for yourself is possible—that cannot have a general answer. It must be asked by everyone who cares about an examined life and an examined society. A society will be only as thoughtful as the people who are its members; or as thoughtless, with all the terrible consequences that always issue from unfree thinking. Those who do think for themselves will have to answer the question for themselves. It is a condition of the proper use of freedom to ask yourself continually whether you are truly as free as you think you are.