13 Comments
Ondřej Frei

Thank you, Erik. It feels ironic that the more wrong there is in the world, the more we try to control everything through the omnipotent data idol. Maybe that's actually the epitome, really: humanity slowly realizing how out of control we really are, and in the deluded attempt to regain some ground under our feet, we turn to data as something tangible, simple enough for us to comprehend and understand (unlike the "big problems," which keep eluding us)?

Erik J Larson

As for the feeling of control, I think most of the wisdom literature is centered on the idea that we can't control things and it's folly to try. In the modern era we've proven that some things can be controlled, but we've manifestly failed to control everything. It's not for lack of trying; it's that we run up against fundamental limits.

To Silicon Valley, there's no such thing as a limit. Everything is a straight line going up to infinity. That's just dumb. I don't know how else to say it.

Limits can be extraordinarily fecund for science. In other words, when we discover we can't do something we also discover a lot of things that we hadn't thought about that are available to us.

The second part about limits is that unintended consequences are constantly thwarting our efforts to simply march forward toward the goal. We can't see what's coming around the corner, which means we don't know what line we're actually following. So humility becomes part and parcel of making sense of the world. And I agree that the modern ethos is uninterested in this kind of discussion. To our peril.

Erik J Larson

Hi Ondřej,

Agreed.

Something that’s hit me more clearly lately—and felt almost like an epiphany—is this: when we talk about data, we’re talking about what’s already been measured. It’s retrospective by nature. But as the old line goes: not everything that can be counted counts. And the flip side is also true—not everything that counts can be counted. So the minute we anchor ourselves in data, we’ve already left something out—often the most vital, nuanced, or contextual parts of reality.

SG Atcheson, MD

Dr. Larson

When I read your Substack I felt sad for you, but your current predicament also reminded me of a situation I found myself in more than 30 years ago. I read The Myth of Artificial Intelligence when it first appeared in 2021, and your words have influenced my thinking and writing ever since. You took me to Charles Sanders Peirce and the magic of abduction, something that had been nothing more than a vocabulary word from my school days. "The surprising fact, C, is observed…," you wrote, and I was hooked. Honestly, I don't think a day has gone by when I haven't at least mentally brought up something from your writing. Maybe that's why I feel that I owe you some words of my own as compensation.

I will be 82 in a couple of months, and am no longer the magnificent physical specimen I once imagined myself to be. Sarcopenia of aging is a real thing and cannot be prevented, only stalled for a while. Those body hackers you write of are peddling fraudulent dreams. I still lift weights 3 days a week, serious weights, but not nearly as much as I once could, despite having two shelves in a kitchen cabinet so filled with bottles that I call it Supplement City. Still, I can think, and plan, and hope, and am happier, more optimistic, and more contented every day than ever before in my life.

That was not the case in 1993, the year I turned 50. I was wondering if I had wasted 20 years practicing rheumatology. My specialty was in crisis, full of disillusioned doctors thinking that they were spending all their time accomplishing exactly nothing substantial for their patients. It was the lowest paying of the medical specialties. Fellowship positions at the most prestigious teaching schools went unfilled. The premature "miracles" stemming from the discovery of cortisone in 1948 led to stagnation and disillusionment because the drugs lost efficacy and had inherent toxicity. Ironically, it all boiled down to Data, or rather the lack of it. 1993 turned out to be the bottom of the pendulum swing, and the unrecognized Renaissance of rheumatology and immunology started bringing real miracles to our patients within just a couple of years.

I think the field of artificial intelligence is temporarily trapped in the belief that it can accomplish just about anything if only we can collect and properly arrange enough Data. The people designing and building the machines are attempting the impossible. You are so right in describing the process as a secular religion. It is simply absurd for anyone to think that he can find perfection in viewing the body as a purely material entity. Material life was designed to fail, sooner or later, because only in the failure can the meaning of life, that which we all hunger for, be understood. Life is the Grand Discriminator, and humans are the Communicators, the only ones who can speak for all of the life forms.

Richard Feynman taught something that has stuck with me as much as your thoughts on abduction: "It is important to realize that in physics today, we have no knowledge of what energy is." Just as important to me is the apparently unanimous consent of physicists that energy is immaterial. Once I put one and one together, and it took me a while, I realized how ridiculous it is for committed materialists to argue that the mind must be an emanation from a material brain, when the physicists — those supreme reductivists — are convinced that the prime mover is immaterial. My question: If you are so comfortable with the idea that all material creations come about because of interactions with an immaterial something that we call energy, why are you so resistant to the idea that energy may also interact with an immaterial realm that is beyond measurement?

I was very much in the materialist camp until 2001, when I was studying cytokine biochemistry in an effort to better understand the amazing new drugs that revolutionized rheumatology, and recharged my batteries in the process. Cytokines are proteins, and I learned that a fully functional protein of average size may be the most complicated structure in the universe. AI tells me that each human cell contains 10,000 different proteins, and each cell contains about 42 million proteins in total. A single human body may harbor over a hundred thousand different proteins. Our biosphere may contain tens of billions or even trillions of different proteins. Nobody knows, and nobody will ever know. A single protein will only function properly if it is folded together just right, but the number of possible folds of the average protein exceeds the number of atoms in the universe, so I am told.

AI also tells me that my body contains 37 trillion cells, and it harbors my microbiome, those trillions of bacteria, viruses, fungi, and microscopic plants and animals that I am living happily with, exchanging information with, all in an effort to ensure our quiet enjoyment of the not-yet demised premises — I and all my invisible friends.

You write of the transhumanist dream, "the merging of man and machine," as a step in the evolution of man, and your words tell me that we agree that it is pure folly, actually dangerous to the human spirit. You write of the hopeless religion of Dataism, and you are just spot on. Where does this Data come from? How is it formed and manipulated? All of the definitions, all of the operations, come from our imaginations and subsequent communication to others. Hard data is only as hard as you allow it to be. The number "1" is just as imaginary as the square root of -1. They're both products of observation and subsequent mental manipulation. Values are “squishy,” but numbers are “real,” you wrote, and I’m glad you used quotation marks.

No wonder you're depressed at the current worldview, that the purpose of life is to generate data, consume data, and be measured by data. It is so wrong, just manifestly wrong.

Intention is the defining feature of life, all life. Life intends to survive and reproduce, and in order to do so it must explore and discriminate. It must explore in order to eat, and it must discriminate in order to avoid being eaten. I learned a lot from reading Elizabeth Anscombe, and I like her word much better than intentionality, a word too passive to describe what life does. Life uses data to survive.

Why would the most complex and interesting creature in the known universe want to merge with a machine? By definition, the machine must be less complex than man, its creator. Didn't John von Neumann mathematically demonstrate that decades ago? "Transhumanism holds that human beings are ever-evolving…" — but where is the evidence for any true evolution, ever? There is none, but there is plenty of evidence for adaptation, just like the bills on Darwin's finches.

Michael Polanyi, one of the great physical chemists of the early 20th Century before he became a greater but still-neglected philosopher of science, convinced me of the dual nature of man, a creature under dual control, partly by the laws of physics and physiology, but ultimately directed by the actions of an immaterial mind. In "Life's Irreducible Structure," he writes that the laws of physics and chemistry have an inanimate nature, meaning that they go on regardless of the presence of humans, or life in general. But he says that if all humans were exterminated, the production of machines would stop, and not until men rose again could machines be formed once more.

In "The Structure of Consciousness," Polanyi describes the actions of an immaterial mind, the tacit knowing and tacit integration of the higher principle of mental actions upon the physical parts of the human mechanism.

Wilder Penfield, the great 20th Century neurosurgeon, also started his career as a committed materialist, but after 50 years of meticulous study he concluded that the mind was separate from the brain, and in one of his final acts he wrote Mystery of the Mind, in which he declared that the mind was immortal. I found it to be compelling, and very accessible to non-physicians.

Dr. Larson, I am sorry that you feel bad for the younger generations, but your thoughts are those which I am sure others have expressed for at least 2000 years. What I believe is that you are soon to see a real change in the outlook of the younger scientists. They will come to understand that AI is an extraordinarily powerful tool, amazingly useful but still a tool. It cannot ever be anything more, because the laws of physics simply prevent that from happening. As Faulkner said, "… man will not merely endure: he will prevail. He is immortal, not because he alone among creatures has an inexhaustible voice, but because he has a soul, a spirit capable of compassion and sacrifice and endurance." That is our nature.

Steve Atcheson

Erik J Larson

Hi Steve—this is wonderful, thank you. And thank you for taking the time to lay things out so thoughtfully.

I want to focus on finitude. I remember once asking a friend: “Can I be a Christian, but not go to heaven? That part sounds awful. I don’t want to be around forever—I just want a long, rich life, and to pack everything into it.” He was not amused. He said, no, there are no versions of Christianity where you accept Jesus but opt out of eternal life. I remember thinking: really? You can’t even bargain with him? Just give me the belief, skip the heaven.

That conversation stuck with me. I think you’re absolutely right: we talk a lot about eternal life, in religion and now in tech, as if it’s self-evidently desirable. But what if it’s not? What if finitude—the fact that it ends—isn’t a problem, but the very condition for meaning?

Sartre once said that in the absence of an infinite reference point, nothing finite has any meaning. But meaning to whom? I don’t think my life has meaning because it never ends. I think it has meaning precisely because it does.

I’m not trying to start a theological debate. But I do wish more people appreciated how deeply value is tied to limits. Without endings, there’s no urgency, no poignancy, no sacrifice. Take it easy man.

JMM

A great article, and thanks for sharing such a powerfully eloquent analysis of modernity. I am not sure that a world obsessed with data secularism is all that worth aspiring to. I loved the Nietzsche question: "What will fill the void now that God is dead?"

Erik J Larson

It’s fascinating to revisit the late 19th century and unpack what Nietzsche was really getting at. The developed world was in flux, yes—but church attendance was still strong. There were millenarians and Mormons, revivalist preachers drawing crowds, and a vibrant, evolving Black religious tradition giving rise to Baptist churches, gospel music, and new forms of spiritual expression. In short: God was still very much in the air.

And yet Nietzsche wrote, “God is dead.” What he meant wasn’t that people had stopped believing, but that belief no longer sat at the center of how we explain the world. God had been dethroned not by politics or culture, but by science. And Nietzsche understood that this shift wasn’t just intellectual—it would ripple through our emotional, psychological, and cultural lives. He saw that without a metaphysical foundation, the meaning structures that had guided civilizations for centuries would begin to erode.

I’m not an atheist myself. But I’ve always found it striking that the great world religions all emerged around the same epoch in human history—thousands of years ago—and that everything since has been, in a way, an extension or remix of those ancient roots. Today, we have New Age movements, spiritual hybrids, and fragments of the old faiths, but the cultural center has shifted. Nietzsche’s warning was simple: if no one sees any reason to look back, and no new foundation is laid, then God is not only dead—he’s culturally forgotten.

I have many Christian friends—people I deeply respect, some of the finest I've ever known. And I understand that the Christian response to Nietzsche's "God is dead" is often something like: That's why we need to bring Him back—through faith, through witness, through renewed moral clarity. But that's not what Nietzsche meant. He wasn't calling for a spiritual revival. He was saying it's not possible. That's what makes it terrifying.

It’s not about whether the resurrection is true or false. It’s about the fact that, culturally, we no longer live in a world where those truths anchor society as they once did. The idea that Jesus rose after three days—once held by much of Europe and large parts of the world—no longer structures the moral or intellectual center of modern life. For a while, it did. So did Buddhism, for a huge swath of humanity. But that’s not where we are now.

Nietzsche’s claim wasn’t that belief had vanished, but that the conditions for shared belief had dissolved. God is dead not because no one believes—but because belief can no longer hold the world together. Whether or not it’s true, we are living in a different kind of world.

Guy Wilson

Erik, I read your post in the morning and have been thinking about it as I worked this afternoon. I read Ted Gioia's piece yesterday and have been thinking about it too. I've been filling notebooks with bits and pieces of this for months. I am probably closer to Gioia's conclusions, but I come from a different milieu than either of you and so have a different take. I apologize to both of you if I misrepresent what you have written. We all filter everything through the lens of our own reading, experience, and thoughts. I can only write out a fraction of what I would like in a comment.

One issue I have with both of you is the tendency to think in terms of the end of the Enlightenment (or Enlightenment). A second is the desire to think in terms of a new renaissance. Even when I was studying history in graduate school in the 80s, I was bugged by the idea of an Age of Reason. I no longer believe that there was one. The philosophes did use reason, but they used it too often to ride hobbyhorses. I am not sure that they had much more grounding in reality than a lot of the people you criticize in this piece (who certainly deserve the treatment you give them). More to the point, it was at least as much an age of unreason. There was plenty of religious ferment and there were unusual religious movements. It was also the age of Swedenborg and Mesmer.

When we look for a renaissance, we need more than just the falling apart of an intellectual system. It did arise, of course, in reaction to a petrified intellectual worldview, one on its way to becoming as dysfunctional as the tech world's. It also arose in a period of climate change and war; vast technological change; persecution of Jews, lepers, heretics, witches, and even the occasional werewolf, the first two groups suffering pogroms in the years around the beginning of the era; and, of course, pandemics.

How, and how soon, would we have recognized the start of the Renaissance, and with it modernity, if we had been in the wrong place or looking for the wrong things? For a long time, it was a slow burn outside of a few centers. I'm unsure where to look or what to look for. One of the other commenters had some remarks on art. It may be that we will first see it there, or it may be literature, or something else. What if the collapse of American cultural hegemony reveals something new already flourishing elsewhere? What if there is a new renaissance that comes as a major backlash against AI and the authoritarians forcing it on us? It would be too early to say what conditions would fuel it. They might, for a while, even for a few generations, feel like the worst things we could experience.

I have a bad habit of quoting William Gibson novels. In his most recent one, 2020's Agency, about a very different AI from the one we have now, a character from the next century, speaking to others in an alternate 2017 or 2018, observes that the Singularity began decades before, maybe a century or more, and continued for a long time after. There is a lot of truth in that. The present moment seems quite horrible. I expect things will get worse, in fact. Whether that will be because of authoritarianism; because the ideologies of capitalism, technofeudalism, and digital salvationism rip themselves apart, leaving chaos in their wake; because climate catastrophes accelerate; because of new pandemics or wars; or because of something we cannot foresee, I don't know. The immediate future Gibson wrote is about as unpleasant as it gets, and his future world of the 2130s is still beset by existential problems and is barely hanging on. Still, there are aspects of it that look like a renaissance.

The Renaissance and the Enlightenment were both mindsets that eventually affected/afflicted the whole world. They were part of much more complex periods, influencing and influenced by other trends within their host societies, but also by other cultures. Part of me doubts we will see equivalent movements in what remains of my lifetime, perhaps not in yours either; or we could see something emerge in the next ten years. Personally, I think the best thing is to engage with the reality, fight the little skirmishes we can for what we think should be, and look at the sheer variety of what is happening. Meanwhile, I work for a university and live in a university town, engaged with some of the problems Gioia mentions, wondering if anything I've worked for in my careers, first as an historian and then as an instructional technologist, will survive. I don't know if anything I've written here will strike a chord for you or anyone else, but I hope it might spark something for someone.

Erik J Larson

Thanks for this, truly. There’s so much here—thoughtfully expressed, grounded in history, and not given to easy narrative closure. I think you’re absolutely right that movements like the Renaissance or the Enlightenment weren’t monoliths, and certainly weren’t as clean or “rational” as they’re often remembered. As you point out, they coexisted with mysticism, persecution, and collapse. It’s easy to forget how much unreason rode alongside reason in those eras.

Your point about the slowness—the distributed emergence—of major shifts also resonates. If something new is being born now, it’s likely not happening all at once, or in the spotlight. Maybe it’s emerging in small cultural pockets, artistic subcurrents, or even counterreactions to the dominant ideologies of data, disruption, and optimization. Maybe it’s not even in the West.

I’ve also wondered whether the real backlash to AI will come not through political action but through cultural refusal—a slow rediscovery of meaning in domains that can’t be simulated or sold back to us. That might look less like a movement and more like a quiet shift in sensibilities, which of course is harder to detect in real time.

Anyway, I’m grateful for your comment. It reminds me that the most valuable conversations aren’t the ones that land on an answer, but the ones that deepen the questions.

Let me add here also: I had a kind of front-row seat working on search technology in the early 2000s in Austin, Texas. And so I can kind of go back 25 years and do a before-and-after comparison to get some perspective. What I see is that the same corporate forces moved in, took over the new technology, and bent it to their aims. People complained for a little while and then they just went along with it. Now I laugh when people say "down with big tech," because it's too damn late! We have to wait for new innovation or make our own.

So I think we can glimpse, in very broad terms, what historians are going to write about, say, 20 years from now or even 50 years from now. Also, as for enshrining the Enlightenment: I think the fact that someone in the early part of that era realized the sun is at the center of the cosmos rather than the earth was sufficient to give it a permanent place in history as a change. If it were just Copernicus, OK. But it was an entire revolution in thought. Another one occurred in the early 20th century with the move from classical to quantum mechanics. Yes, all sorts of stuff was going on, and in fact we used Newtonian mechanics to put a man on the moon. But to say that that wasn't a radical conceptual shift would be to really miss the point. So I generally take that tack. But the provisos are also interesting, and thank you for pointing them out.

David Goetzinger

I’ve been thinking a lot about these issues lately. I’m a visual artist planning to launch a new YouTube channel. The channel will feature me painting, sharing the images generated by my imagination.

There are people out there who believe that my efforts as an artist will be pointless in the near future, with generative AI producing all the imagery anyone might ever want, and doing so better than any human. I disagree, but I worry that humanity might be willing to dismiss human creativity as valueless were it possible.

I don’t believe generative AI will become sentient any time soon, if ever. So it will continue to produce mindless product, however carefully trained the AI. When I consider all the thought that goes into my art, I know that generative AI cannot be trained to copy it. 1) The imagery comes from my imagination, 2) the design elements are chosen purposefully, 3) and the composition devices are also chosen purposefully. On top of that, a hypothetical model I started building while studying neuroscience underlies much of the imagery.

When I was at U of O, I found a mistake in a textbook and shared my finding with the author. Soon after, I started developing a hypothesis about the processing rules the brain runs when managing visual stimuli. The idea really extends to other brain functions, but since visual processing has been deeply studied, I thought that was the best place to develop a proof-of-concept hypothesis.

The hypothesis is still a pet project, and may come to nothing. But it is yet another element contributing to my artwork that an AI would have no access to. So no AI will be producing my artwork better than I do.

The question is, will people still value human creativity? I once saw some idiots online promoting AI to produce images in the style of “Vincent van Gogh.” They did so without realizing that the artistic style itself is the product of an artist, a product of human creative effort. If one wants to produce art, why not try to be the next Vincent van Gogh? Why not develop a unique art style? Why not produce one’s own art?

Generative AI is non-thinking. Using it requires no talent from the geek who has none. What it does do, however, is condition ordinary people to no longer value human creativity. I suspect that people will thoughtlessly accept a world of AI-generated "art" and "music" and more. They won't recognize that it is worthless, and they won't care.

What advanced AI will produce is a population pacified by AI-generated art and music, a population made weak through reliance on advanced robotics, a species birthed, raised, entertained, and ultimately provided tireless palliative care through death.

Erik J Larson

Hi David, I think what tends to happen is that artists—including writers—take the latest technology and use it to serve their own ends. As a writer, when I look at the output of large language models, I see a kind of baseline. The question becomes: What makes my work better than this?

I’m not too worried about the idea that people will suddenly flock to AI-generated music, art, writing, and everything else. Who really wants to buy a “personal” piece of art made by a billionaire-backed server farm in Silicon Valley? There’s no story there.

What usually happens—and I think will continue—is that the best artists will take the tools and make something human with them. Not a replacement, but something reflective, unique, and alive. In the end, people want something that says something about us. I think it was Sam Harris, of all people, who remarked that he wouldn't bother reading a novel, even if it were the best novel in the world, if it were written by a computer. I would add that it couldn't be the best novel in the world either. It would involve statistical generalization across many, many data points, which means it wouldn't really be saying anything genuinely unique.

Sunil Malhotra

Thanks for this searing critique of Dataism, Erik.

Your message is clear and urgent. We are not machines, and treating ourselves as such is a fast track to nihilism.

Love that you call for a return to the "old stories"—myths, values, and philosophical questions that remind us we're more than data. Without them, even the most optimised lives risk becoming deeply unfulfilling.

The Vedic texts, Yoga, and the Bhagavad Gita are pointers to those parts of ourselves that transcend the mechanical brain. Bringing Yoga to the AI party might be one good way to shift the existing paradigm.

Btw, Harari does Vipassana. Give Yoga a chance.

Erik J Larson

Hi Sunil, I don’t have anything against yoga—there’s real value there. But I think we’ve split into two dominant postures: acquiescence, and performative outrage. The first gets dressed up as wellness or “self-care,” the second as political action, but neither one generates real agency or constructive change.

We’ve got techniques to make ourselves comfortable with powerlessness, and then we have these bursts of anger that play out more like theater—amplified by social media, but often disconnected from any clear goal or ground-level reality. There’s no revolution without clarity, and most people don’t know what we’re even fighting anymore. So a lot of that energy just turns inward—we end up fighting ourselves.

The question isn’t whether yoga is good or bad. It’s what role it plays in a culture that’s increasingly lost its bearings. Are we waking up—or checking out?
