The Vision

  • The World is Digital

    “It is appointed for man to die once, and after that comes judgment.” (Hebrews 9:27)

    Older generations often complain that young people can’t write in cursive. In the past, cursive was universal, but today, fewer schools even teach it. Soon this skill (and the ability to read cursive) may vanish entirely. I understand the frustration. Handwritten communication was once essential, and many mourn its decline. But we must recognize that typing has replaced it. In the past, authors wrote by hand, then hired typists to transcribe their work for readability. Why didn’t they type themselves? They lacked the skill. Typing was once specialized knowledge. Today, if you can’t type, you can’t function in society. Viewed this way, trading penmanship for typing isn’t such a terrible exchange.

    Another disappearing skill is reading analog clocks. Children once learned to tell time on them because most clocks were analog. But as digital clocks have become ubiquitous, fewer people master the old way. This is also a natural historical progression. Analog clocks existed only because we lacked the technology for digital ones. Digital clocks require electronic circuits and displays, while analog clocks run on simple mechanical movements. Yet the information that analog clocks present, the position of hands on a dial, is not immediately comprehensible. People must acquire the skill to interpret them. We needed that skill only because we couldn’t build digital clocks. Now that we can, there is no reason to learn the old method.

    Some argue, “Analog clocks are still better because the world itself is analog.” Consider the time between 1:00 and 2:00; there is no gap. An analog clock’s hands pass through infinite points between them. A digital clock, however, displays only discrete moments: 1:00:00, 1:00:01, and so on. The same applies to visual displays. The world appears continuous: no jagged lines, only smooth transitions between objects. Even the best screens consist of dots with visible gaps if you look closely. Common sense suggests the world is analog, and digital technology is merely an imperfect approximation of this analog reality.

    But if you think this way, you are in for a shock. The world is digital. Analog is how we perceive reality because we cannot see it as it truly is. We discovered this through quantum physics.

    The Brave New World

    Max Planck introduced quantum physics in 1900. While studying blackbody radiation, Planck discovered that energy could only be emitted or absorbed in discrete packets he called “quanta.” This was the first proposal that energy is not continuous. In 1905, Albert Einstein extended Planck’s idea to explain the photoelectric effect, where light striking metal ejects electrons. He proposed that light itself consists of energy quanta, later called photons. This discovery earned him the Nobel Prize (even though his theory of relativity went unrecognized at the time). Then in 1913, Niels Bohr applied Planck’s quantization to atomic structure, proposing that electrons orbit the nucleus only at specific, quantized energy levels and emit or absorb light when jumping between them.

    Scientists soon realized that quantum physics would revolutionize our understanding of reality. As mentioned earlier, most people still conceive of the world as analog and continuous. They struggle to imagine that light cannot be smaller than a photon. Consider a lamp with a dimmer switch. As you lower it, the lamp dims. But what happens when you turn it down as far as possible without switching it off? Many assume it emits a faint glow. In fact, if there is not enough energy to produce a single photon, the lamp emits no light at all. There is no such thing as half a photon, just as there is no pixel between two dots on a screen.
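    The lamp example comes down to simple arithmetic: light of a given wavelength carries energy only in whole packets of E = hc/λ, and a source must supply at least one packet’s worth to emit anything at all. A minimal sketch of that calculation (the 1 W lamp power and 550 nm green wavelength are illustrative assumptions, not figures from the text):

```python
# Planck's relation: light of wavelength lam comes only in whole packets
# of energy E = h * c / lam. The 1 W power and 550 nm wavelength below
# are illustrative assumptions.
H = 6.626e-34  # Planck constant, J*s
C = 2.998e8    # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of a single photon: the smallest possible packet of this light."""
    return H * C / wavelength_m

def photons_per_second(power_w, wavelength_m):
    """How many whole photons a source of the given power can emit per second."""
    return power_w / photon_energy(wavelength_m)

print(f"one green photon: {photon_energy(550e-9):.2e} J")
print(f"1 W lamp emits:   {photons_per_second(1.0, 550e-9):.2e} photons/s")
```

    A single green photon carries only a few times 10⁻¹⁹ joules, which is why ordinary lamps seem continuous: they emit photons by the quintillions every second.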

    The fact that the world is digital, that things cannot be divided infinitely, resolves Zeno’s paradoxes. One famous example is Achilles and the Tortoise. It argues that a faster runner (Achilles) can never overtake a slower one (the tortoise) with a head start, because Achilles must first reach the point where the tortoise began. By that time, the tortoise has moved farther ahead. This continues infinitely, suggesting Achilles would need infinite steps to catch the tortoise, making motion seem impossible. But as we have seen, the world cannot be divided infinitely (and some physicists suspect even time and space are quantized). That is why Achilles can overtake the tortoise.
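    If space really cannot be divided below the Planck length (about 1.6 × 10⁻³⁵ m), Zeno’s “infinite” regress turns out to be remarkably short. A quick sketch with hypothetical numbers (Achilles running ten times faster than the tortoise, which starts 10 m ahead, so each remaining gap is one tenth of the previous one) counts how many catch-up steps fit before the gap drops below that scale:

```python
# If space cannot be divided below the Planck length, Zeno's regress
# terminates. The speeds are hypothetical: Achilles is ten times faster,
# so each remaining gap is one tenth of the last.
PLANCK_LENGTH = 1.6e-35  # metres, approximate

def steps_to_planck(head_start_m, gap_ratio):
    """Count catch-up steps until the remaining gap falls below the Planck length."""
    gap, steps = head_start_m, 0
    while gap >= PLANCK_LENGTH:
        gap *= gap_ratio
        steps += 1
    return steps

print(steps_to_planck(10.0, 0.1))  # a few dozen steps, not infinitely many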

    The quantum world behaves differently in other ways as well. Consider motion: when we climb stairs, we pass through every point between steps. But quanta do not move continuously. They vanish from one energy level and reappear in another. This process is called a quantum leap. How something can disappear and reappear elsewhere remains mysterious. Yet that is how nature operates at the smallest scales.

    Even the boundary between possible and impossible shifts in the quantum realm. In our world, a fragile object cannot break a solid one; an egg will never crack a rock. But in quantum physics, sometimes it can. This effect, called quantum tunneling, allows particles to pass through barriers they should not cross. Nuclear fusion, for instance, occurs when hydrogen atoms combine to form helium, generating the energy that makes the sun shine. Even though the pressure inside the sun is not high enough to overcome the atoms’ mutual repulsion, because of quantum tunneling, some atoms fuse anyway.
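    How unlikely is tunneling? The chance can be estimated with the standard barrier-penetration approximation T ≈ e^(−2κL), where κ = √(2m(V−E))/ħ. A sketch for an electron (the 1 eV barrier excess and 1 nm barrier width are illustrative assumptions, not figures from the text):

```python
import math

# Standard barrier-penetration estimate T ~ exp(-2 * kappa * L), with
# kappa = sqrt(2 * m * (V - E)) / hbar. The electron, 1 eV barrier
# excess, and 1 nm width are illustrative assumptions.
HBAR = 1.055e-34  # reduced Planck constant, J*s
M_E = 9.109e-31   # electron mass, kg
EV = 1.602e-19    # one electron-volt, J

def tunneling_probability(barrier_excess_ev, width_m):
    """Chance the particle appears beyond a classically forbidden barrier."""
    kappa = math.sqrt(2 * M_E * barrier_excess_ev * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# Classically impossible, quantum-mechanically merely improbable:
print(f"{tunneling_probability(1.0, 1e-9):.2e}")
```

    The probability is tiny for any single particle, but the sun contains so many hydrogen nuclei that even these rare events add up to sustained fusion.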

    Quantum physics also transforms our concept of cause and effect. Einstein supposedly said, “Insanity is doing the same thing over and over again and expecting different results.” But according to quantum physics, you should do the same thing repeatedly and expect different results. Consider a window. You can see through it and also see your reflection. Why? Some photons pass through while others reflect. But which photons pass through, and which do not? There is no rule for this. The same photon sometimes passes through and sometimes reflects. You cannot know why or how, but you can calculate the probability.
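    The window example can be made concrete. For ordinary glass (refractive index about 1.5), roughly 4% of photons reflect at each surface at normal incidence, a figure that follows from the Fresnel formula R = ((n−1)/(n+1))². Which individual photon reflects is random; only the rate is fixed. A sketch of that coin-flip picture (the refractive index and photon count are illustrative assumptions):

```python
import random

# Each photon either reflects or transmits at random; only the probability
# is fixed. For glass with refractive index n = 1.5, the single-surface
# reflectance at normal incidence is ((n - 1) / (n + 1))**2, about 4%.
def reflectance(n):
    """Fresnel reflectance at normal incidence for one glass surface."""
    return ((n - 1) / (n + 1)) ** 2

def count_reflected(num_photons, n, seed=0):
    """Simulate photons one by one; return how many happen to reflect."""
    rng = random.Random(seed)
    r = reflectance(n)
    return sum(1 for _ in range(num_photons) if rng.random() < r)

print(f"reflectance: {reflectance(1.5):.2%}")  # about 4%
print(count_reflected(100_000, 1.5))           # roughly 4,000 of 100,000
```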

    The most puzzling quantum phenomenon is entanglement. When particles become entangled, their states link so completely that one cannot be described independently of the other, even across vast distances. Measuring one particle instantly reveals the other’s state. No signal passes between them. The system behaves as a single, unified whole. The implications are profound. It suggests correlations that defy classical explanation. If distant particles appear to influence each other instantly, does something move faster than light? According to Einstein’s relativity, that should be impossible. Yet quantum physics says otherwise.

    Cat and Spider

    Because quantum physics was so strange, even many scientists struggled to accept it. Einstein, one of quantum physics’ founders, was unhappy with how it developed. He called entanglement “spooky action at a distance” and disliked the idea that nature fundamentally involved randomness. His famous quote, “God does not play dice with the universe,” reflects his discomfort with probability and measurement in the theory. (Niels Bohr, who developed the mainstream interpretation of quantum physics called the Copenhagen interpretation, replied, “Einstein, stop telling God what to do.”) Erwin Schrödinger, another key figure in quantum physics, agreed with Einstein in rejecting the Copenhagen interpretation. To demonstrate how ridiculous quantum physics’ claims were, he proposed a thought experiment involving a cat. Imagine a cat, a flask of poison, and a radioactive source sealed in a box. If a radiation monitor detects radioactivity (a single atom decaying), the flask shatters, releasing poison that kills the cat. If no radioactivity is detected, the cat lives. In the quantum world, things exist in superpositions—two different states existing simultaneously. But if that’s true, can a cat exist in superposition, both dead and alive? Of course not. Therefore, it’s absurd. (By the way, some misunderstand Schrödinger’s cat as illustrating quantum physics, but it actually criticizes it.)

    What happens when a quantum event occurs? Some scientists believe the universe branches. In Schrödinger’s cat scenario, there’s a world where radioactivity happens (and the cat dies) and another where it doesn’t (and the cat lives). Since each world contains countless quantum events, the number of universes is effectively infinite. This is called the Many-Worlds Interpretation (MWI). Though a scientific interpretation, it has become enormously popular in entertainment. Many movies and TV shows explore the multiverse concept. Spider-Man: No Way Home is a prime example, in which three Spider-Men from three universes meet. (In MWI, multiple versions of the same character exist, so there can be different Spider-Men.) The animated film Spider-Man: Into the Spider-Verse explores the same theme, imagining a multiverse (called the Spider-Verse) in which each world has its own version of Spider-Man. These movies show that quantum physics is no longer an esoteric branch of science but a major part of popular culture.

    Science and Eastern Religions

    The double-slit experiment, one of physics’ most famous and mind-bending demonstrations, reveals the observer’s role in quantum physics. Imagine a board with two thin parallel slits, and behind it a screen recording where particles land. Shoot photons toward the slits. When you shoot many photons, they interfere, creating an interference pattern. But what happens when you shoot one photon at a time? You’d expect a different pattern since there can’t be interference when there is nothing to interfere with. Yet over time, an interference pattern emerges. Somehow, the photon interferes with itself. To solve this mystery, you place detectors to check which slit the particle passes through. When you repeat the experiment, the interference pattern disappears, and you get two bands, as if particles only went through one slit each. Observation changes the outcome.
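    The one-photon-at-a-time result can be mimicked by treating the two-slit fringe pattern as a probability distribution and sampling single landing spots from it. The slit spacing, wavelength, and screen distance below are illustrative assumptions; the point is that individually random dots still pile up into fringes:

```python
import math
import random

# Single photons sampled from the two-slit probability density
# I(x) ~ cos^2(pi * d * x / (lam * D)) (small-angle form). Slit spacing d,
# wavelength lam, and screen distance D are illustrative assumptions.
D_MM, LAM_MM, DIST_MM = 0.1, 5e-4, 1000.0  # fringe spacing = lam*D/d = 5 mm

def intensity(x_mm):
    """Relative two-slit fringe intensity (0..1) at screen position x."""
    return math.cos(math.pi * D_MM * x_mm / (LAM_MM * DIST_MM)) ** 2

def land_one_photon(rng, half_width_mm=15.0):
    """Rejection-sample one photon's landing spot from the fringe pattern."""
    while True:
        x = rng.uniform(-half_width_mm, half_width_mm)
        if rng.random() < intensity(x):
            return x

rng = random.Random(0)
hits = [land_one_photon(rng) for _ in range(20_000)]

def near(center_mm):
    """Count photons landing within 0.5 mm of a given screen position."""
    return sum(1 for x in hits if abs(x - center_mm) < 0.5)

print(near(0.0), near(2.5))  # crowded at the bright fringe, sparse at the dark one
```

    Each sampled photon lands at a single, unpredictable spot, yet the histogram of thousands of them reproduces the interference fringes, just as the real experiment does.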

    This challenges the traditional concept of the scientist as a “fly on the wall” observing without influence. In quantum physics, things behave differently when observed. In the double-slit experiment, before measurement, the photon exists in a superposition, going through both slits simultaneously. But measurement “collapses” this superposition into a single outcome: one slit or the other. This opens the door to speculation. Paul Davies, for example, speculates that the world produced humans with consciousness so they could collapse quantum superposition. The New Age Movement in particular tries to interpret quantum physics to support its worldview. It claims that your mind shapes the world, a claim that underlies its positive-thinking teachings. Buddhism says, “Everything is created by the heart.” Traditional physics offers no way to harmonize this Eastern view with science. Quantum physics, at least for the New Age Movement, seems to. Of course, what quantum physics actually says differs greatly from New Age claims, but many in the movement use quantum physics as an entry point for their religion. In fact, as many quantum physics books seem to be written from a New Age perspective as from a genuinely scientific one.

    Physicist Fritjof Capra’s The Tao of Physics: An Exploration of the Parallels Between Modern Physics and Eastern Mysticism explores profound connections between quantum physics’ conceptual foundations and Eastern spiritual traditions like Taoism, Buddhism, and Hinduism. Capra argues that fundamental quantum physics principles resonate with Eastern metaphysical paradigms. The quantum complementarity principle, which holds that particles can exhibit dual, seemingly contradictory properties (wave and particle aspects) depending on measurement context, closely parallels Taoism’s yin-yang duality. Yin and yang represent complementary, interdependent forces whose dynamic interaction sustains cosmic balance and unity. Similarly, quantum physics reveals a universe where opposites coexist in a seamless, interconnected whole rather than in rigid conflict. Capra emphasizes that quantum theory describes a web of relations in which observer and observed form an inseparable unity. This echoes Eastern mysticism’s emphasis on holistic interconnectedness and the fundamental role of consciousness. Most scientists believe this book goes too far in finding parallels between quantum physics and Eastern religions. Nevertheless, it shows how quantum physics can be a starting point for a fresh view of Eastern religions.

    Life in the Matrix

    One question people ask about quantum physics is: “Why is the world digital?” Digital sounds artificial. The analog world seems more natural. Some think the world is digital because we are living in an artificially created world, like the world of The Matrix. In this film, people live in a computer simulation while in reality they exist only in fluid-filled pods, serving as nutrients for machines. Yet some people seriously believe that this is how we are living. Rizwan Virk, an MIT-educated computer scientist, explores this possibility in his book The Simulation Hypothesis. Another proponent is Elon Musk, who famously states there is a “one in billions” chance we are living in base reality. Nick Bostrom, a Swedish philosopher, also supports this hypothesis. Bostrom argues that if advanced civilizations create many simulations, then the number of simulated conscious beings would vastly outnumber those in the original base reality. A randomly selected conscious entity is therefore much more likely to be simulated than real. Paul Davies makes a similar point. Since simulations can contain simulations, and so on, our chances of being in a simulation are much higher than being in the real world.
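    Bostrom’s counting argument is simple arithmetic. Assume (purely for illustration) one base reality running N simulations, each containing as many conscious beings as the base; a randomly chosen being is then simulated with probability N/(N+1), which approaches certainty as N grows:

```python
# Bostrom-style head count, with made-up numbers: one base reality runs
# num_sims simulations, each containing as many conscious beings as the
# base. A randomly chosen being is then simulated with probability
# num_sims / (num_sims + 1).
def prob_simulated(num_sims):
    """Chance a random observer lives inside a simulation, not base reality."""
    return num_sims / (num_sims + 1)

for n in (1, 10, 1_000_000):
    print(n, f"{prob_simulated(n):.6f}")
```

    With a million simulations, the chance of being in base reality is about one in a million; this is the arithmetic behind Musk’s “one in billions” remark.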

    In Steven Spielberg’s Ready Player One, we see a future in which people in a dystopian world escape into a virtual reality game. There, you can be anyone, forgetting the misery of the real world. But they soon realize the danger of living too much in a virtual game world and decide to limit their playing time. What if we are stuck in a simulation? Then there is no way to escape, unlike in The Matrix, where brave people manage to escape into the real world.

    One major problem with thinking we are in a simulation is that it eliminates right and wrong. Just as people can kill and steal without guilt in a game like Grand Theft Auto, if you think you are in a simulation, you can do anything without considering moral implications. The two high school students who killed 13 people at Columbine High School in 1999 reportedly wore long black trench coats, and some commentators linked the look to the character Neo in The Matrix, released only weeks earlier, who could kill anyone because it was all a simulation. Thinking we are living in a simulation might sound attractive, but it can destroy life’s meaning and morality.

    Science and the Meaning of Life

    With quantum physics, science entered a new era by detaching itself from understanding the world. At science’s beginning in ancient Greece, it was closely connected to life. Aristotle, who laid the foundation of Western scientific tradition, said everything that exists or changes can be explained by four causes: the material cause (what something is made of), the formal cause (its form or blueprint), the efficient cause (the agent of change), and the final cause (the purpose or end goal). This was an attempt to understand the world through meaning. Things fall, for example, because they naturally move toward their proper place. In that view, every physical movement is meaningful because it brings order to the world. In his worldview, everything moves toward a goal, eventually leading to what Arthur O. Lovejoy called the Great Chain of Being.

    With the birth of modern science, science rejected the concept of goal or meaning. For Galileo, a scientist’s job was not to know why something happened. Unlike Aristotle, he didn’t want to know why things fall. He wanted to know how things happen—in this case, how to calculate a falling object’s speed. Science was no longer about “Why?” or purpose, but about “How?” or mechanism.

    With quantum physics, science entered another phase: scientists don’t even seek to understand how things work. For instance, they don’t try to explain quantum physics’ mysterious aspects mentioned above. Quantum physics’ appeal isn’t its power to explain how the world works, but its power to calculate probability. Physicist N. David Mermin’s phrase, “Shut up and calculate!” captures this new attitude. As Sean Carroll lamented, scientists lost interest in asking deep questions.

    This shift in science mirrors a broader change in society. In the past, life was guided by purpose and meaning. For instance, in Confucian Korea, people lived for their parents—a concept that may sound unfamiliar to modern minds but gave life a clear direction. Today, schools teach how to live, but not why. As a result, they produce capable and intelligent individuals who nonetheless lack a sense of purpose. The move toward abstract and purposeless calculation in science is, therefore, a natural outcome of this wider loss of meaning.

    Reliable Knowledge

    Quantum physics challenges our perception of science as reliable knowledge. I often hear people say, “I don’t need God because I have science.” For them, science isn’t just a tool for understanding the world but an alternative to religion. Their position grows from the West’s traditional rationalism. But what is science, and how reliable is it? If it says, “There are many worlds and different versions of you,” should I believe that without doubt? But what about Einstein or Schrödinger, brilliant scientists who rejected what other scientists accepted? Are they allowed to question science, but not us? Are scientists the new guardians of truth?

    First, let’s consider what science is. In most European languages, the word for science derives from the word for knowledge (English “science” comes from the Latin scientia, meaning knowledge). But science is not simply knowledge. It is a special kind. Science is a rational system of knowledge about the physical world. Scientists don’t try to understand anything non-physical: if they study non-physical entities like ghosts, it’s only when those entities manifest in the physical world. This focus on the physical world has misled many to believe science rejects the spiritual world. It doesn’t. Rather than saying, “There is nothing outside the physical world,” it says, “We only study the physical world.”

    Science originated in the Renaissance when people like Galileo Galilei began conducting experiments to understand the physical world. Francis Bacon, another Renaissance thinker, famously used the metaphor of spider, ant, and bee to describe three approaches to knowledge. According to him, men of experiment are like ants who only collect material. Reasoners resemble spiders who spin webs from their own substance. The bee takes a middle course: it gathers material from flowers but transforms and digests it by its own power. For him, science differed from medieval philosophy, which was largely speculative, and was more than just collecting data from nature. Rather, it was building a knowledge system based on experiment.

    But science, which in its early days did not even have a name and was called natural philosophy, soon impressed people with amazing discoveries. People stopped believing Earth was the universe’s center because of the findings of Copernicus, Galileo, and Kepler. Then Isaac Newton demonstrated science’s power with his law of universal gravitation. With modern science, Europeans had a system that could replace Aristotle’s old one.

    Chicken and Swan

    Philosophers, Europe’s traditional guardians of truth, didn’t just sit and watch science’s rise to power. They examined science’s philosophical basis and found it lacking. David Hume, the Scottish philosopher, was most successful in casting doubt on science’s reliability. He argued that induction—the process of moving from specific observations to general laws or predictions—is not rationally justified. According to Hume, just because a pattern has been observed repeatedly in the past doesn’t guarantee it will continue in the future. This is the “problem of induction.” Hume pointed out that scientific theories rely on the assumption that the future will resemble the past, but this assumption itself cannot be logically proven or justified without circular reasoning. Therefore, science’s foundational method, induction, is epistemologically insecure.

    Consider gravity. I heard a scientist say, “Gravity works the same everywhere in the universe.” But how does he know? Isn’t that his personal belief? Or did he travel everywhere in the universe to verify gravity works the same there? You might say, “So far, it’s worked everywhere.” But past events don’t guarantee future ones. A line often attributed to Hume puts it this way: “No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion.” In other words, your past experiments might only indicate your failure to find one example that will disprove your theory. Bertrand Russell used the story of a chicken fed daily by a farmer to make Hume’s same point. The chicken observes the farmer coming every morning with food and concludes this will always happen—forming a “law” based on past experience. However, one day, instead of being fed, the chicken is slaughtered. This shows the flaw in assuming past regularities will continue unchanged into the future.

    In the twentieth century, Karl Popper continued the philosophical attack on science. If Hume is correct, and no one has refuted him, then you cannot “prove” something as a universal law—you can only falsify something as wrong. Science, then, is not a collection of proven theories but a collection of theories not yet falsified. That is one reason the term “law” has largely fallen out of favor in science. In the past, people thought scientists discovered nature’s laws. But if they are laws (in the sense of how nature works), they’re unchanging. So science now tends to speak of theories, like the theory of relativity, rather than laws, like the law of gravity.

    Paradigm Shift

    Thomas Kuhn is a pivotal thinker who fundamentally reshaped our understanding of how science operates and evolves. His influential work, The Structure of Scientific Revolutions, challenges the classical notion that scientific progress is a straightforward, cumulative process in which knowledge steadily builds upon itself over time. Instead, Kuhn presents a much more nuanced picture, proposing that science advances through a cyclical pattern of development involving distinct phases. During periods of “normal science,” researchers work within an established framework or paradigm (a widely accepted set of theories, methods, and assumptions) to address specific problems, often described as puzzles, that fit neatly within this conceptual structure. However, Kuhn explains that such periods of normal science are inevitably interrupted by crises, which arise when persistent anomalies accumulate. At some tipping point, the existing paradigm is no longer tenable, and a revolutionary shift occurs. This revolution leads to the adoption of a new paradigm that better accounts for the observed phenomena, fundamentally transforming how scientists view the world. A classic historical example is the shift from geocentrism, the Earth-centered universe model, to heliocentrism, which places the Sun at the center. Initially, scientists laboriously tried to reconcile data with geocentrism, but as contradictory evidence mounted, the model could no longer sustain its explanatory power. Eventually, scientists abandoned it in favor of the heliocentric model, a classic example of what Kuhn terms a “paradigm shift.”

    This transition, Kuhn emphasizes, is neither straightforward nor universally accepted. Paradigm shifts are met with resistance from those deeply rooted in the old ways of thinking: scientists, institutions, and even entire academic communities reluctant to abandon familiar frameworks. This resistance to change is eloquently captured by physicist Max Planck’s observation: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” Thus, Kuhn’s view highlights science’s human and social dimensions, revealing it as a complex cultural enterprise rather than a purely objective, logical endeavor.

    Acceptance or rejection of scientific ideas, Kuhn suggests, depends as much on social factors (familiarity, generational shifts, community consensus) as on evidence or formal logic. This social aspect explains why certain scientific debates endure. For instance, Albert Einstein famously disputed aspects of quantum physics late in his life. However, most contemporary scientists do not share Einstein’s rejection because they belong to a new generation raised amidst quantum theory’s successes. They accept quantum physics not merely for its empirical merits but also because it has become the dominant paradigm within modern science.

    Kuhn’s insights invite us to critically reflect on our current scientific landscape. It raises the provocative question of how much of today’s “modern science” might eventually be replaced or revised. Are we, in fact, embracing specific scientific ideas because they resonate with the intellectual climate of this era? If scientific truth is contingent upon prevailing paradigms and their acceptance within communities, then it suggests that science is more a social construction than an unchanging absolute truth. This perspective encourages humility in evaluating scientific knowledge, acknowledging its evolving and culturally embedded nature.

    Science and Scientism

    While philosophers successfully attacked science’s foundation, often forcing scientists to concede by adjusting their truth claims, many scientists remained blissfully ignorant of (or purposefully ignored) those challenges. Eventually, they built a belief system called scientism. Scientism is excessive belief in science’s certainty. While science is a human attempt to understand the world, scientism sees science as the only reliable way to understand the world, thus elevating science to religion’s realm. Richard Dawkins is a good example. Dawkins firmly believes in science as the ultimate reliable tool for understanding the natural world and rejects religious faith as belief without evidence. He treats God’s existence as a scientific hypothesis and argues against faith, considering it harmful. Dawkins advocates for a worldview where scientific knowledge progressively replaces superstition, emphasizing empirical evidence and naturalistic explanations like evolution and abiogenesis. If a scientist doesn’t believe in God, that’s his choice. But if he says, “Science rejects God’s existence,” he’s abusing his position as a scientist. It implies science has jurisdiction over spirituality, judging which religion can be accepted. Science studies the physical world—there’s no way it can determine whether the spiritual world exists. But scientism doesn’t accept science’s limits, believing humans can understand anything through science. The problem is that many science promoters actually promote scientism, like Neil deGrasse Tyson, Stephen Hawking, and Carl Sagan. Because of these scientism proponents, many think science is an absolutely reliable method for understanding the world, and religion has no place in modern society. If someone considers this deeply and reaches such a conclusion, there’s nothing anyone can do. But it would be sad if they believe this because they embraced scientism’s message, thinking it’s science’s message.

    Scientism can even infiltrate the church. I once heard a Christian scientist say, “If our faith is disproved by science, we should give up faith.” So he thinks Christians should give science the power to judge faith, while faith has no such authority over science. Then we’d have to pray to science, “Please don’t destroy our faith!” Being Christian means accepting Jesus as Lord. There should be no other god that can destroy our faith in Christ.

    Scientism is closely connected with positivism, a philosophical system holding that genuine knowledge is only that based on empirical observation, scientific verification, and logical reasoning. Positivism became the philosophical foundation for scientific naturalism, empiricism, and later logical positivism in the twentieth century. But it has a fundamental weakness that led to its decline and eventual demise. Logical positivism’s core doctrine is the verification principle, which says a statement is meaningful only if it’s empirically verifiable (through observation or experiment) or analytically true (true by definition, like logic or math). But the verification principle itself is neither empirically verifiable nor analytically true. In other words, it says, “Everything should be verifiable, but we cannot verify the statement ‘Everything should be verifiable.’” It rejects other beliefs for making claims without verification, yet its own claim is no different. Once philosophers realized this, nobody followed it. Scientism has positivism as its foundation. But since most people do not see that scientism’s claim is just that, a claim, they still accept arbitrary assertions such as “Science is the only way to truth.”

    Science and Faith

    Nobody can deny that science is the most important area of society this century. We believe science has special authority that other branches lack. That is why if you say, “This is what theologians say” or “This is what philosophers say,” nobody takes you seriously. But if you say, “This is what scientists say,” everyone pays attention. Doctors are respected because they are scientists who practice science on the body. When something must be done without fail, we call it “doctor’s orders.” Nobody says “pastor’s orders,” because pastors have no authority today among the general public.

    But we should also consider whether science really deserves this level of trust. First, as we have seen, it has a philosophically weak foundation. An honest scientist would say, “I am trying to develop a hypothesis that can withstand attempts to prove it wrong. And what I do is heavily influenced by my culture,” instead of saying, “I am going to show how the world works. My discovery is an absolute and timeless truth.” Second, with quantum physics’ birth, there are many things scientists gave up trying to understand. Science is reduced to calculating probability rather than understanding how the world works.

    There was a time when many Christians opposed science, believing it was harmful to their faith. I do not think that is the case today. However, now the danger is embracing science as much as non-Christians do, accepting it as an authority superior to all. Christians are called to be in the world without being of the world. Whatever we face, we should “take every thought captive to obey Christ” (2 Corinthians 10:5). Science is no exception. We should neither reject nor worship science.