Jesus Christ is the same yesterday and today and forever. (Hebrews 13:8)
Cape Town sits where two ocean currents meet. When I visited, the beach on the west side was too cold to even wade in—the water flows up from Antarctica. But on the east, we could swim comfortably in the warm current from the equator. A short distance made all the difference. Just as Cape Town marks where two currents converge, history has moments where two eras collide. The Renaissance is one example. It marked the end of the Middle Ages—an age of faith but limited knowledge—and the beginning of Modern Times, characterized by reason, nationalism, and individualism. If the Renaissance ended the Middle Ages, their beginning coincided with what Edward Gibbon called “The Decline and Fall of the Roman Empire.” Before Rome declined, the world knew Pax Romana, economic prosperity, and classical culture. After came the Catholic Church’s dominance, feudal structures, and monastic learning.
But the greatest fault line in history occurred just a century ago. Virginia Woolf, with prophetic intuition, declared that “on or about December 1910, human character changed.” Mortimer Adler observed, “There is a clear break between this [the 20th] century and the twenty-five centuries that precede it in the tradition of Western civilization.” What happened between the late 19th and early 20th centuries? How did this new world emerge?
The Medium is the Message
One cause of this historic shift was the proliferation of media. Traditional societies had few information channels. The 19th century brought newspapers and magazines. The 20th century’s electronic media accelerated the explosion exponentially. When I grew up in Korea, television offered only four channels. This was common worldwide—most countries had four or five channels, with one or two reserved for special interests, leaving viewers three main options. It was a time when half the nation tuned in to the same show together. Then cable television arrived, breaking technological barriers and offering over 100 channels. The internet followed, and today YouTube provides virtually unlimited choices. We now face an overwhelming flood of video content. With most people carrying camera-equipped phones and countless upload platforms, we encounter seemingly endless streams of content daily. Now AI technology generates increasingly realistic videos—completely convincing fakes are inevitable.
This proliferation has brought a surge in fake news. Many believe fact-checking is the solution, but this is naive. Consider: who should fact-check? If CNN does it, critics cry “fake news fact-checks fake news.” If the BBC does it, skeptics claim “Europeans can’t be objective.” Government fact-checking? Then who adjudicates when governments contradict each other—as when South Korea said masks prevented COVID’s spread while Sweden said they didn’t?
Even simple questions prove nearly impossible to verify. Take the suspicion that companies like Google or Facebook eavesdrop through our phones. On one hand, countless people report talking about something, then seeing it advertised on their devices. On the other, experts insist such surveillance would be prohibitively expensive, and the companies issue categorical denials. Who can determine fact from fiction? Just as hallucination is inherent to AI, fake news is unavoidable today.
Jean-François Lyotard, who first used “postmodern” philosophically, begins The Postmodern Condition by discussing computers’ widespread impact. Postmodernism emerged not from philosophical arguments but from changes in media. When only a handful of information sources existed, they commanded extraordinary authority. Now people believe what they want, and no one can effectively fact-check. The traditional framework for determining truth and identifying authoritative information has broken down. We live in a world simultaneously freer and more confusing. This is postmodern life.
Disappearing Photos
These changing times appear clearly in social media. Consider Facebook and Instagram. Though owned by the same company, they embody distinctly different philosophies. Facebook, the older platform, tries to let users do everything. You can write unlimited text, create multiple photo albums, share files and links, and join communities. Its messaging app is so robust that virtually everyone in Sweden uses it. Instagram differs fundamentally. It began as a simple photo-sharing app and maintains that simplicity. You’re limited to 2,200 characters and 20 photos per post, and you cannot share files or links. There are no communities. These constraints mean you cannot use Instagram as a homepage the way you can Facebook. This makes Facebook seem superior, yet it’s not. I rarely see young people actively using Facebook, while most embrace Instagram.
Facebook and Instagram reflect opposing worldviews. Facebook embodies the belief that more is better—a perspective dominant throughout human history. From the beginning, humans always needed more than they had. Anything offering more was considered better. Even in the early internet age, services aimed to provide more. Facebook, launched in 2004, was designed around this worldview.
But people soon realized the digital world offered too many free options, leading to decision fatigue. What they needed was a service that didn’t burden them with endless choices. That’s why Instagram, launched in 2010, became popular. It deliberately limits what you can do. For older generations, this feels like a disadvantage. For younger users, it’s an advantage—it removes the burden of choice.
Consider the story function. Today, most social media platforms (including messaging apps like WhatsApp) offer stories—posts that disappear within 24 hours. Most older people don’t understand why valuable posts should vanish. Most young people think it’s brilliant. They don’t want photos lasting forever. There are too many photos anyway. They prefer ephemeral posts that don’t clutter their profiles.
Stories were adopted after platforms observed young people using Snapchat, where images disappear shortly after viewing. While photos on most social media persist indefinitely, Snapchat photos vanish by design. Why do young people love this? Because when the entire service provides only temporary access, they feel free to share anything spontaneously—even embarrassing moments—without overthinking. The irony is striking: while humanity worked tirelessly to preserve data—carving stone, copying manuscripts by hand, building libraries—in the digital age, data preservation is the last thing young people want. We truly inhabit a different world, producing different culture, different attitudes, and even a different worldview.
The Dawn of a New Era
Around the turn of the 20th century, signs of a new age appeared throughout art and science. Picasso opened a new chapter in fine art with Les Demoiselles d’Avignon (1907). The painting is packed with revolutionary ideas that eventually replaced traditional concepts like perspective and mimesis. In music, Igor Stravinsky heralded the new age with The Rite of Spring (1913). Harmony gave way to primitive energy; even silence could replace beautiful melody. In literature, T.S. Eliot’s poem “The Waste Land” (1922) and James Joyce’s novel Ulysses (1922) demonstrated completely new approaches to poetry and fiction. The change was so radical that C.S. Lewis observed it was the biggest break in tradition since the Epic of Gilgamesh. The Lumière brothers invented the Cinématographe in 1895, and D.W. Griffith revealed cinema’s potential through The Birth of a Nation (1915). A new medium was born that would dominate popular culture for the century to come.
In science, Max Planck laid the foundation of quantum physics in 1900, and Albert Einstein published his theory of special relativity in 1905. This meant the end of Newtonian physics and a new worldview that would transform how people think. Movies like Interstellar or Everything Everywhere All at Once wouldn’t exist without relativity and quantum physics.
A new academic discipline emerged that would shape the modern world more profoundly than traditional fields. Psychology was born when Wilhelm Wundt founded the first laboratory for psychological research in 1879, but Freud’s Interpretation of Dreams (1899) popularized the field among the masses. We are so influenced by Freud that many expressions we use today—like repression, denial, and projection—come from him. With modern psychology came the realization of the deep, twisted human unconscious beneath the veneer of rationality. It was another nail in modernism’s coffin.
La Belle Époque
The death of Queen Victoria at the beginning of the 20th century symbolizes a world that ended with the new century. As ruler of the British Empire, Queen Victoria governed one-quarter of the Earth. Europe was almost entirely Christian, almost entirely white, and full of optimism. Every day brought new inventions making life easier and more enjoyable—electricity, cinema—and no one doubted this would continue.
It was also a time of Christian growth. The 19th century is called the great century of missions because Christianity spread to every corner of the world. Since the West dominated the rest of the world, many showed interest in Western religion, and Western governments could pressure other governments to ensure missionary freedom. In Europe, most people identified as Christians and attended church on Sunday. As the British sang “God Save the King” or Americans “Praise the Power that hath made and preserved us a nation,” they felt they were citizens not just of a powerful nation but of a nation blessed by God. Life was so good that the period between the end of the 19th century and World War I is called La Belle Époque (French for “The Beautiful Era”).
But during this Christian flourishing, the Danish philosopher Kierkegaard raised his voice, criticizing Christians whom he accused of “playing Christianity.” For him, true Christianity is about a personal relationship with God, not external signs like attending Sunday service. Christianity was meant to be a lived experience centered on an individual’s personal encounter with Christ and decision to follow him as a contemporary, not just a historical or societal tradition. He said a Christian should live before “an audience of One.” Just as the Reformers emphasized coram Deo (Latin for “in the presence of God”) as life’s basic attitude, his concept of the audience of One teaches the importance of putting God at the center of daily life.
One problem Kierkegaard observed in the Christianity of his day was its reliance on reason. The 19th century universally accepted rationalism, and it became central to Christianity as well. Being a Christian thus meant agreeing with well-formulated theological theses. But this also reduced Christianity to a branch of a great European intellectual system. To avoid this, he put inward decisiveness and passionate commitment at the core of Christian faith. His emphasis on the “leap of faith,” which he saw exemplified in Abraham’s offering of Isaac as a sacrifice, should be understood in this context.
The World without God
Friedrich Nietzsche, who died a few months before Queen Victoria, was another key figure at this historical turning point. Nietzsche was an astute observer of Western culture. When he said “God is dead,” he wasn’t discussing God’s non-existence (otherwise he would have said “God doesn’t exist”). Rather, he was reacting to the consequences of the Enlightenment and the rise of reason, science, and secularism, which had caused the foundational Christian worldview to lose authority and relevance. We live, as he predicted, in a world where belief in God is considered “unacceptable.” Europe went from full faith in God to no faith very quickly, and Nietzsche was the prophet of this change.
Europe has two cultural roots: Christian and Greco-Roman. During the Middle Ages, these streams harmonized, largely because Christianity was so dominant. But after the Renaissance, the Greco-Roman influence grew steadily stronger. Things came to a head with the Enlightenment—a movement trying to build society on reason alone. People had to choose between reason and faith, a choice they’d never had to make before. Eventually, most Europeans chose reason and abandoned faith.
But Nietzsche showed that reason’s victory contained the seeds of its own destruction. The Enlightenment emphasized criticism of authority. But what happens when you criticize reason’s authority? Or will you say, “You must not criticize reason because reason is sacred”? Then how is rationalism not just another religion? Eventually, Nietzsche realized the Enlightenment’s self-image as reason’s victory over illusion was itself an illusion. Imagine teaching your child to doubt every authority. How long before your child doubts your parental authority? That’s what happened with Nietzsche. By pushing Enlightenment logic to the extreme, he discovered the Enlightenment was something to overcome, just like Christian faith.
Even though many respect Nietzsche as a great teacher who taught us how to live, his life shows how difficult it is to live without God. He was so brilliant that he became a professor at 24. But he suffered a major mental breakdown at 44, from which he never recovered. In the past, people believed his breakdown was caused by syphilis, but today many doubt it. Rather, his rejection of faith seems to have been a real cause. He appears to have suffered from megalomania. For example, he called himself the Antichrist. His autobiography Ecce Homo contains chapters with self-laudatory titles like “Why I Am So Wise,” “Why I Am So Clever,” “Why I Write Such Good Books,” and “Why I Am a Destiny.” Since he did not believe in the virtue of humility, nothing stopped him when he felt he was a genius who needed to tell everyone how great he was. Because there was no God for him, life was all about the will to power, and life’s meaning was to become a superhuman. But when he had his breakdown, his brilliant mind couldn’t help him. He lived 11 more years, remaining incapacitated until he died in 1900.
The Counterculture
When Nietzsche criticized the Enlightenment, most people didn’t pay attention to him. They were too busy enjoying The Beautiful Era to realize they stood at history’s threshold. Only two world wars made people realize that the myth of progress was simply a myth, the idea of ending war through international trade was a grand illusion (as the movie The Grand Illusion shows), and the human heart was as dark as Joseph Conrad described in Heart of Darkness. Suddenly, the worldview that had ruled Europe for centuries since the Renaissance was abandoned, and people began seeking alternatives.
Baby boomers (or boomers), born in the years after World War II, wanted to build a new world without following their parents’ models. Their culture is called the counterculture. The counterculture movement supported several ideas. For example, they were the first to focus on environmental issues. Before them, almost no one paid attention to pollution or environmental protection, but now it’s common sense. They also championed racial and gender equality. Until the 1960s, many U.S. schools were still segregated—black students couldn’t study with white students. As the Rosa Parks incident shows, black people could sit only in the back of buses and had to give up their seats if white people needed them. The older generation considered this normal, but baby boomers saw it as an issue to address. They were a very political generation, producing many politicians. Both France’s Soixante-huitards and Germany’s Achtundsechziger (both meaning “sixty-eighters,” after 1968) dominated their countries’ politics for decades. In the U.S., every president since 1993, except two (Barack Obama and Joe Biden), was born in 1946. U.S. politics is still heavily controlled by Boomers.
The counterculture embraced Eastern culture, including Eastern religions. While the Enlightenment’s legacy led to atheism, the counterculture rejected atheism. They were not materialists; rather, they were seekers. In their search for spirituality and life’s meaning, many experimented with Eastern religions like Hinduism or Buddhism, and with the New Age movement, a Westernized synthesis of Eastern religions. Many young Europeans traveled all the way to India by road (Afghanistan was open to travelers then, making India reachable by land from Europe), practicing yoga and meditation, which eventually became popular in the West. Many Indian words became part of the English vocabulary, like karma and guru. Some seekers even found their answer in Christianity—they’re known as the Jesus People. Keith Green, a former hippie who became a Christian through a radical conversion and made several popular Christian music albums, is a good example. Youth With A Mission (YWAM) was also born during this cultural upheaval.
The Youth in Mission
YWAM was founded in 1960, and as the name suggests, it’s a group of young people. It was born from Loren Cunningham’s vision of waves of young people reaching the nations. So one of YWAM’s foundational values is to champion youth. Interestingly, YWAM started just as young people were claiming society’s leadership. Boomers went to college and formed their unique culture in the ’60s. Then their social influence grew. John F. Kennedy’s election, popular among Boomers, signaled change. The Beatles’ popularity showed their growing influence. In the ’70s, they invaded and conquered Hollywood, opening the New Hollywood era. Francis Ford Coppola, slightly older than his Boomer peers, opened doors with The Godfather (1972). Then Martin Scorsese, Steven Spielberg, and George Lucas all achieved great success, securing the Boomer generation’s foothold in Hollywood.
Boomers’ success as young people was possible because of a new economic reality. In an agricultural society, there wasn’t much information to process, and the only way to gain the knowledge needed to lead was to live through many events. So elders led society. In an industrial society, you needed knowledge and organizational skills to lead, so middle-aged people were the leaders. But in the information age, you need creativity and charm to lead (or influence) others, and youth are best equipped for that.
While society rapidly reorganized to give youth space, the church, and especially missions, still wasn’t open to young leaders. If you wanted to be a missionary in the 1960s, you had to finish college, theological seminary, and missionary training, then work in a local church to build support. So even if you decided to become a missionary as a teenager, you’d be ready only in your early thirties. Sending people in their twenties as missionaries sounded unrealistic, even irresponsible. Missionaries required extensive training, partly because of slow transport. When William Carey, the first modern Protestant missionary, went to India in 1793, the journey took him five months. Later, technology shortened the trip, but tradition held that you could send only well-trained missionaries, because sending them was so difficult. Also, once they arrived, communication was difficult; if you sent someone with hidden problems, it was hard to discover the problems and even harder to fix them. In that situation, you took time to train missionaries and sent only the mature.
But as commercial flights became popular, the situation changed. Now it took only a day for a missionary to reach the mission field. New concepts like short-term missions became popular. When it took ten months to travel back and forth, sending anyone short-term didn’t make sense, but when it took two days to come and go, people went on week-long mission trips. And if reaching the mission field was so easy, you could send young people for short trips as missionaries. You didn’t need ten years of training; you could train them for a week and send them out for three. That was YWAM’s beginning. It was just a group of young people who wanted to serve God, going on outreaches to share the gospel. Even today, we try to remain faithful to our calling of championing youth.
The X
While the baby boomers’ counterculture was influential, it was too radical for universal acceptance. Hippies embraced it completely, but the rest of the generation was only influenced by it without full immersion. In the ’90s, however, as another generation grew up, this new attitude (or culture) became the norm for an entire generation. It’s called postmodernism, and the generation behind it is Generation X.
The name “Generation X” originated as a term for the generation following the baby boomers, roughly those born between 1965 and 1980. It was popularized by the Canadian author Douglas Coupland in his 1991 novel Generation X: Tales for an Accelerated Culture. Coupland borrowed the term from Paul Fussell’s 1983 book Class: A Guide Through the American Status System, where “X” represented people who did not want to concern themselves with societal pressures, status, or money. Generation X is the first generation that grew up with computers and home video. They saw communism’s collapse in their youth, so they have no illusions about ideology as the world’s savior. In fact, they don’t trust any system. That’s one reason they’re less political than other generations. Boomers as a generation lean liberal and Gen Z leans conservative, but Generation X isn’t attached to politics. They’re individualistic and skeptical of institutions, which means politics isn’t usually their favorite career choice.
But Gen Xers are excellent visual artists, especially filmmakers. Through home videos, they acquired visual literacy. They’re good at mixing different visual styles into unique blends. Quentin Tarantino (a bit older than Gen Xers but sharing their artistic taste), David Fincher, Paul Thomas Anderson, and Wes Anderson are good examples. The Matrix is a quintessential Gen X film because it encapsulates the generation’s skepticism, its disillusionment, and its anxiety about technology, corporate control, and the authenticity of modern life. Released in 1999, when most Gen Xers were entering adulthood, it resonated with many of their core concerns and cultural experiences.
Generation X also influenced the church significantly. Their focus is on authenticity, not intellectual coherence, so they try to win people not through apologetics but through genuine friendship. Myron Penner, who wrote The End of Apologetics: Christian Witness in a Postmodern Context, represents the typical Gen X Christian attitude. Many older Christians see this as a problem. They think Christianity is an intellectual religion and that our job is to “think logically” so we can explain Christian truth well to others. But Generation X feels that even if you’re intellectually consistent, if you don’t practice Christ’s love and kindness, you’re failing to represent Christ.
The End of the Old Order
Even though some Christians believe postmodern Christianity betrays true faith, postmodern Christians think traditional Christianity is simply a version popular in the male-dominated West. In today’s world, where people and ideas travel freely, that version is impossible to defend.
I heard someone share their experience at a YWAM base. This was before iPhones, so everyone watched TV together, leading to many conflicts among people wanting different channels. After a long discussion, the base leadership, which was all-male, made an announcement: “From now on, we’ll use the TV only for news and sports.” For them, it was the best decision because news and sports are essential, and watching only those eliminates fighting. But they were blinded by gender bias. Men are fact-oriented, so they like news. They’re also competitive, so they like sports. But women are often more interested in what happens in relationships, so they tend to prefer TV shows to news or sports. From a male perspective, watching TV shows is private life, while watching news and sports is public affairs. This shows that what’s called “objective” is often only objective from a male perspective.
As much as the male perspective, the Western perspective shaped traditional Christianity. In the past, the West effectively was the whole world, but now, with the increasing influence of the non-Western world, it’s difficult to say everyone should think like Westerners. There are several differences between East and West, but communication is the area where the difference is clearest. In the West, clear communication is valued. In the East, communication is supposed to be subtle and indirect; direct communication is seen as immature, even selfish.
In Mikhail Bulgakov’s novel The Master and Margarita, there’s a scene showing how indirect communication works. Pontius Pilate orders his chief of secret police, Aphranius, to protect Judas of Kiriath. But instead of protecting him, Aphranius kills him. When Aphranius reports that Judas has been murdered, Pilate gives him money for his efforts. If you’re only good at direct communication, this scene is a mystery: Pilate tells Aphranius to protect Judas instead of killing him, Aphranius kills Judas against his master’s order, and then Pilate rewards Aphranius despite his failure. However, if you’re good at indirect communication, what happens is clear. Pilate clearly wanted Judas dead but, as a Roman officer, couldn’t say so. So he told his trusted subordinate to protect Judas, trusting him to understand his true intention. Aphranius, instead of focusing on what Pilate said, focused on that intention. There was no reason for a Roman governor to protect an ordinary Jewish man. Besides, he could see that Pilate was upset about executing Jesus, meaning he was angry with Judas. So “protect Judas” could only mean “kill Judas.” This ability to send and receive hidden messages is called nunchi in Korean. Nunchi is highly valued, and Koreans say, “If you have nunchi, you can eat salted shrimp at a Buddhist temple.” A Buddhist temple never serves non-vegetarian food like salted shrimp, but with nunchi, you can gain what seems impossible. As Euny Hong, author of The Power of Nunchi, says, nunchi is a Korean superpower that enabled Korea’s rapid economic development.
Indirect communication is embedded in Buddhism. The Flower Sermon recounts an episode in Buddha’s life. Buddha once gathered his disciples for a sermon beside a pond. Instead of speaking, he silently held up a single lotus flower. The crowd remained puzzled, but Mahakasyapa smiled quietly—the only one to grasp Buddha’s unspoken teaching. That’s how Buddha decided to pass his secret true teaching to him. Once again, it doesn’t make much sense to us. We’re as perplexed as Buddha’s disciples at the scene. And that’s the point: Buddhism’s true teaching is not for everyone. It’s hidden and can be passed only to those who have eyes to see and ears to hear. From this hidden tradition, Zen Buddhism was allegedly born. It’s said to be a religion without words, without explanation, without instructions, and without knowledge. For Westerners, it’s hard to imagine such a religion: how would you understand it, let alone spread it? But that’s exactly why it’s gaining popularity in the West today. Just as a mysterious person is attractive, a religion without explanation attracts those fed up with clear, direct communication.
Being and Nothingness
One big difference between the Eastern and Western worldviews is the concept of existence. In the West, existence is the center of philosophy. And in Western thinking, existence is like a billiard ball—perfectly spherical and almost unbreakable. In the East, non-existence is important, and reality is like air. Think of a room. What’s in it? Most people mention the furniture, the items on the table, the pictures on the walls, and maybe the people. But by doing so, they leave 90% of the room’s reality unmentioned: they never mention the air. Air is so inconspicuous that we don’t even think it exists. But without air, we die. It’s an important part of reality that we routinely overlook. In the East, the concept of chi is important. Chi is an invisible field that reflects and shapes reality. Probably the best way to explain it is to liken it to the Force in Star Wars. Both are metaphysical energies that permeate and connect all living things, influencing life, health, and spiritual power. Just as “the Force is strong with this one” in Star Wars, some people have strong chi. Just as Jedi control the Force to fight, martial arts masters control chi to fight. So if atomic theory is the ultimate expression of the Western worldview, field theory is the ultimate expression of the Eastern worldview. Unlike atoms, fields are invisible yet interact with matter and energy.
In European languages, people are called human beings (French être humain, German menschliches Wesen), reflecting the concept of being, which has been understood as tangible and unchanging. In Chinese characters, a human is written as “human between” (人間), because the empty space between people is what makes a human being human. Once again, we see the different emphasis: in the West, the focus is on tangible, physical reality, while in the East, the focus is on the intangible, empty space.
This difference in worldview gave birth to different art styles. In the West, a painting is supposed to be full of color. If a painting isn’t covered with color, it’s either a mere sketch or an unfinished work. In the East, paintings aren’t usually full of color. Rather, they should contain much empty space. In the East, there’s a concept of “the beauty of white space”—by not filling the picture with color, you create beautiful spaces. In Japan, this concept of negative space is expressed in literature as haiku. A haiku has three lines with a syllable pattern of 5-7-5, for a total of 17 syllables. Here is an example by Matsuo Basho.
An old silent pond… / A frog jumps into the pond, / splash! Silence again
It is extremely short, but it contains everything needed to communicate the mood. It fits the Eastern worldview with its focus on empty space.
Postmodern Christianity
As the world enters the postmodern phase, Christians wonder how to express the gospel’s reality in this world. In the past, being a Christian meant intellectual agreement with doctrines. It meant leaving certain ideas behind and accepting different ones. That’s why C.S. Lewis famously described his own conversion to the Christian faith as deeply unenthusiastic, even calling himself “the most dejected and reluctant convert in all England.” In modernism, if you realized Christian ideas were intellectually more powerful than the alternatives, you were supposed to accept Christianity whether you wanted to or not, just as conquered nations accepted their victors’ gods in the ancient world. Lewis himself advanced such an argument for faith, his famous trilemma: Jesus must have been either out of his mind, intentionally deceiving, or exactly who he claimed to be, the Son of God. Of course, it’s difficult to imagine Jesus was either a lunatic or a liar, so it makes the most sense to believe him as the Son of God.
But today, people don’t want to hear arguments for faith. In this environment, presenting Aquinas’ Five Ways to prove God’s existence or Kant’s moral argument for God isn’t effective. An argument is meaningless if people don’t listen. Sharing the gospel, then, must mean something different today. In the past, sharing the gospel meant spreading information about Christianity. We as Christians knew something other people didn’t, so if we told them, they’d accept the Christian faith. But today, in this world of information overload, spreading information isn’t welcomed by most people.
Still, we can communicate the gospel by sharing our lives. We can testify to what God has done in us. We can be the expression of God’s grace on earth. Even when people don’t listen to arguments, they’ll listen to our stories. Jesus said, “You will be my witnesses” (Acts 1:8), and we can testify to what he has done. We can also provide a community. One of modern life’s worst aspects, especially in city life, is loneliness. In this wired world, where you can connect with anyone on earth for free, people still feel lonely and disconnected. But we, as Jesus’ followers, live in community. It’s a miracle that proves a loving God exists. Jesus said, “By this all people will know that you are my disciples, if you have love for one another” (John 13:35). People won’t know God through our arguments but through our love for each other.
Many Christians worry that postmodernism will destroy Christianity. They’re afraid the theological system we’ve been building for centuries, if not millennia, will be abandoned. But I believe it’s a great opportunity to live our faith authentically. When we practice what we believe, we’ll see God move in people’s lives. The times change, but God’s truth doesn’t change.