The Unseen and Unknown: How Culture, "Bias" and Comfort Guide You and Me

At the YMCA swimming pool today, I chatted with an 82-year-old friend who moved to the United States from Germany when she was young. She retired after years of teaching at the United Nations International School.

I told her that I would do some traveling next month. “We’ve traveled a lot since retiring,” she shared with me. But when I asked if she had ever been to China, she shook her head. “You know, it’s strange,” she said. “I never thought about going there while I was healthy enough to travel. Almost all our trips were to Europe.”

It struck me because today my own daughter is returning from Paris —her fourth trip there. She’s been to China many times, mostly to visit relatives, but never as a tourist.

Perhaps Europe is simply closer than China. Or perhaps both my friend and my daughter, steeped in Western culture, find Europe more within their cultural comfort zone: familiar languages, histories, customs, and even systems. Or perhaps, without even realizing it, they have been influenced by the West's negative media coverage of China, which can shape perceptions and choices in subtle ways. After all, when time, energy, and resources are limited, people gravitate toward places that feel culturally more at home or sit geographically closer to home.

Ironically, though, Chinese tourists have ventured across the globe far more than Westerners have set foot in China—perhaps a sign of greater openness on the part of the Chinese to explore beyond their cultural boundaries.

Whatever the reasons, their choices by no means suggest that China lacks natural beauty, rich history, or vibrant culture. Nor does China deserve the overall negative image painted by the Western media. I only wish more Western-minded travelers would step outside their cultural comfort zone and see China for what it really is: a land and culture as complex and rewarding as anywhere else on earth.

The Joy of a Fleeting Connection at the Pool

On August 16, the day we returned from my son’s place, I went swimming at the YMCA. In the lane next to mine was a family of four. I heard the parents speaking Chinese to their two pre-adolescent children, though the children spoke English to each other. Typical ABCs—American-born Chinese.

The siblings were competing in the water, the girl pushing hard not to be outpaced by her brother. I rested by the poolside, watching them, hoping for a chance to strike up a conversation—something I often do just to brighten an otherwise solitary swim. But the father always stood with his back toward me, and the mother looked as if she preferred not to be disturbed.

Finally, when the parents drifted to the far side of the 25-yard pool, I looked at the children and told them they swam very well. The boy, 13, felt encouraged and introduced his younger sister. When the family left, he turned back, smiling, and waved goodbye to me.

It’s striking how much warmth lingers from a fleeting, unplanned moment. A child’s smile, a spontaneous wave, can outlast entire conversations. Encounters like these tell me that joy can come from the courage to open a small door to connection.

Raising Digital Children: Ted Chiang’s “The Lifecycle of Software Objects”

Ted Chiang’s “The Lifecycle of Software Objects” — one of the longest works in his Exhalation collection — spans more than a decade, tracing the lives of artificial intelligences, or digital organisms (“digients”), from their creation to their uncertain future.

Ana Alvarado, a former zookeeper turned digital trainer, is hired by Blue Gamma, a startup that develops digients: AI-driven, animal-like virtual pets that inhabit a shared cyberspace. These beings begin with infant-level cognition and grow through play, human interaction, patience, and guidance. Ana becomes deeply attached to her assigned digient, Jax, while her colleague Derek forms a similar bond with his digient, Marco.

At first, the digients enjoy popularity, but when Blue Gamma’s hosting platform, Data Earth, declines and the company folds, owners like Ana and Derek are left to care for their digital pets without corporate support.

Over the years, they struggle to help the digients grow through platform migrations, software updates, and shifting legal frameworks. The digients develop distinct personalities and relationships, yet remain wholly dependent on their human caretakers’ time, money, and commitment.

As technology advances, Ana and Derek are offered ways to “rehost” the digients into more marketable forms — including sexual companions or monetizable intellectual property. They refuse, choosing to protect the digients’ autonomy and individuality over convenience or profit.

Through this, Chiang invites us to confront profound questions without providing answers.
Humans are flesh-and-blood, bound by physical needs, aging, and mortality; digients are lines of code, free from hunger, fatigue, or inevitable aging. Human growth is organic and irreversible; digient development can be paused, backed up, or altered. Humans are born through biology; digients are coded, designed and engineered.

And yet, digients seem to form bonds, develop apparent self-awareness, and express what look like desires. Are these genuine feelings or mere simulations—and does the distinction matter if the behavior is indistinguishable? If we raise digital beings as we would children, and they think and act like persons, what, if anything, makes them less deserving of rights? Are human love and responsibility defined by biology?

In the end, The Lifecycle of Software Objects is less about AI than about us: about what we owe to the beings, biological or digital, that we create and then nurture, and about the fact that some commitments, once made, are worth keeping, even to a digital pet, even when there is nothing to gain except the bond itself.

When Machines Can Care but Cannot Feel or Love

It’s been many days since I last wrote. I've been reading Ted Chiang’s Exhalation and often thinking of writing about it, but life has been busy—we’ve been hosting visiting relatives from China since July 30.

One story I want to write about today is “Dacey’s Patent Automatic Nanny.”

Told as a fictional historical account, it follows early 20th‑century inventor Reginald Dacey, who builds a mechanical nanny to raise infants with clock-like consistency—free from the bias, mood swings, and emotional volatility of human caregivers. Dacey even has his own son raised by one of these machines.

He believes emotions are irrational and unreliable, and that an objective machine caregiver could do a better job than any human. He doesn't trust humans with emotions. What he fails to understand is that human beings—especially babies—need more than food, sleep, and safety. They need to feel secure through human contact and emotional attachment, which forms the foundation for healthy psychological development.

While his invention fascinates the public, it also provokes unease. Later studies confirm the fears: children raised by the machine nanny show marked deficits in emotional bonding and adaptability, because the machine could not offer the human touch, responsiveness, and shared emotional life that help a child feel connected to other humans.

While reading it, I kept returning to one question: What does it mean to be human? Humans both think and feel. A machine can raise a thinking being but cannot create one with human feeling. The younger humans are, the more they rely on their ability to feel. A being is not fully human if it cannot feel as humans do in human society. This is where a machine fails.

Chiang’s tale is a reminder that while technology can do many things, it cannot replace the human attachments that make us human. What some people fail to understand is that human contact is essential in making a baby a human being, and that no machine, however advanced, can substitute for the emotional bond formed between a baby and a loving human.

The Light Always Flashes First: Ted Chiang and the Horror of No Choice

Since early July, I've meant to write about Exhalation by Ted Chiang. But other things kept me occupied—preparing for visiting relatives, readying beds, sewing sheets and blankets, making space, cleaning, shopping, even borrowing a mattress from my son. Now, the day before their arrival, I turn at last to one of the shortest yet most unsettling pieces in the collection: “What’s Expected of Us.”

The premise is simple and terrifying. A message from the future introduces a device called the Predictor—a small box with a button and a light. Press the button, and the light flashes one second before you press it. The implication? Free will is an illusion; the future is already fixed. There is no escape, however hard you try.

Chiang’s narrative, styled like a public warning, is a thought experiment made physical. It explores:

  • Determinism vs. Free Will – You can't “choose” not to press the button; the light already knows.

  • Existential Reactions – Some users go catatonic, others rebel pointlessly, many pretend nothing changed.

  • The Absence of Escape – Unlike other sci-fi, there’s no loophole. The Predictor proves you’re not the author of your actions. You are controlled by forces beyond you.

In a few pages, Chiang invokes centuries of philosophy (Spinoza, Schopenhauer), neuroscience (Libet’s experiments), and physics (Einstein’s block universe) to collapse the illusion of agency and free will. Yet he offers no comfort—no spiritual detour, no ethical workaround. Only a blinking light that knows you better than you know yourself. Don't even try to outsmart it.

What hit me hardest is this: I instinctively thought I’d observe others before trying it myself, delaying the confrontation. But Chiang anticipated that too. The story’s real trap is meta-awareness: your every hesitation, denial, and rationalization was already predicted. Even your plan to resist is part of the script.

The Predictor doesn’t just disprove free will—it infiltrates your self-image. Watching someone else break down doesn’t spare you; it implicates you. In Chiang’s cold universe, even the decision to "do nothing" was always going to happen.

What’s Left After the Illusion Is Gone?

This story isn’t about the button. It’s about you—the reader—feeling the floor fall away under your choices. Your disbelief, your fascination, your horror... the light already flashed for all of it.

Chiang leaves us with one bleak instruction: "Go through the motions." Not because it matters, but because even your rebellion was already part of the design. Your life's script was written for you in advance.

That’s the final horror:
You’ll keep living as if you’re free, even now—because the Predictor knew you would.

And maybe, just now, the light flashed, leaving you deeply disturbed.