A few days ago, I met a Chinese woman at the YMCA who was seven years my senior and recently retired. I shared with her that I had stopped working at the end of 2016, no longer "bending for five bushels of grain" (为五斗米折腰, wèi wǔ dǒu mǐ zhé yāo), an idiom for compromising one's integrity or values for a meager salary. She asked, "What did you do after retirement?" Truthfully, not much; after all, the pandemic started in 2020.
Back then, I had ambitious plans, yet, surprisingly, I accomplished few items on my list and instead spent a lot of time on things I hadn’t planned for. Writing wasn’t even on my radar, as I assumed it was for those who had nothing else to do. I wanted to enrich my life with experiences worth writing about, to fill my days with action and discovery.
Of course, some unplanned activities crept in: playing games and binge-watching short videos on social media, often under the guise of “relaxation.”
Turning to some recent news, Intel has issued an ultimatum to its employees: choose severance or brace for layoffs. Once a powerhouse in semiconductors, Intel now finds itself excluded from the Dow Jones Industrial Average. This night-and-day reversal is a story of missed opportunities and shifts in industry focus.
Consider Intel's journey alongside NVIDIA’s rise. Intel’s CPUs have always been the “brains” of computers, while NVIDIA’s GPUs were initially crafted for graphics rendering. CPUs held a must-have status, whereas GPUs, once considered optional, gained increasing relevance in specialized tasks.
In commercial value, CPUs historically far outshone GPUs, but shifts in demand began to level the playing field. As Bitcoin mining, 3D gaming, and film production took off, NVIDIA's GPUs surged in popularity. Intel, however, disregarded these niches as "small patches of the market," missing the opportunity to capitalize on them.
The lesson learned here? Industry giants often falter not due to a lack of technical prowess but rather the opposite: their dominant strength leads them to overlook small signs of emerging trends, over-confident in their invincibility. For instance, Intel dismissed the potential of smartphones, neglecting that market entirely. Yet as demand grew, those once-overlooked markets expanded, "non-mainstream" products became central, and Intel's dominance eroded. This shift didn't happen overnight; it was the result of gradual changes rather than sudden breakthroughs.
I wonder what Intel's visionary founders—Gordon Moore, Robert Noyce, and Andy Grove—would think of this transformation. Michael S. Malone's book, The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World's Most Important Company (2014), delves into the legacy of these pioneers. Though Intel's CPUs remain among the best in the world, the question lingers: can Intel reclaim its former glory?