Forget tablets and economic woes. “Good enough” computer performance might be the real reason for lackluster PC sales.
While rumors of the PC’s demise are greatly exaggerated (an industry that moved more than 350 million units in 2012 is not “dead”), computers undoubtedly aren’t selling as quickly as they once did. Analysts forecast PC sales to far exceed tablet sales for the foreseeable future, but the growth rate for PC sales has flatlined.
The big question, of course, is why?
Several theories inform conventional wisdom. Most pundits blame stagnant PC sales on the likewise stagnant economy, or point toward the ascension of smartphones and tablets. Others argue (fairly persuasively) that the flattening of growth is attributable to the idiosyncrasies of PC sales in developing countries, where computers are a rarely replaced luxury item. A second wave of purchases, analysts say, has yet to come after an initial surge in sales in those nations.
Like most economic sectors, the PC market is influenced by myriad factors, and some truth lies in all three of those explanations. After watching my mother-in-law happily troll Facebook and sling emails on her nearly ten-year-old Pentium 4 computer, however, an even more insidious possibility slipped into my head.
Did CPU performance reach a “good enough” level for mainstream users some years back? Are older computers still potent enough to complete an average Joe’s everyday tasks, reducing the incentive to upgrade?
“It used to be you had to replace your PC every few years or you were way behind. If you didn’t, you couldn’t even run the latest software,” says Linley Gwennap, the principal analyst at the Linley Group, a research firm that focuses on semiconductors and processors. “Now you can hold onto your PC five, six, seven years with no problem. Yeah, it might be a little slow, but not enough to really show up [in everyday use].”
Old processors are still OK for everyday use
This may come as a shock to performance-pushing PC enthusiasts, but the average Joe almost never encodes videos, nor will you catch him fragging fools in Crysis 3. Instead, he spends most of his time on mundane, often Web-centric tasks: buying stuff online, sending emails, engaging friends and family on social media, maybe watching the occasional YouTube video (at default resolutions, natch, not high-definition) or playing a few hands of Solitaire.
In other words, hardly the kind of activity that begs for an overclocked, water-cooled, hyper-threaded Core i7 processor. Or even a modern-day Ivy Bridge Core i3 processor, if we’re being honest.
“If you’re just doing Web browsing, using a few spreadsheets here, a little bit of word processing there, you’re not going to notice the difference between [older] 2.5GHz and [newer] 3GHz processors,” Gwennap says.
My mother-in-law’s decade-old Pentium 4 PC chugged a bit (especially to my performance-oriented eye), but it held up fine for basic Web use and standard-definition video watching. What’s more, the need for cutting-edge silicon could drop even further as more and more tasks that once required beefy computers transition to off-site cloud servers. Witness the Web-based Pixlr image editor and Nvidia’s audacious GeForce Grid initiative, as well as the multitude of streaming video services. Indeed, Chromebooks are getting popular for a reason.
Intel’s Core 2 Duo and Quad chips hit the streets way back in 2006, and they still perform well even if you’re pushing your PC beyond basic Web-based tasks. Gamers can still play most modern titles (like Borderlands 2 and Skyrim) at solid detail settings and HD resolutions on Core 2-based computers. Fairly recent testing by Tom’s Hardware and OCAholic shows that Core 2 processors compare decently against more current AMD processors and midrange Intel Core chips. Older AMD chips, such as 2009’s 3.4GHz AMD Phenom II X4 965 Black Edition, still have game as well, according to happy Newegg customers.
There’s a reason for that, Gwennap says. Moore’s Law, at least as we commonly invoke it, has morphed into Moore’s Kinda Debunked Theory in recent CPU generations.
“I think we’ve been falling behind Moore’s Law ever since Intel hit the power wall back in 2005,” Gwennap said in a phone interview. “At that point, power really became the limiting factor, not the transistor side.” The performance improvements slowed even more dramatically after Intel released the Nehalem architecture in late 2008.
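The power wall Gwennap describes boils down to basic chip physics: a CMOS processor’s switching power rises with its capacitance, the square of its voltage, and its clock speed, so cranking clocks ever higher blew past any sane power budget. Here’s a back-of-the-envelope sketch; the capacitance and voltage figures are illustrative assumptions, not specs for any real chip:

```python
# Back-of-the-envelope look at the power wall, using the standard
# CMOS dynamic-power relation: P = C * V^2 * f.
# The capacitance and voltage figures below are illustrative guesses,
# not measurements from any actual processor.

def dynamic_power(capacitance_farads, voltage, freq_hz):
    """Approximate switching power in watts: P = C * V^2 * f."""
    return capacitance_farads * voltage ** 2 * freq_hz

# A hypothetical mid-2000s core: 13 nF of switched capacitance, 1.3 V, 3 GHz.
base = dynamic_power(13e-9, 1.3, 3.0e9)

# Push the clock 50 percent higher. Faster switching also tends to
# demand more voltage, and voltage is squared in the equation.
hot = dynamic_power(13e-9, 1.45, 4.5e9)

print(f"baseline:    {base:.0f} W")
print(f"overclocked: {hot:.0f} W ({hot / base:.2f}x the power for 1.5x the clock)")
```

Nearly double the power for half again the clock speed is exactly the trade-off chip makers walked away from.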
Moore’s Law slams into the (power) wall
Before we dive too deep, a quick primer is in order. Moore’s Law takes its name from Intel co-founder Gordon Moore, who observed in 1965 that the number of transistors on integrated circuits kept doubling at a regular clip, a pace he later pegged at every two years. Most people use a modified version of the term, attributed to Intel executive David House, which claims that computing power doubles every 18 months.
The letter of Moore’s Law technically still holds true. It’s the intent of Moore’s Law (as verbalized by House) that’s lagging.
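In rough numbers, here’s how far the letter and the intent drift apart over time. A quick sketch, with the doubling periods taken straight from the two formulations above and the year counts chosen arbitrarily:

```python
# The two flavors of "Moore's Law," side by side.
# The letter (Moore's revised observation): transistors double every 2 years.
# The intent (House's corollary): performance doubles every 18 months.

def growth(years, doubling_period_years):
    """Multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

for years in (2, 6, 10):
    print(f"after {years:2d} years: transistors x{growth(years, 2.0):>5.1f}, "
          f"performance x{growth(years, 1.5):>5.1f}")
```

After a decade, the letter promises 32 times the transistors while the intent promises roughly 100 times the performance, which is why the two versions can both be “true” and still tell very different stories.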
“[Intel’s] performance growth has slowed to a crawl,” Gwennap wrote in a Microprocessor Report column in December 2012. “…Even counting a modest boost for the new Sandy Bridge CPU, performance is increasing at just 10 percent per year for desktops and 16 percent for laptops [between 2009 and 2012], a far cry from the good old days of 60 percent annual performance increases.”
In other words, newer processors are no longer head-and-shoulders better than their predecessors. For Average Joe, who primarily works inside Facebook, email and iTunes, the everyday difference between an older Core 2 processor and a modern Core processor is negligible, no matter what the benchmarks say.
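Compound those rates and Gwennap’s complaint gets concrete: the wait for a true doubling in performance stretches from under two years to most of a decade. A quick calculation, using the growth rates from his column quoted above:

```python
import math

# How long performance takes to double at a steady annual growth rate,
# using the rates from Gwennap's Microprocessor Report column:
# 10 percent (desktops) and 16 percent (laptops), versus the
# "good old days" of 60 percent per year.

def years_to_double(annual_rate):
    return math.log(2) / math.log(1 + annual_rate)

for label, rate in (("desktops, 10%/yr", 0.10),
                    ("laptops, 16%/yr", 0.16),
                    ("good old days, 60%/yr", 0.60)):
    print(f"{label}: performance doubles in {years_to_double(rate):.1f} years")
```

That works out to roughly 7.3 years per doubling for desktops and 4.7 for laptops, versus about a year and a half at the old 60 percent pace.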
“I absolutely think the slowdown in computing performance gains play a big factor [in slowing PC sales],” Gwennap told PCWorld. “Maybe even more so than the whole tablet thing. Why would you replace your PC if it’s not noticeably faster than what you bought two or three years ago?”
CPU performance takes a backseat
The “CPUs are good enough” schtick is nothing if not contentious, however.
“I’ve been here 20 years and people were saying Windows 3.1, a 60 MHz Pentium, and 1MB of RAM were ‘good enough’ way back in the 90s,” Intel PR manager Dan Snyder told PCWorld via email. And he’s totally, utterly right. The “good enough” meme has been around forever. (Remember the myth about Bill Gates saying that 640KB of memory ought to be enough for anybody?)
Here’s the thing, though: Snyder went on to list several examples of Intel’s latest technology endeavors (tablet system-on-chip processors, Android-friendly processors, enhanced onboard graphics), and while all of them are highly intriguing in their own right, none involve pushing pure CPU performance. (And how could they, given the power-wall limitations?)
Instead, modern-day CPUs have focused more on introducing value-adding extras to augment the incremental year-to-year computing-performance improvements. Onboard graphics have improved tremendously over the past few years, most notably in AMD’s accelerated processing units (APUs) and the HD Graphics 4000 visuals baked into some of Intel’s Ivy Bridge chips. In fact, integrated graphics have reached the point where they can deliver fairly smooth gameplay experiences if you’re willing to dial down your detail settings.
Reducing energy consumption is another focus for chip makers, and not just to enhance battery life in tablets and notebooks. The energy and graphics gains introduced in modern processors help compensate for the merely incremental CPU gains.
“Moore’s Law was always about the cost of the transistors just as much as it was about performance increasing, as you could afford more and more of them,” Gary Silcott, the senior PR manager for AMD’s APU and CPU products, said in an email. “As the physical limits of the materials are stretched, and the cost[s] of the factories rise, at some point the cost of the transistors requires you to raise performance and extend battery life in the design itself. That’s why AMD moved to heterogeneous computing with our APU architectures. By combining different processing engines [such as graphics processors] on the same system on a chip, you can address a much broader range of workloads, with GFLOPs of compute power in a very small area of silicon and with very little power.”
Does that mean what it sounds like?
“Absolutely,” he said when I asked him whether AMD planned to focus the bulk of its processor development on improving energy efficiency and integrated graphical capabilities, rather than fixating on sheer CPU performance. “The question gets to the heart of everything we have been talking about.”
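Silcott’s “GFLOPs of compute power in a very small area” pitch is easy to sanity-check with peak-throughput arithmetic: multiply execution units by clock speed by the floating-point operations each unit can issue per cycle. A rough sketch using illustrative round numbers, not the specs of any particular AMD or Intel part:

```python
# Peak-throughput arithmetic behind the APU pitch. All figures are
# illustrative round numbers, not the specs of any real chip.

def peak_gflops(units, clock_ghz, flops_per_unit_per_cycle):
    """Theoretical peak GFLOPS: units x clock x FLOPs issued per cycle."""
    return units * clock_ghz * flops_per_unit_per_cycle

# A few fat CPU cores with SIMD units...
cpu = peak_gflops(units=4, clock_ghz=3.0, flops_per_unit_per_cycle=8)
# ...versus many simple integrated-GPU shader ALUs at a lower clock.
igpu = peak_gflops(units=128, clock_ghz=0.8, flops_per_unit_per_cycle=2)

print(f"CPU peak:            {cpu:.0f} GFLOPS")
print(f"integrated GPU peak: {igpu:.0f} GFLOPS")
```

On paper, the swarm of slow, simple graphics units out-muscles the handful of fast cores, which is the whole argument for pointing data-parallel work at the GPU side of the chip.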
A unified vision, it seems, may just be the future of processors. Last year, AMD, Qualcomm, ARM, Samsung, Texas Instruments, and other leading chip makers created the Heterogeneous System Architecture Foundation to “drive a single architecture to overcome the programming limitations of today’s CPUs and GPUs.” Rather than knocking down the power wall, the HSA Foundation hopes to skirt around it with parallel computing.
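The catch with skirting the power wall through parallelism is Amdahl’s law: only the parallel share of a workload speeds up, and the serial remainder caps the gains no matter how many engines you throw at the job. A quick illustration:

```python
# Amdahl's law: if only a fraction of a workload parallelizes, the
# serial remainder limits the speedup regardless of engine count.

def amdahl_speedup(parallel_fraction, n_units):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

for frac in (0.50, 0.90, 0.99):
    print(f"{frac:.0%} parallel, 128 engines: "
          f"{amdahl_speedup(frac, 128):.1f}x speedup")
```

Even with 128 engines, a half-parallel task barely doubles in speed, which is why the HSA Foundation’s programming-model work matters as much as the silicon.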
The silver lining
Even though pure CPU performance isn’t accelerating fast enough to encourage recurring PC sales, the HSA Foundation’s work points to a bright future for average Joes and embittered hard-core video encoders alike. And even if that vision falters in the details (Nvidia and Intel are notably missing from the group), industry leaders are hard at work to advance the cause of the CPU itself.
Both AMD and Intel invest heavily in R&D to stay on the bleeding edge of technology. In particular, Intel has a mind-boggling $18.2 billion (billion!) earmarked for research and acquisitions in 2013 alone, with plans to move to bigger CPU wafers and new lithography technologies that will enable the company to create ever-smaller transistors in the coming years. (Ivy Bridge’s 22nm process is just the beginning.)
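The payoff from those investments is easy to rough out, since wafer area grows with the square of the diameter and transistor density with the inverse square of the feature size. A back-of-the-envelope sketch, assuming the widely reported 300mm-to-450mm wafer transition and a 14nm follow-on to the 22nm node:

```python
# Rough scaling math for bigger wafers and smaller transistors.
# Assumes the widely reported 300mm-to-450mm wafer transition and a
# 22nm-to-14nm node shrink; real-world yields won't scale this cleanly.

wafer_area_gain = (450 / 300) ** 2   # area grows with diameter squared
density_gain = (22 / 14) ** 2        # idealized transistor-density scaling

print(f"450mm vs 300mm wafer area:        {wafer_area_gain:.2f}x")
print(f"14nm vs 22nm density (idealized): {density_gain:.1f}x")
print(f"combined silicon-per-wafer gain:  {wafer_area_gain * density_gain:.1f}x")
```

Even with generous rounding, that’s several times more transistors per wafer, which is the economic engine behind all those billions.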
Meanwhile, Intel’s push toward perceptual computing (gesture controls, speech recognition, and so on) not only advances traditional interface models, but the technologies involved also require serious computing heft. Sneaky, sneaky.
Finally, the temporary détente on emphasizing computing performance at any cost is actually a good thing for the PC industry, much as it pains my hard-core-geek heart to say it. With their backs against the power wall, Intel and AMD have been free to innovate in other technological areas, allowing them to introduce changes that are altering the very concept of computers as we know them.
“The lines are increasingly blurring between mobile devices, with Ultrabooks, tablets, and convertibles with touch,” Intel’s Snyder said, and he is correct yet again. If the company hadn’t been able to focus its efforts on power efficiency and graphical fortitude, would a paradigm-shattering device like Microsoft’s Surface Pro tablet even be around today? I’d wager not.
The launch of Intel’s next-gen Haswell chips should herald a time of thin, fanless tablets-slash-laptops with full computational chops and all-day battery life. AMD’s next-gen APUs and newly unveiled Turbo Dock technology promise the same ubiquitous hybrid-style potential, complete with support for 3D gaming.
That way lies the future. The absence of skyrocketing performance advancements has undoubtedly left many people clinging to older PCs long beyond the traditional upgrade time frame, but the lull has also opened doors that would have been left closed if AMD and Intel had kept the pedal to the CPU’s metal. Consider the power wall and Moore’s Kinda Debunked Theory a temporary regrouping and refocusing–not a death knell.