The differences that silicon can make to the desktop
What's under the bonnet
The sleek laptops, all-in-one and small form factor PCs of today have changed beyond all recognition compared with the deskbound, utilitarian behemoths of even a decade ago. Much of this change is down to the evolution and integration of the internal components of the PC, enabled by advances in manufacturing processes that have delivered a huge increase in the number of transistors on individual chips.
Functions that used to be on individual and separate chips, such as the CPU, the graphics chip and controllers for memory and I/O, have been brought together to reside on fewer physical chips. The latest generations of chips are moving towards a single chip containing all the essential functions necessary to build a PC, and are often termed a “System on a Chip” or SoC.
Integrating all these components has had a number of positive effects. Multiple CPU cores can increase system responsiveness and performance when running several applications at once. Performance has also been boosted by integration itself: moving components such as the GPU, memory controller and I/O closer together allows communication to take place directly on the die, rather than having to cross chip boundaries and busses. Reducing off-chip communication, often one of the biggest consumers of energy, cuts power consumption, and because the integrated components are designed together they can be better optimised, improving power efficiency further.
The great thing is that this performance increase comes essentially for free, without requiring additional operating system or application development effort. The functions are compatible with what has gone before, but work much better. This allows the latest operating systems and business applications to run well, while also supporting many of the media and entertainment applications that users often wish to use, even on corporate kit. Another very tangible side-effect of increasing integration is the ability to build thinner, smaller and lighter PCs that make less noise and run much cooler. Better performance and nicer looks without more effort: just the things that end-users are typically most vocal about when unhappy.
The other path that designers have followed is to add new functions and capabilities to the chips. These may include instructions to accelerate security functions such as encryption, or specialised co-processors that can help with management functionality. They may also include modifications that allow the graphics processor (GPU) to act as a co-processor to speed up certain high-performance applications. This approach has a lot of potential to add value.
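As an illustration of such added instructions, software can discover whether features like hardware-accelerated encryption (AES-NI) are present before choosing a code path. The sketch below is hypothetical: on Linux the kernel advertises CPU features in /proc/cpuinfo, and the `cpu_flags` helper and sample text here are invented for this example so it runs anywhere.

```python
# Minimal sketch: detect CPU feature flags such as "aes" (AES-NI).
# SAMPLE_CPUINFO stands in for the real file, which on Linux could be
# read with open("/proc/cpuinfo").read() instead.

SAMPLE_CPUINFO = """\
processor : 0
vendor_id : GenuineIntel
flags : fpu vme de pse tsc msr pae aes avx sse4_1 sse4_2
"""

def cpu_flags(cpuinfo_text):
    """Return the set of feature flags from /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags(SAMPLE_CPUINFO)
print("aes" in flags)  # True for the sample above
```

An application might use such a check to fall back to a software crypto implementation when the accelerated instructions are absent.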
Some of the new features are easy to take advantage of, but in many cases they require significant additional investment in tools and software development before the benefits can be realised. A good example is the move to 64-bit computing: AMD introduced 64-bit extensions to the x86 architecture in 2003, and these required modifications to both the operating system and applications in order to be used.
Taking advantage of the 64-bit functions has required moving to a different version of the operating system and applications, which is a big undertaking. For example, Adobe Flash, which is very widely used for web applications and video content, does not yet support 64-bit browsers. The result is a slow migration, targeted where the limitations of 32-bit have been felt most acutely, such as applications requiring large amounts of memory.
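The 32-bit limitation described above is visible from inside a running program. A minimal sketch (the `is_64bit_process` name is invented here): a 32-bit process sees a 32-bit address space even on a 64-bit OS, and in Python the interpreter's pointer width shows up in `sys.maxsize`.

```python
# Sketch: detect whether this code runs as a 64-bit process.
import platform
import sys

def is_64bit_process():
    # sys.maxsize reflects the pointer width of the running interpreter:
    # larger than 2**32 only when pointers are 64-bit.
    return sys.maxsize > 2**32

print(is_64bit_process(), platform.machine())
```

A 32-bit build of the same application would report False here regardless of the underlying hardware, which is exactly why migrating the software stack, not just the silicon, is the hard part.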
One of the most important extensions is also the most overlooked: the management capabilities being built into business PCs. Ongoing operational costs and user support generally outweigh the initial purchase cost of a PC, often by an order of magnitude. Modern enterprise PCs include capabilities in the silicon platform that enable remote management and patching, OS re-installation, and more effective user support through keyboard, video and mouse (KVM) sharing, even if the operating system cannot boot.
The problem is that few companies have yet started to take advantage of the management features built natively into the silicon. Part of this is due to the lack of visibility of the new features, as they require additional software to realise their value. Many of the leading management solutions now integrate with technologies such as Intel vPro, which can help ease adoption, but arguably these features should come with at least basic tools that allow them to be used out of the box.
Just as big a problem is that awareness of the management capabilities of the silicon is low, coupled with a mix of capabilities within the PC estate. Some PCs, notably consumer models, older PCs and some laptops, will lack the management capabilities entirely, while others will have only a partial set of features and still others the full set.
Without a strategy for utilising this inbuilt management capability, the installed base remains of mixed ability. With substantial improvements in operational costs and risk mitigation potentially up for grabs, it is worth establishing what the minimum standard in hardware management features should be, so that when the time is ripe the installed PC base is ready to participate. ®
Yes and no
A number of the benefits talked about here only occur if the OS and application software support them, particularly graphics acceleration and low-power modes. Very old software (or even new software missing the correct drivers) cannot take advantage of these features.
On the other hand, if you compare the same kind of software now with that of 10 years ago, the newer software demands a LOT more of the silicon just to do the same things the old one did. Compare Office 97 and Office 2007, for example, or Windows 2k and Vista. How much more processing Word needs now, just to show letters on the screen as they are typed at the keyboard! Or webpages that take forever to render because the page source is the size of War and Peace, with 2,000 separately loaded parts and several Flash objects.
The software companies seem to think that the improvement in silicon is not to allow a more responsive system, but to mean that clicking the menu should use 1Gflop of processing for some wacky animation, or that optimisation of their software is not required because people can just buy yet faster systems.
It is a race, and I'm not sure that the hardware is winning compared to the software.
Intel market segmentation defeats management solutions
Management is not going to get much better as long as Intel insists upon artificially disabling features in its chipsets and processors. Intel pursues a market-segmentation strategy, which makes it impossible to procure systems that implement the silicon-aided management and security features. I have been sifting through the Intel product database for weeks, trying to figure out how to deploy systems based upon Intel Trusted Execution Technology (TXT) as well as error-correcting RAM, only to find that this is supported in just a handful of their highest-end server platforms. This nonsense is in stark contrast to AMD, who support ECC and Secure Virtual Machine (SVM) on every chip they sell. (Some vendors do not support SVM in the BIOS, but many do.) Intel thinks that we want McAfee anti-virus as part of vPro. Ha. I'd settle for ECC and TXT across all product lines.
Windows done right
A multi-processor machine with all inclusive chips, each running its own program/window, now that's a laptop.
Done with a program, window closes, chip stops using energy.
Except for storage access, theoretically, nothing should run slower than the minimum the processor in use allows.
Please note, I am not proposing that a processor can't run more than one program, only that functional dependency should be the rule.
The advantage vPro has is that it's free to use (with Intel's provided software). However, these "business class" machines are usually more expensive and less powerful (cheaper CPU, less RAM) than a comparable "consumer" PC. The cost difference may be offset by support requirements; however, things such as remote patch installation and remote KVM can be accomplished with a proper WSUS setup and a VNC-style system in place. Granted, you don't get boot-screen KVM capability, but it's fairly rare (in my experience) that the OS won't boot at all. Usually it's just the garden-variety user-environment virus (you don't give end users admin privs, I hope!) that can be wiped by booting into safe mode or (hopefully) caught by your enterprise AV/malware program.
In all, I think end-users would be more satisfied with a more powerful machine, properly set up and configured, than with a vPro-enhanced system. The IT staff would appreciate it as well, as they wouldn't have to field "my computer is running slow" calls nearly as much, and it would potentially lengthen the computer refresh cycle by a good six to twelve months.
Zeke's Law #376
...Computers are smarter than management.
211... When you can't afford barely good enough.