Enterprise computing's 64-bit era arrives
A vast infrastructure upgrade is taking place around us, without the clatter of jackhammers, the roar of bulldozers, or even a cent of new government spending.
The upgrade is being led by top technology vendors, businesses of all sizes, and thousands of individual buyers.
It is a quiet evolution of bits and bytes, yet it could form the foundation for the next revolution in computing performance and scalability: The 64-bit era.
For decades, the size of the internal “words,” or codes, that computers use to receive, process, store, and access information has steadily grown. From its basic two-bit beginnings, representing the simplest calculating device that could be considered a computer, a computer’s architecture has repeatedly had its word length doubled, measured in data bits, to process more sophisticated operations and increase system capacity.
Thanks to the mathematical principles underlying digital computing, each doubling of word length does not merely double capacity; it squares the number of values a word can represent. That progression leads to today’s advanced 32-bit architecture, which, compared to the simplest two-bit computers, represents more than a billion-fold increase in the number of values that can be expressed within processors, memory, storage devices, and operating systems.
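To make that arithmetic concrete, the short sketch below (plain Python, used here purely as a calculator) checks the figures cited in this article: an n-bit word can represent 2^n distinct values, so each doubling of word length squares the count, and a 32-bit address space tops out around four gigabytes while a 64-bit address space reaches into the exabytes.

```python
# An n-bit word can represent 2**n distinct values, so each doubling
# of word length squares the count rather than merely doubling it.
for bits in (2, 4, 8, 16, 32, 64):
    print(f"{bits:>2}-bit word: {2**bits:,} distinct values")

# The same arithmetic bounds addressable memory (one byte per address):
print(f"32-bit address space: {2**32:,} bytes")   # ~4 GiB
print(f"64-bit address space: {2**64:,} bytes")   # ~16 EiB, far beyond today's hardware
```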
But the ever-increasing workload put on modern computers has begun to challenge the capabilities of even this well-entrenched standard. Now IT managers and technology vendors are taking the next architectural step, 64-bit, to prepare for the emerging and future demands being placed on computers.
What is driving the 64-bit migration? In a word, memory—and specifically, the ability of 64-bit systems to use a lot more of it. According to one expert quoted by InfoWorld, making larger memory available “will have a big effect on any application that needs to store large amounts of data in memory. Databases, e-mail servers, collaboration software . . . will benefit from the move to 64 bits.”
32-bit architecture has served as a de facto industry standard since the 1960s. In the early years of this era, memory and storage were expensive, and no one could envision needing more than the maximum four gigabytes of memory that 32-bit systems (then largely limited to large businesses and scientists) could address.
Even then, some supercomputers used 64-bit registers (processor storage spaces), but mainstream business computers and applications were still tied to the 32-bit standard.
As data sets got bigger, and enterprise computing demands increased, a new 64-bit architecture began to emerge in the mid-1990s, led by high-end workstation and server machines from HAL Computer Systems, Sun Microsystems, IBM, Silicon Graphics, and Hewlett Packard.
Consumer electronics products got in on the 64-bit act even before personal computers did, in the form of the Nintendo 64 and the PlayStation 2.
From 2003 onward, the 64-bit hardware train gained momentum with the introduction of 64-bit processors including IBM’s PowerPC 970, 64-bit extensions to AMD’s x86-family products, and Intel’s Itanium.
Even as 64-bit hardware hit the market, however, the enterprise-level operating systems and applications of the 1990s were not yet ready to take full advantage of it.
32-Bit Barriers, 64-Bit Possibilities
According to Steve Kinney, senior architect and development manager at enterprise content management maker Perceptive Software, large organizations with heavy computing requirements test the limits of all types of enterprise software compiled for the 32-bit standard. Regardless of the speed or number of processors they run on, heavily utilized 32-bit applications running under 32-bit operating systems face inherent performance barriers based on the number of operations possible per clock cycle and the maximum amount of addressable hardware memory the architecture allows—about four gigabytes.
In fact, says Kinney, “The addressable memory limit of 32-bit applications running under 32-bit operating systems forces developers to make tradeoffs between memory usage, performance, and scalability.”
Transitioning to a 64-bit architecture rebalances that equation, he says, ultimately giving developers better options for optimizing both performance and scalability far into the future.
For example, developers of 64-bit applications can use the additional memory to cache performance-critical data and reduce overall system latency. That can mean real-world performance gains for systems under heavy processing loads.
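As a rough illustration of that caching pattern (not Perceptive Software’s code; the record-loading function, file path, and cache size below are hypothetical), a 64-bit process with memory to spare can simply keep hot data resident instead of re-reading it on every request:

```python
from functools import lru_cache

def load_record_from_disk(record_id: int) -> bytes:
    # Hypothetical expensive lookup; in a real system this might be a
    # database query or a document fetched from storage.
    with open(f"records/{record_id}.bin", "rb") as handle:
        return handle.read()

# A generously sized in-memory cache is practical in a 64-bit process;
# a 32-bit process would have to keep this small to stay under ~2 GiB.
@lru_cache(maxsize=1_000_000)
def load_record(record_id: int) -> bytes:
    return load_record_from_disk(record_id)
```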
Engineered for Growth
The 64-bit standard points the way not only to superior performance with today’s most advanced hardware and software, but also to a future in which performance scales to hardware with capacities so high that they are currently only theoretical. (See sidebar below.)
Key tools have supported the transition to 64-bit. Processor-based “compatibility modes” and “emulation” allow 32-bit applications to run under 32- or 64-bit operating systems on 64-bit processors.
While these may be convenient tools for environments in which true 64-bit applications need to run alongside 32-bit applications that have not yet been recompiled, Steve Kinney points out that such workarounds lose the inherent performance benefits of 64-bit processors and operating systems.
“For optimum performance and nearly limitless scalability, the only valid strategy for 64-bit success is a complete combination of 64-bit processors, operating systems, and applications,” Kinney says.
According to Tony Clark, team lead for platform technology at Perceptive Software, the long-awaited combined availability of 64-bit processors, operating systems, and applications is finally at hand, giving organizations with heavy computing requirements an option to scale large enterprise systems without losing performance.
However, Clark cautions that 64-bit architecture is not a universal speed-up elixir. “Client-server applications, in particular, are sensitive to server bottlenecks,” he says. “While 64-bit versions of those applications may not provide noticeable benefits to individual clients in environments where they’re competing for server time with a limited number of other users, 64-bit can help IT managers maintain a responsive user experience and high system throughput, even as the number of connected nodes dramatically increases.”
Clark says the rise of the architecture in large-enterprise settings, and the need of these organizations to run a “pure” 64-bit shop without emulation or compatibility modes for 32-bit programs, is driving software developers to recompile applications for 64-bit infrastructures.
Even now, however, the unavailability of some 64-bit device drivers, an important piece of the operating system puzzle needed to take advantage of the hardware add-ons unique to each customer, could prevent comprehensive 64-bit conversion in some settings.
Clark also points out that multithreaded applications, those capable of running across multiple processor cores, will be especially well positioned for newer generations of 64-bit processors with ever-increasing core counts. In practical terms, that combination will fortify each server to support dramatically growing user loads without slowing down . . . a bit.
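A minimal sketch of that multi-core pattern (the handle_request function is illustrative, not drawn from any product) shows why such applications scale with core count: independent requests are simply farmed out to as many workers as the processor provides.

```python
import os
from concurrent.futures import ProcessPoolExecutor

def handle_request(request_id: int) -> str:
    # Illustrative stand-in for real per-request work such as database
    # queries or document rendering.
    return f"request {request_id} handled by process {os.getpid()}"

if __name__ == "__main__":
    # One worker per core: as newer 64-bit processors add cores, the same
    # code absorbs larger user loads without modification.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        for line in pool.map(handle_request, range(16)):
            print(line)
```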
The 64-Bit Advantage
No matter how much RAM your current system has, the two-gigabyte limit on memory per application process imposed by typical 32-bit operating systems remains a barrier to optimum performance under the most demanding enterprise computing loads.
Large data sets, common in database-intensive, multiuser applications, must be constantly managed by the processor, application and operating system to stay within the two-gigabyte area of addressable memory where calculations are performed. In most cases, a single application does not have access to a full two gigabytes of physical memory, so an alternative approach to memory allocation is needed. This approach, called “virtual memory,” combines physical RAM with hard disk space into a data storage block that appears as a seamless entity to the application but is relatively slow compared to performing the same operations completely within physical memory.
64-bit processors and operating systems vault 64-bit applications far over the two-gigabyte wall that confines current 32-bit applications. Current 64-bit operating systems raise addressable memory to two terabytes, a thousand-fold increase over that per-process limit, enabling many more calculations to be performed solely in high-speed physical memory. The end result is less reliance on the relatively slow disk-based input/output operations of virtual memory, and sustained enterprise application performance even in highly scaled settings.
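A quick back-of-the-envelope check of the sidebar’s figures (plain Python; the two-terabyte ceiling is the operating-system limit cited above, not a property of 64-bit hardware itself):

```python
GIB = 2**30
TIB = 2**40

per_process_32bit = 2 * GIB    # two-gigabyte per-process limit under 32-bit
addressable_64bit = 2 * TIB    # two-terabyte limit cited for current 64-bit operating systems

print(addressable_64bit // per_process_32bit)   # 1024 -> the "thousand-fold" increase
```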