4 GB: The next RAM barrier

Back in the early days of DOS, we had a memory barrier of 640 kB. I know, it seems quaint now, something you would find in the chipset of an audio greeting card rather than a real computer, but we spent a lot of time juggling applications to fit in that space. We had special hardware cards that could address more memory, and swapping programs (remember Quarterdeck?) that let us run bigger apps. (And those of us who are really old even remember the 64 kB barrier of the earliest Apple // computers!)

Now we are approaching another memory barrier, only this time it is 4 GB. That is the most memory a 32-bit processor can directly address. It is a problem particularly for servers, and it gives me an eerie sense of déjà vu all over again.
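
For the curious, the math is simple: a 32-bit pointer can name 2^32 distinct byte addresses, and 2^32 bytes is exactly 4 GB. Here is a quick back-of-the-envelope C sketch (nothing vendor-specific, just the arithmetic) that also reports whether the binary running it was built as 32- or 64-bit:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* A 32-bit pointer can hold 2^32 distinct byte addresses. */
        uint64_t max_bytes = (uint64_t)1 << 32;
        printf("32-bit address space: %llu bytes (%llu GB)\n",
               (unsigned long long)max_bytes,
               (unsigned long long)(max_bytes >> 30));
        /* sizeof(void *) is 4 bytes on a 32-bit build, 8 on 64-bit. */
        printf("pointer size on this build: %zu bytes\n", sizeof(void *));
        return 0;
    }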

Four gigs seemed like a lot of memory just a few years ago. We didn’t really need to worry, and our desktop operating systems seemed comfortable inside it. Then Microsoft got greedy with Vista, RAM got much cheaper, and apps got bigger. Before we knew it, we were once again running out of headroom.

What is driving these bigger applications is the popularity of both virtualization and database servers. Virtualization is especially memory-intensive. If you want to take advantage of this technology, you have to bulk up your machine with lots of memory and disk. And the more RAM you throw at database servers, the happier they are.

Another big consumer of RAM is the video card, particularly in the way it interacts with system memory. Some cards share their memory space with the PC, which means that when you are running graphics-intensive operations, you take away some of that RAM from all your applications. Again, we’ve heard this tune before. And most of us haven’t really paid much attention to the video card in our servers, because we didn’t think it needed much horsepower there. After all, we weren’t planning on running GTA4 on our servers, right?

There are solutions: run the 64-bit versions of Windows, Linux, or even the Mac OS, all of which can address memory beyond 4 gigs quite nicely. This is nothing new on the Mac or Linux side, which have had 64-bit OSs for many years. Indeed, if you go back to the early 1990s, we had DEC Alphas, Silicon Graphics’ IRIX, and all sorts of workstations with 64-bit processors and 64-bit OSs. Some apps, such as Microsoft Exchange 2007, are now only available in 64-bit versions. Others, like Oracle 11g, are still available in both 32- and 64-bit versions.
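
If you want to see the barrier firsthand, here is a rough little C test, purely illustrative: it tries to grab a single 5 GB buffer, something no 32-bit process can do. (Whether the 64-bit version succeeds also depends on how much memory the OS is willing to commit, so treat it as a sketch, not a benchmark.)

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* A 32-bit process cannot even express a 5 GB request:
           size_t is only 4 bytes wide there. */
        if (sizeof(size_t) < 8) {
            puts("32-bit build: a 5 GB allocation is off the table");
            return 1;
        }
        size_t five_gb = (size_t)5 << 30;   /* 5 * 2^30 bytes */
        char *buf = malloc(five_gb);
        if (buf == NULL) {
            puts("64-bit build, but the OS declined to hand over 5 GB");
            return 1;
        }
        puts("got a single 5 GB buffer -- welcome to 64-bit land");
        free(buf);
        return 0;
    }

Compile it once as a 32-bit binary and once as a 64-bit binary and you will see the difference immediately.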

The problem is with Windows, and particularly with finding the right 64-bit drivers for these machines. Rewriting drivers isn’t sexy stuff, and it is generally the province of some very talented coders who are dedicated enough to stick with the project. One engineering manager I spoke to told me it took his team six months to rewrite his driver set, and it wasn’t a fun six months at that. “Microsoft’s driver signing requirements are intense,” he told me. “And at the time we were engaged with them, they were adding and changing tests during the process without informing us, which increased the dev cycles and cost.”

This driver issue is tricky, because you don’t usually think about all the drivers you will need to upgrade when you are looking at your server portfolio, and generally you don’t know what you need until you install a test machine and see what isn’t supported. Then the fun begins.

So take some time to plan out your strategy if you are running out of RAM. Take a closer look at the new 64-bit version of Windows Server 2008, and whether it will run on your existing hardware. And while you are at it, look at Apple’s Xserve too: it might be a lower-cost way to run all those virtual machines on a true 64-bit platform.

(This appeared in Baseline Magazine this week.)

4 thoughts on “4 GB: The next RAM barrier”

  1. David,

    Last year we upgraded our production database servers to some of the “quad quads” — 4 processors per machine, 4 cores per processor. And that made a huge performance gain for us. A few weeks ago we upgraded server RAM significantly — and the performance gains we are seeing on poorly performing queries are huge (mainly because we can now keep 2x as much data in cache, reducing the number of expensive reads from the disk).

    Some of our developers are even finding interesting uses for large capacity RAM-drives to do test builds faster by doing them completely off-disk.

    All pretty cool stuff, but you are absolutely correct about the 32-bit universe: getting more than 4 GB to address correctly is a huge pain, and sometimes impossible.

    -Cary

  2. I spent much of my computer training working on old boxes (they didn’t trust me with the good stuff during school, lol), but what I’m trying to say is that we’ve come a long way from giant RAM plates to something that fits in our palm.

    P.S. We now have mobos that support 8 gigs (non-server boards 🙂 )

  3. Try http://www.appsense.com/products/performancemanager/

    AppSense Performance Manager includes a util called Physical Memory Control.

    Used normally in the corporate space, it works on most Windows-based platforms and helps reduce the RAM used by your apps; apps can be included or excluded as needed.

    When enabled on my Windows XP machine, IE only takes 12 MB instead of 26 MB!

  4. Should I mention the fact that you are probably not going to put this massive 8 GB of RAM into old hardware that requires these drivers?

    Software that’s not compatible with 64-bit, on the other hand, can be a pain, especially in the Windows space. However, if you have recent hardware, or even hardware just a few years old, you shouldn’t have trouble going above 4 GB. I’ve got more than 4 GB, and it’s a desktop machine.
