ITExpertVoice: Understanding Microsoft’s Server Roadmap

While Windows 7 is getting all the attention, especially here at ITExpertVoice, Microsoft has a few other irons in the fire and has been hard at work updating its rather extensive server line. Some of the new technologies in its latest desktop are slowly finding their way into its Windows Server series of products. Here is a roadmap to understand what is new on the server side of things, how the products fit together, and how they make use of Windows 7.

Microsoft provides five dizzying ways that you can take a closer look at its servers. Many of these products have free trial versions that you can download, some for 30 or 120 or even 180 days before you have to purchase a real license. Others are set up on Microsoft-hosted sites that you can experiment with using just your Web browser. And some even have Virtual Hard Disk images (VHDs) that you can download and then run on a Hyper-V server to set up your own test network of virtual machines. There is also a series of “Virtual Labs” on MSDN where you can watch videos and be guided through the products. No membership is required, but you will need IE and XP to run the lab software. Finally, Microsoft is also beginning to make Amazon Machine Images available on Amazon’s cloud-based services so you can set up your own test networks there.

ITworld: The new changes to Microsoft Windows 2008 Servers

Microsoft has some new additions to its Windows Server 2008 line, but sadly it needs a better naming convention to make the versions easier to keep track of. If you haven’t looked at this operating system since it was introduced in February 2008, now is the time to take a closer look and try it out.

This article in ITworld reviews the latest R2 SP1 version of Windows Server 2008 as well as previous versions, and what is new and notable.

Google vs. China, our first cyber war

Last week we witnessed the first Cyber War, but it didn’t go down quite as many of us expected. Instead of a group of anonymous hackers trying to take over thousands of infected PCs or trying to cut off access to critical infrastructure, we saw Google fire the first salvo in its war against Chinese censorship by moving its servers to Hong Kong.

The more I thought about this, the more I realized that this was war, declared by a private company on a nation state. Just because Google doesn’t have its own army (yet), and no actual physical weapons were fired, doesn’t make it any less of a battle. And it is only going to get worse for all of us as other private firms realize that they need to take control over their servers and intellectual property. What is curious is how few companies signed up for the cyber equivalent of the coalition of the willing: GoDaddy was one of the few. Not Microsoft. Not Intel. No PC manufacturer of any shape or size.

Let’s face it. No one wants to declare war on China, whatever form that would take. Most of our PC hardware components are made there. More people are using the Internet in China than the entire population of the US, and that number is growing quickly, too. And while the breaches of several Google accounts had Chinese origins, getting accountability isn’t easy.

Coincidentally, while all this was going down I was reading a preview copy of Richard Clarke’s new book called Cyber War. I highly recommend pre-ordering a copy. Clarke was a national security advisor to several presidents and now teaches at Harvard’s Kennedy School.

The book is a chilling account of exactly what is wrong with our government and how unprepared we are for Cyber World War I. How so? Think of a Cyber War in terms of nuclear proliferation and the Cold War preparation. But unlike what we did in the 1960s to defend ourselves against possible nuclear annihilation, we are doing everything wrong for a cyber defense. Instead, we have made America more of a target, because so much of our infrastructure, our weapons, our culture, and our PCs are out in the open, ripe for the picking. Look at how easy it is to hijack a drone video feed, as a starting point (although the control systems are secured, for the moment). Clarke talks about various war game scenarios, and at one point he mentions:

“If you have a mental image of every interesting lab, company, and research facility in the US being systematically vacuum cleaned by some foreign entity, you’ve got it right. That is what has been going on. Much of our intellectual property as a nation has been copied and sent overseas. Our best hope is that whoever is doing this does not have enough analysts to go through it all and find the gems, but that is a faint hope, particularly if the country has, behind the filtration, say, a billion people in it.”

He mentions how computer professionals at the Hopkins Applied Physics Lab discovered a data breach back in 2009. The only way they could solve it was to disconnect their entire organization from the Internet and clean each PC, one by one. “If you are connected to the Internet in any way, it seems, your data is already gone [overseas].”

The problem is that the best defense in a Cyber War isn’t the best offense. Nope: it is hardening your connections. Look at what China has done with its “Great Firewall.” Most of us think this is to keep the porn and liberal thinking out of China. And yes, it does do that. But what is really going on is that in the event of a Cyber War, China can quickly pull the plug and disconnect from the world to defend itself. Try asking AT&T or Level 3 to do that here. Ain’t gonna happen.

Another part of the problem is that there is no one actually “tasked,” as they say in DoD-speak, with defending our power grid control systems, transportation networks, and so forth. Where are the cyber equivalents of nuclear strike forces in case someone hits one of these targets? Nowhere. DoD has its own ships, planes, and troops to worry about. Homeland Security is trying to keep shoe bombers and the like out of our skies. What is left is up for grabs. Call it the cyber gap. “Can a nation shut off its cyber connectivity to the rest of the world, or spot cyber attacks coming from inside its geographical boundaries and stop them?” China probably can. We can’t. In an odd twist of irony, the less developed a nation is, say Afghanistan or North Korea, the better defended it can be, because so little of that country’s resources are hackable. How many of our power grid control rooms have VoIP phones, bringing the Internet literally right to the desktop?

In the past, spies had a harder time of it. They had to physically copy plans, or data, or compromise an actual human being. Now, they can sit in their jammies and download entire manuals without anyone noticing.

When Obama was elected in the fall of 2008, Clarke was an advisor to the transition team. He asked everyone on the team to stop working on their home PCs and even provided brand new Apple MacBooks that were locked down so they couldn’t connect to the public Internet. When users complained that they couldn’t get on public Wi-Fi networks, he “tried to quietly point out that if you are a senior member of the informal national security transition team, you probably should not be planning the takeover of the White House from a Starbucks.” Gulp.

That is the problem. We are too used to our connectivity, and have gotten too complacent with our computers. A lot remains to be done. You have been warned.

Is Google the next evil empire?

The news last week that Italian authorities have convicted three Google executives of criminal privacy violations got my attention for two reasons. One, the charges are based on a video that shows an autistic boy being bullied, a video that Google did not create or post. It was filmed on cell phone cameras and posted more than three years ago, and indeed one of the executives has since retired from Google. Two, none of the three live or work in Italy, and a fourth executive, a product manager, was acquitted. We truly live in a global village, and one in which the legal machinery moves slower and slower. As someone who was bullied as a child, I get this, although I am not sure that justice was really served here.

This case comes on top of the company’s missteps with Buzz, where it had to alter the default privacy settings after a rather embarrassing launch and lots of fanfare.

Has Google become more evil, or is it just the contentious times we live in that make this sad state of affairs possible? One thing is clear, though: Google is becoming bigger and buying more and more companies with products or services that I use. Picnik (online photo editing) and Etherpad (online real-time document collaboration) are just two of the more recent acquisitions. The Etherpad acquisition was also a bit troubling: the company first announced it was turning off the service, then had to restore it after numerous complaints.

I still think the vast majority of people at Google adhere to the company’s ten founding principles, which is more than I can say for my dealings with Microsoft over the years. Certainly both companies are hyper-competitive. But the very nature and pervasiveness of Google’s online services makes it more pernicious, with a greater potential for abuse, as the recent news indicates. But it also means that the company can turn more quickly when it makes a mistake: the Etherpad issue was resolved in a day or so. Imagine Microsoft trying to do that. Indeed, try finding something similar to this document on Microsoft’s Web site: you will find a lot of corporate doublespeak, rather than the plain-spoken “Ten Things” that Google professes.

While all this was going down in Italy, I was reviewing what information Google has stored on me in Google Accounts. If you haven’t had a look at your “dashboard” lately, it is instructive to see exactly what Google can track about you. In my case, I use a ton of different Google products, and recorded for posterity is the following:

  • My most recent posts to my Blogger blogs
  • What items Google Alerts has located that mention my keywords
  • The three people I most often email in my contact list
  • The most recent Google Docs that I have edited, and how many of them have supposedly been “trashed” but are still accessible
  • My complete Google Chat history of more than 1500 conversations
  • The photos stored in Picasa, fans and favorites included
  • My history of calls made on my Google Voice account
  • My most recent Web browsing history, including search terms, images downloaded, maps visited and news items read
  • And there are 12 other Google products that aren’t yet tracked here, including AdSense, Knol, and Groups.

You get the picture: there is a lot you can learn about me when you scroll through all this data, and a lot that I would prefer remain private. All it takes is someone to guess a single password, too. That is scary, and I hope that “do no evil” thing is still very much in force in the years to come.

ITExpertVoice.com review of Zinstall

If you are looking for in-place migration of XP desktops, you could use Laplink’s PC Mover. But if you want to be able to preserve your XP desktop and switch back to it when you need to run an application that doesn’t work on Windows 7, then you should consider Zinstall’s XP7. It creates an XP virtual machine with all of your old apps and files that is just a mouse click away.

This sounds a bit like what Microsoft supports with its XP Mode for Windows 7, but not quite. The problem, as you can see from this Web page on Microsoft’s site, is that XP Mode is only supported on a limited set of CPUs that have hardware virtualization (“V-chip”) support. You also need to reinstall an entire XP desktop on the virtual machine from scratch.

Zinstall works by taking the “Windows.old” directory that the Windows 7 installer creates and using it to rebuild your original XP desktop. It is a neat trick, and I really wanted it to work. But no matter how many times I tried, I couldn’t get a stable machine from the product, so I can’t recommend Zinstall until the company does some additional quality control.

If you want to experiment, make sure you use a drive imaging tool (I use Acronis or Symantec’s Ghost) to create a backup copy of your XP desktop first. Next, you need to disable your firewalls and uninstall any anti-virus software. Now you install Windows 7, making sure to boot from the install CD and choose the custom install option, which copies your old Windows OS and all your applications into that “Windows.old” directory.

Once that is done, you can start up Windows 7 and install the Zinstall software. Zinstall actually supports two different migration scenarios: besides the in-place one, the other is to migrate between two different computers. Choose the “only have this PC” option, indicate that you are doing an in-place migration, and then hit the big GO button, as you can see in the screen shot below.

The process will take anywhere from several minutes to an hour to complete, depending on how large a hard drive you have. Speaking of which, you want to make sure that you have plenty of extra room to install Windows 7 as well as the working copies of Zinstall’s files. I would estimate a spare 30 or 40 GB should be enough. You can filter out particular files that you don’t want to migrate, such as videos and MP3s, if you are tight on space.
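If you want a quick sanity check on free space before you start, a short script can do it. Here is a minimal sketch in Python; the 40 GB threshold is just my rough estimate from above, not an official figure from Microsoft or Zinstall:

```python
import shutil

# Rough pre-flight check before an in-place Windows 7 + Zinstall migration.
# The 40 GB figure is only a ballpark estimate, not a vendor requirement.
REQUIRED_FREE_GB = 40

def enough_room(drive="C:\\", required_gb=REQUIRED_FREE_GB):
    """Return True if the drive has at least required_gb gigabytes free."""
    usage = shutil.disk_usage(drive)
    free_gb = usage.free / (1024 ** 3)
    print("%s has %.1f GB free (want %d GB)" % (drive, free_gb, required_gb))
    return free_gb >= required_gb

if __name__ == "__main__":
    if not enough_room():
        print("Tight on space: consider filtering out videos and MP3s.")
```

If the check fails, that is your cue to use Zinstall’s file filters or clean up the drive before you kick off the migration.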

Once this process is done, you can switch back and forth between XP and Windows 7 by clicking on an icon on the taskbar. Booting up your XP desktop will initially take some time, since you are loading a new VM, but after that, switching between the OSs takes a second or two. If you have used VMware or something similar, this will feel very familiar. Your existing XP desktop is left unchanged, with all of its files and applications preserved, including apps that may not run under Windows 7. These aren’t migrated to Windows 7; you have to install new apps there just as you would for any new OS install. This differs from PC Mover, where you give up your older XP machine and migrate it completely over to Win 7. You can even view and access the files on the other OS, again by clicking on the taskbar icon.

Too bad this wasn’t quite my experience. I began this review by trying to migrate the oldest PC that I had in my office, an old XP machine without any service packs. I couldn’t get the migration to complete without errors, and I wasn’t sure if it was because of my three drive partitions, an unused video driver for a card that I no longer had in the PC, or some other gremlin. Next I set up my Dell Dimension desktop with a virgin copy of XP with SP2, and got a fresh version of Windows 7 installed on top of it. The Zinstall setup worked just fine until I tried to reboot the PC; then I somehow trashed the master boot record, so all my efforts for the day were lost. After I jiggled my BIOS battery, I was able to get a working drive again and could start breathing normally once more.

Windows 7 networking controls video screencast

This week I begin a new series of video screencasts for Dell’s IT Expert Voice Web site. The site has all sorts of useful information for corporate IT folks who are interested in migrating to and using Windows 7, and my humble part will be to produce a regular series of videos similar to what I have been doing on my own over at WebInformant.tv. Do check out this video, which talks about the differences between Windows 7 and earlier versions when it comes to networking controls.

When did the browser become the next OS?

“We view the Internet as the fourth desktop operating system we have to support, after Windows, MacOS, and DOS.” That quote came from an executive at McAfee, and the mention of DOS gives away that it was spoken back in 1996.

With the announcement that Google will develop a quick-start operating system for instant-on netbooks by next year, I thought it might be interesting to take a trip down memory lane and remind ourselves how we got to the point where the browser has become the next OS, probably now moving into first place rather than fourth.

Of course, the smarmy retort to Google’s announcement is that we already have a quick-start, ultra-reliable Web OS: it is called OS X, and my MacBook takes about five seconds from when I open the lid to when I can be surfing the Web. Unlike many Windows PCs, I don’t have to have a degree in advanced power management techniques with a minor in spam and virus prevention to get this to work.

But let’s go into the WayBack Machine to the early 1990s and see the context of that McAfee quote.

The first Web browsers literally weren’t much to look at, because they only displayed characters, basically just a page of hyperlinked text. The most notable was the then-popular Lynx, initially designed back in 1992 for Unix and VMS terminal users (that was back when we called them that). Think about this for a moment: this was pre-iporn, pre-IPO Netscape, pre-real Windows, when the number of Web servers was fewer than a few hundred. Not very exciting by today’s standards.

Then Microsoft got into the game, and things started changing. With the introduction of Windows 95 we had the beginnings of a graphical Internet Explorer, which ironically traced its lineage back to the same Mosaic browser whose developers went on to create Netscape (and, much later, Firefox). Windows 95 came with both IE and Windows Explorer, and the two were similarly named for a reason: browsing pages of the Web was the beginning of something similar to browsing files on your desktop. Things didn’t really get integrated until IE v4, which came out about the same time as Windows 98, and they were so integrated that they begat a lawsuit by the Justice Department. Microsoft was legally declared a monopolist, and under the settlement approved at the end of 2002 it had to offer ways to extract IE from Windows going forward for users who wanted to install a different browser.

During the mid-1990s, we began to see better support for TCP/IP protocols inside the Windows OS, although it really wasn’t until the second edition of Windows 98 that Microsoft had improved the browser enough to include it as part of its Office 2000 product. Before then, we had separate drivers and add-on utilities that required all sorts of care and feeding to get online, in addition to the AOL and CompuServe dial-up programs.

As an example of how tightly integrated IE was with Windows, when Microsoft released IE v7 along with Vista, you initially needed to verify that your Windows license was legitimate before you could install the latest version of IE on earlier operating systems. That restriction was later removed.

And lately Microsoft has announced that its next version, Office 2010, will have even further Web integration and the ability to create online documents similar to the way Google Docs works. Google Docs is an interesting development in itself, because now documents are stored outside of the desktop and managed through a Web browser. As long as I have an Internet connection, I don’t need any software on my local machine to edit a document or calculate a spreadsheet.

So what is the real purpose of an operating system? Originally, it was to manage the various pieces of your PC so that your applications could talk to your printer or your hard drive, or display characters on your screen, without your having to write low-level programs to do these tasks. Three things have happened since the early PC era:

First, as the Web and cloud computing became more powerful, we stopped caring where our information is located. In some sense, having documents in the cloud makes it easier to share them across the planet, without having to worry about VPNs, local area network file shares, and other things that get in the way. We even have cellphones like the Palm Pre that have a Web OS built in, so that applications don’t have to be downloaded to the phone but can run in the cloud. At least, that will be the case once developers finally get their kits to build these Pre apps later this summer.

Second, as the desktop OS matures, we don’t have to worry about the underlying hardware as much, because that hardware has gotten more generic and the OS has taken on a bigger role (to match its bigger footprint, too). Although printer drivers are still scarce for Vista, and 64-bit apps aren’t as plentiful, for the most part we don’t need a “thick” desktop OS. Yes, there are enterprise apps that need the OS, and some that need a specific version of Windows too, but most of our computing can be done without really touching much of the OS.

Finally, the browser is the de facto Windows user interface. Perhaps I should say the browser plus Ajax or the browser plus Flash. But most applications that were formerly client/server now just use browser clients, or run inside a browser with minimal desktop downloads. This has been long in coming, but now Rich Internet Applications can be considered on par with local Windows and Mac ones.

So here we are, at the dawn of the new Google OS. We have come full circle: from the green-screen character mode terminals of the mainframe and Unix era to the browser-based Webtops of the modern era. This doesn’t mean that Windows 7 or 8 or whatever will become obsolete. Just less important. And given the multiple billions of dollars that Microsoft has made over the years from Windows (and let’s not forget dear old DOS), you can imagine that there are some nervous folks up in Redmond these days.

One lesson learned from Google App Engine failure

If you are going to rely on cloud computing, make sure your vendor has a completely independent infrastructure set up to notify you when its cloud service fails. Google didn’t when its App Engine failed last week, and as far as anyone knows, it still doesn’t. As Google’s own explanation of the outage put it:

“The System Status site is hosted separately from App Engine applications, and is not typically affected by availability problems. However, due to the low level problem with GFS [Google File System] in this case, the System Status site was also affected.”

Oops.
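The takeaway for anyone relying on cloud computing: run your own health check from infrastructure the vendor doesn’t control. Here is a minimal sketch in Python of the idea; the URL, mail server, and addresses are hypothetical placeholders, not anything that Google or App Engine actually provides:

```python
import smtplib
import urllib.request
from email.message import EmailMessage

# Poll your own application (not just the vendor's status page) from a machine
# that does not share the vendor's infrastructure. All names below are
# placeholders for illustration only.
APP_URL = "https://example.appspot.com/healthz"
ALERT_FROM = "monitor@example.com"
ALERT_TO = "oncall@example.com"
SMTP_HOST = "smtp.example.com"

def app_is_up(url=APP_URL, timeout=10):
    """Return True if the app answers with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

def send_alert(message):
    """Send a plain-text email alert through an independent mail server."""
    msg = EmailMessage()
    msg["Subject"] = "Cloud app health check failed"
    msg["From"] = ALERT_FROM
    msg["To"] = ALERT_TO
    msg.set_content(message)
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)

if __name__ == "__main__":
    if not app_is_up():
        send_alert(APP_URL + " is not responding; the vendor's status page may be down too.")
```

Run something like this from a cron job on a box outside the vendor’s network, and you won’t be depending on the very cloud that just failed to tell you that it has failed.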