This week I begin a new series of video screencasts for Dell’s IT Expert Voice Web site. The site has all sorts of useful information for corporate IT folks who are interested in migrating to and using Windows 7, and my humble part will be to produce a regular series of videos similar to what I have been doing on my own over at WebInformant.tv. Do check out this video, which talks about the differences between Windows 7 and earlier versions when it comes to networking controls.
When did the browser become the next OS?
“We view the Internet as the fourth desktop operating system we have to support after Windows, MacOS, and DOS.” That quote was from an executive at McAfee, and the mention of DOS gives away that it was spoken back in 1996.
With the announcement that Google will develop a quick-start operating system by next year for instant-on netbooks, I thought it might be interesting to take a trip down memory lane and remind us how we have gotten to the point where the browser has become the next OS, one that is probably now moving into first place rather than fourth.
Of course, the smarmy retort to Google’s announcement is that we already have a quick-start, ultra-reliable Web OS: it is called OS X, and my MacBook takes about five seconds from when I open the lid to when I can be surfing the Web. Unlike many Windows PCs, I don’t have to have a degree in advanced power management techniques with a minor in spam and virus prevention to get this to work.
But let’s go into the WayBack Machine to the early 1990s and see the context of that McAfee quote.
The first Web browsers weren’t much to look at, because they displayed only characters, basically a page of hyperlinked text. The then-popular example was Lynx, initially designed back in 1992 for Unix and VMS terminal users (that was back when we called them that). Think about this for a moment: this was pre-iporn, pre-IPO Netscape, pre-real Windows — when the number of Web servers was fewer than a few hundred. Not very exciting by today’s standards.
Then Microsoft got into the game, and things started changing. With the introduction of Windows 95 we had the beginnings of a graphical Internet Explorer, which ironically was built on code licensed from Spyglass Mosaic, a descendant of the same NCSA Mosaic whose developers went on to create Netscape (and eventually Firefox). Windows 95 came with both IE and Windows Explorer, and the two were similarly named for a reason: browsing pages of the Web was the beginning of something similar to browsing files on your desktop. Things didn’t really get integrated until IE v4, which came out about the same time as Windows 98, and they were so integrated that they begat a lawsuit by the Justice Department. Microsoft was legally declared a monopolist in 2000, and the settlement approved at the end of 2002 required it to offer ways for users to hide IE and set a different browser as the default going forward.
During the middle 1990s, we began to see better support for TCP/IP protocols inside the Windows OS, although it really wasn’t until Windows 98 Second Edition that Microsoft had improved the browser enough to include it (as IE v5) as part of its Office 2000 product. Before then, we had separate drivers and add-on utilities that required all sorts of care and feeding to get online, in addition to using AOL and CompuServe dial-up programs.
As an example of how tightly integrated IE was with Windows, when Microsoft released IE v7 along with Vista, you initially needed to verify that your license of Windows was legit before you could install the latest version of IE on earlier operating systems. That restriction was later removed.
And lately Microsoft has announced that its next version, Office 2010, will have even further Web integration and the ability to create online documents similar to the way Google Docs works. Google Docs is an interesting development in itself, because now documents are stored outside of the desktop and managed through a Web browser. As long as I have an Internet connection, I don’t need any software on my local machine to edit a document or calculate a spreadsheet.
So what is the real purpose of an operating system? Originally, it was to manage the various pieces of your PC so that your applications could talk to your printer or your hard drive or display characters on your screen without having to write low-level programs to do these tasks. Three things have happened since the early PC era:
First, as the Web and cloud computing became more powerful, we stopped caring where our information is located. In some sense, having documents in the cloud makes it easier to share them across the planet, without having to worry about VPNs, local area network file shares, and other things that get in the way. And we even have cellphones like the Palm Pre that have a Web OS built in, so that applications don’t have to be downloaded to the phone but can run in the cloud. At least, that will be the case once developers finally get their kits to build these Pre apps later this summer.
Second, as the desktop OS matures, we don’t have to worry about the underlying hardware as much, because that hardware has gotten more generic and the OS has taken on a bigger role (with a bigger footprint to match). Although printer drivers are still scarce for Vista, and 64-bit apps aren’t as plentiful, for the most part we don’t need a “thick” desktop OS. Yes, there are enterprise apps that need the OS, and some that need a specific version of Windows too, but most of our computing can be done without really touching much of the OS.
Finally, the browser is the de facto Windows user interface. Perhaps I should say the browser plus Ajax or the browser plus Flash. But most applications that were formerly client/server now just use browser clients, or run inside a browser with minimal desktop downloads. This has been long in coming, but now Rich Internet Applications can be considered on par with local Windows and Mac ones.
So here we are, at the dawn of the new Google OS. We have come full circle: from the green-screen character mode terminals of the mainframe and Unix era to the browser-based Webtops of the modern era. This doesn’t mean that Windows 7 or 8 or whatever will become obsolete. Just less important. And given the multiple billions of dollars that Microsoft has made over the years from Windows (and let’s not forget dear old DOS), you can imagine that there are some nervous folks up in Redmond these days.
One lesson learned from Google App Engine failure
If you are going to rely on cloud computing, make sure your vendor has a completely independent infrastructure set up to notify you when its cloud service fails. Google didn’t have one when its App Engine failed last week, and as far as anyone knows, it still doesn’t. As Google’s own explanation put it:
“The System Status site is hosted separately from App Engine applications, and is not typically affected by availability problems. However, due to the low level problem with GFS [Google File System] in this case, the System Status site was also affected.”
Oops.
Cool map of yesterday’s Google traffic foul-up
From Arbor Networks — they claim 5% of total Internet traffic is Google-related. That big gap was caused by routing errors on Google’s part.
Why Microsoft’s Hyper-V isn’t really gonna cut it
This is the console screen that you get when you run the bare-metal version called Hyper-V Server, the stripped-down version of Windows that allows you to run virtual instances on it, similar to VMware’s ESX. Notice the return to the days of DOS? And that’s after trying to find a NIC that has 64-bit drivers and can be recognized by the OS.
I have a dream
It seems fitting to resurrect an old column that I wrote seven years ago in honor of Dr. King’s birthday and the coronation activities tomorrow. I made a few changes, but it still works today.
Twenty-some years ago, the PC was invented and our desktops would never be the same. And now we must face the tragic fact that our desktops are still not free. Twenty years later, our lives are still sadly crippled by the manacles of frequent crashes and by numerous security problems. Twenty years later, we still live on a lonely island of poverty in the midst of a vast ocean of material prosperity. We are still languishing in the corners of American society and find ourselves exiles in our own technological land.
So I have come here today to dramatize an appalling condition. Windows has to go from our desktops. It is time for the ‘nixes (Unix, Linux and Apple’s OS X) to play a larger role, and for Microsoft to get with the program and fix this broken buggy whip.
I say to you today, my readers, that in spite of the difficulties and frustrations of the moment, I still have a dream. It is a dream deeply rooted in the American dream.
I have a dream that one day this nation will rise up and live out the true meaning of productivity. I have a dream, that all PCs will live up to their original marketing potential, and free their owners from the evils of Vista and frequent application crashes. I have a dream that one day our desktop PCs, sweltering with the heat of their overclocked CPUs, will be transformed into an oasis of freedom and reliable operations.
I have a dream that one day all of my applications will be able to sit down together at a table of brotherhood and play nicely on my PC, no matter what version of drivers and odd video adapter is inside my computer.
I have a dream that your and my children will one day live in a nation where they will not be judged by the version of operating system running on their desktop computer, but by the content of their work output on their hard disk.
I have a dream today.
This is my hope. With this faith we will be able to work together, to pray together, to struggle together, to stand up for freedom together, knowing that we will be free one day from having to reboot our computers every day, from crashed applications and inexplicable blue screens and error messages.
How I wish most of us could free ourselves from the tyranny of Windows and have a desktop operating system that didn’t crash frequently, could support our legacy applications, was easy to install, and wasn’t a security sinkhole. Dream on. When I wrote this back in 2002, XP was brand new, Vista didn’t exist, and Mac OS X was just finding its legs. Even with these newer operating systems, we still have a very unstable OS, driver issues (still) with Vista, and more security issues by the week.
But a guy can dream, can’t he?
Back to the future with Windows NT
To start off the new year right, I decided to go back in time and see what I could learn from running an ancient (by computing standards, anyway) operating system and software combination, both to appreciate how far we have come (or not) and to see whether I could actually get real work done. The idea came about from some conversations that Jason Perlow and I had. Jason is a fellow blogger and IT consultant who now works for IBM. He and I at one point in our lives (although not at the same time) lived in Port Washington, N.Y., and spent a lot of time with OS/2, but don’t let that influence you.
I picked NT v4 as my starting place. This operating system is more than ten years old, and was probably the last OS Microsoft created that had some real simplicity to it. As an indication of its staying power, there are still many corporate servers running it, even though Microsoft has tried to stamp it out, turn off support, and push people to upgrade to something more recent. To get around the driver issues and other challenges, I decided to set up a virtual machine running NT, and I am using VMware’s Fusion on my Mac (just to make it interesting).
Jason and I have the hypothesis that the OS doesn’t really matter anymore, and that if you can get beyond some of the issues with running older software and applications, you may find that an older OS is perfect for your needs. We also thought that running an ancient OS was a good way to see how far we have come with newer computers, and perhaps a way to extract some additional performance because the older OSs are smaller and theoretically could run faster on the newer PCs.
To get NT working properly, you need to find versions of software, either online or in someone’s attic, that are not so old as to be useless. First off, I had to install Service Pack 6, and I needed the right version of the SP for the encryption level of the OS. You then install the VMware Tools software, which supplies the drivers to get the most out of your machine. Then you install Microsoft Office 2000, which is the most recent version of Office that will run on NT. I messed up by installing the Tools package after Office, and VMware didn’t like that. Office 2000 has the unfortunate side effect of updating your NT version with an almost-working version of Internet Explorer v5. The reason I say almost-working is that you need another piece of software called the Windows Installer to get other software installed on this machine. I couldn’t get past this point, however.
I also put the Firefox v2.0.0.20 browser on the machine, which is a fairly recent version of the browser, but apparently not recent enough, as I had some problems with certain Web sites. I had to update my Adobe Flash plug-in too. Finally, I added AIM v5.9, an older version of the Instant Messenger software. Skype doesn’t have any version that will run on NT, which is too bad.
What I found was that the VM version of NT was pretty snappy. It would boot from scratch in under 30 seconds, and faster still from the suspended VM state. I liked the old-fashioned windows, the lack of glitz, and the raw simplicity of the controls. No Aero Glass junk for this OS! Another plus with using VMs is that you don’t have to worry about personal firewalls and anti-virus as much – you can set up a protected environment and keep it isolated from your host machine, which is good because most AV programs stopped supporting NT a long time ago.
All of my Office documents – some created on Macs, some on Windows – came up just fine in Office 2000, because I am not using the 2007 version that introduced a new file format incompatible with the older versions. Shame on you, Microsoft – and I know from hearing from some of you how vexing that version could be.
The other thing I noticed is how important the browser is to today’s computing world: if you aren’t willing to stay current with your browser, you quickly get into trouble with many Web sites. The coming of IE v7 is a good case in point, and I know there will be a lot of grief to be had on both ends – the people who adopt the new browser and find sites that don’t work in it, and the sites that want to use its new features and piss off the people who haven’t upgraded yet.
I will have more to report on this experiment as I spend more time back in NT land. For those of you who want to try this on your own, email me privately and I will give you more specific tips.
Making Sense of Microsoft’s Azure (Infoworld)
Last week Microsoft announced its cloud computing effort, called Azure. Fitting in between the current offerings from Google and Amazon, it represents a very big step toward moving applications off the desktop and out of a corporation’s own datacenters. Whether or not it will have any traction with corporate IT developers remains to be seen. Think of Microsoft as bringing more of a Wild West feel to the whole arena of cloud computing.
How to distinguish the players? If we think back to the late 1880s, Amazon provides the land grants and raw Linux and Windows acreage to build your applications upon. Google’s General Goods Store will stock and give away all the APIs that a programmer could ever use. And some of the scrappy prospectors that come to build the new towns are from Microsoft. Ballmer as Billy the Kid, anyone?
Enough of the metaphors. Mary Jo Foley’s excellent explanation of the different bits and pieces of Azure here is worth reading. But the first instance of Azure is long on vision and short on actual implementation: Microsoft calls it a “community technology preview,” what the rest of us would call an “alpha” version, given how long it takes the company to actually get things nailed down and working properly (version 3 is usually where most of us start to think of its code as solid). Granted, Google calls many of its applications beta that are in much better shape – I mean, Gmail has been in beta for what feels like 17 years now.
What are some of the issues to consider before jumping on the Microsoft train? First, consider what your .Net skill set is and whether your programmers are going to be using JScript or something else for the majority of their coding work. The good news is that Azure will work with .Net apps right from the start. (Support for SOAP, REST, and AJAX will be coming, they promise.)
Microsoft spoke about testing and debugging these apps on your local desktop, just as you do now, and then deploying them in its cloud. The bad news is that you probably have to rewrite a good portion of these apps so that the user interface and data extraction logic can work across low-bandwidth connections.
CTO Ray Ozzie, in a CNET interview, said that “fundamentally the application pattern does have to change. Most applications will not run that way out of the box [on Azure].” While good programming practice today is to separate Web page content from formatting instructions, most programmers assume they are running everything on the same box. Remember how miserable LAN apps were back in the days of Token Ring? We have 10 Gigabit Ethernet now, and people have gotten sloppy.
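To make Ozzie’s point concrete, here is a minimal sketch in Python (the function names, database schema, and URL are all invented for illustration) of the kind of rewrite involved. The first version assumes the data and the page live on the same box; the second pulls a compact JSON payload from a data service over HTTP, the way a cloud-hosted front end would have to:

```python
import json
import sqlite3
import urllib.request

# The same-box assumption: the page handler reaches straight into
# a local database file. Fine on a LAN, useless once the front end
# runs in someone else's data center.
def render_orders_local(customer_id):
    conn = sqlite3.connect("orders.db")  # local file, local box
    rows = conn.execute(
        "SELECT id, total FROM orders WHERE customer = ?",
        (customer_id,)).fetchall()
    conn.close()
    return "".join("<li>#%d: $%.2f</li>" % (i, t) for i, t in rows)

# The cloud-friendly version: the data lives behind an HTTP service
# (the URL here is hypothetical), and the front end fetches one
# compact JSON payload over a possibly slow, metered link.
def render_orders_remote(customer_id):
    url = "https://data.example.com/orders?customer=%d" % customer_id
    with urllib.request.urlopen(url) as resp:
        orders = json.loads(resp.read())
    return "".join("<li>#%d: $%.2f</li>" % (o["id"], o["total"])
                   for o in orders)
```

The extra code is trivial; the discipline isn’t. Every trip to the data now costs real latency and bandwidth, so the boundary between presentation and data has to be designed rather than assumed.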
This is no small issue, and my prediction is that most apps will need some major surgery before they can be cloud-worthy. One wag has already placed his bets: Stephen Arnold writes in his blog Beyond Search, “I remain baffled about SharePoint search running from Azure. In my experience, performance is a challenge when SharePoint runs locally, has resources, and has been tuned to the content. A generic SharePoint running from the cloud seems like an invitation to speeds similar to my Hayes 9600 baud dial up modem.” For those of you who are too young to remember, that means verrrrry slow.
While you are boning up on .Net, you might also want to get more familiar with Windows Server and SQL Server 2008, because many of the same technologies will be used for Azure. One thing that won’t be in Azure, apparently, is Hyper-V; we get another hypervisor to run the Azure VMs. Too bad, I was just getting comfortable with Hyper-V myself. Nobody said this was gonna be easy.
Speaking of servers, Microsoft is in the midst of a major land grab of its own, building out data centers in multiple cities and beefing up the ones it already has.
More good news is that Microsoft plans on using Azure to run its own hosted applications, and is in the middle of moving them over to the platform (so far, only Live Mesh is there today; here is an explanation of what it does, for those who are interested). Right now, all Azure apps will run in its Quincy, Wash. data center, 150 miles east of Redmond, but you can bet this will change as more people use the service. At least Microsoft tells you this; Amazon treats the number and location of its S3 and EC2 data centers as a state secret.
Of course, the big attraction for cloud computing is scalability, and Ozzie, in the same CNET interview, had this to say about it: “Every company right now runs their own Web site and they’re always afraid of what might happen if it becomes too popular. This gives kind of an overdraft protection for the people who run their Web sites.” I like that. But given the number of outages experienced by Amazon and Google over the past year, what happens when we have bank failures in Dodge?
Second, where do you want to homestead your apps? Just because you want to make use of the Microsoft services doesn’t mean your apps have to reside on Microsoft’s servers. If you are happy with all the Google goods, stay with that. If you like someone else’s scripting language or tool set, likewise.
What Microsoft is trying to do is manage the entire cloud development lifecycle, similar to how it already manages local app development with Visual Studio and the .Net tools. Yes, Amazon will let you build whatever virtual machine you wish in your little piece of cloud real estate, but Microsoft will try to work out the provisioning and set up the various services ahead of time.
Next, the real trouble with all of this cloud computing is how any cloud-based app is going to play with your existing enterprise data structures, which aren’t in nice SQL databases and may even be scattered around the landscape in a bunch of spreadsheets or text files. Snaplogic (and Microsoft’s own Popfly) has tools to mix and mash up the data, but figuring out the provenance of your data is not taught in traditional CompSci classes and is largely unheard of around most IT shops, too. Do you need a DBA for your cloud assets? It is 10 pm: do you know what your widgets are doing with your data?
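As a small illustration of what that integration work looks like (a sketch only: the file name, column names, and table are all made up), even getting one stray departmental spreadsheet into a real SQL table means glue code like this, and note that nothing in it records where the numbers came from:

```python
import csv
import sqlite3

# Load a stray departmental spreadsheet (exported to CSV) into a
# SQL table that a cloud-based app could actually query.
conn = sqlite3.connect("enterprise.db")
conn.execute("""CREATE TABLE IF NOT EXISTS widgets
                (sku TEXT PRIMARY KEY, qty INTEGER, owner TEXT)""")

with open("widget_inventory.csv", newline="") as f:
    for row in csv.DictReader(f):
        # No provenance here: nothing says who last touched this
        # spreadsheet or when, which is exactly the gap noted above.
        conn.execute("INSERT OR REPLACE INTO widgets VALUES (?, ?, ?)",
                     (row["SKU"], int(row["Quantity"]), row["Owner"]))

conn.commit()
conn.close()
```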
Next, pricing isn’t set, although for the time being Azure is free for all. If we look at what Amazon charges for the kind of real estate that Microsoft is offering (2,000 VM hours, 50 GB of storage, and 600 GB/month of bandwidth), that works out to about $400 a month. Think about that for a moment: $400 a month can buy you a pretty high-end dedicated Linux or Windows server from any of a number of hosting providers, and then you don’t have to worry about bandwidth and other charges. And there are many others, like Slicehost here in St. Louis, who can sell you a VM in their data center for a lot less, too.
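Here is the back-of-the-envelope arithmetic behind that estimate, as a sketch. The per-unit rates are my assumptions drawn from Amazon’s published pricing of the moment (small-instance hours, first-tier storage and bandwidth), so the exact total will shift with instance sizes and tiers, but it lands in the same few-hundred-dollars-a-month ballpark:

```python
# Pricing Microsoft's Azure preview allotment at Amazon-style rates.
# The rates below are assumptions based on Amazon's circa-2008
# price list, not official figures from either company.
VM_HOURS = 2000        # VM hours per month
STORAGE_GB = 50        # GB stored per month
BANDWIDTH_GB = 600     # GB transferred out per month

RATE_VM = 0.10         # $ per VM-hour (EC2 small instance)
RATE_STORAGE = 0.15    # $ per GB-month (S3 first tier)
RATE_BW = 0.17         # $ per GB out (first tier)

total = (VM_HOURS * RATE_VM +
         STORAGE_GB * RATE_STORAGE +
         BANDWIDTH_GB * RATE_BW)
print("roughly $%.0f a month" % total)  # prints: roughly $310 a month
```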
However, Amazon’s S3 storage repository is amazingly cheap, and getting cheaper as of this week: in fact, they are now charging less per GB the more you store with them. Microsoft should set tiered, fixed monthly pricing and make the storage component free. I am thinking a basic account should be free, and then $100 a month would be just about right for the next level. Look at how Office Live Small Business sells its hosting services for an indication of what to expect.
Finally, take a moment to do a browser inventory as best you can. You’ll find that you are supporting way too many different versions from different vendors, and getting people to switch just because the IT sheriff says so is probably impossible. If you are going to enter the brave new world of cloud computing, this is yet another indication of where the Wild West will begin for you.
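One cheap way to start that inventory, assuming your intranet server writes a standard combined-format access log (a sketch; the log path is hypothetical), is to tally the User-Agent strings it sees:

```python
import re
from collections import Counter

# Count User-Agent strings in an Apache combined-format access log;
# the last quoted field on each line is the user agent.
ua_pattern = re.compile(r'"[^"]*" \d{3} \S+ "[^"]*" "([^"]*)"')

counts = Counter()
with open("/var/log/httpd/access_log") as log:  # hypothetical path
    for line in log:
        match = ua_pattern.search(line)
        if match:
            counts[match.group(1)] += 1

# The top of this list is your de facto browser support matrix.
for agent, hits in counts.most_common(15):
    print("%6d  %s" % (hits, agent))
```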
The new real-time researcher
For those of you who live in America, I hope you got some sleep last night after you voted. But apart from the obvious result, our election brought some other radical changes in how we consume information, and I wanted to share some thoughts. It was a historic night for these reasons, too.
The Internets have transformed those of us who are information junkies into our own real-time researchers, trend spotters, and fact checkers. A combination of better search analytics, new technologies such as Twitter and live feeds, and even the relatively innocuous Facebook “I’ve voted” counter have put some very powerful tools in the hands of ordinary citizens.
It also helps that we had three relative rookies for our election-night network news anchors: while their talents (and take-home pay) are considerable, many of us haven’t had the relationship with Katie, Brian, and Charlie that we once had with Tom, Peter and Dan — or even Walter, Chet and David. Part of this is the waning influence of network TV, part of it is the movement away from newspapers and newsmagazines. (US News and World Report going monthly? Who would have thought?)
But the real reason has to do with the fact that the technologies enabling our own exploration of the world have become easier to use, more powerful, and within our grasp, even if we don’t have exceptional search skills. Let’s examine each of the technologies that have contributed to this state of affairs.
First is the ability to keep on top of what people are Googling. In a post last month on ReadWriteWeb.com, Marshall Kirkpatrick writes about how listeners were doing their own fact checking by Googling certain terms during the VP debate. By looking at the aggregated searches, we can see that many people learned exactly which article of the Constitution covers the powers of the Vice President, and that Biden got it wrong.
From the Google stats, we also see that Tina Fey has become a political personality, and indeed even more popular among searchers than Mr. Biden himself. Having watched many of the SNL skits, I found myself getting confused over who was the real Palin, and indeed didn’t do well on the Chicago Tribune’s photo quiz to distinguish the two women.
Speaking of those SNL skits, how many of you first watched them online versus on your living room TV? What was curious for me was that I first went to YouTube to find the videos, only to realize that NBC.com was posting them for the next-day audience on its own site. About time they figured that out. How long did it take you to realize it as well? Maybe the networks finally understand the word-of-mouth, day-after effects and can capture some of those page views for themselves. Do we really need an HDTV picture to see these videos when a postage-stamp 320×240 portion of a Web browser can be just as satisfying?
In past elections, I was online most of the night, looking at the major network news Web sites and tracking the exit polls and ballots. Last night I still did this, but there were other sites, such as Mahalo.com, that aggregated historical information so I could put things into context and see how voting patterns from previous elections compared with this year’s. However deep I wanted to dive, I could easily find what I needed with a few mouse clicks. It made the broadcast blather from the major networks even more irrelevant to me.
Having waited in line for about 70 minutes to vote yesterday morning, I was curious to see how many people voted early in the day, so I watched Facebook’s real-time vote counter, which passed a million votes early in the day and topped out somewhere around 4 million by the time the polls closed in the West. Granted, this counter was more of a stunt than an analytical tool, but it gave me a very real indication that yes, we as Americans (or at least those Americans who are active Facebook users and registered to vote) were voting – at a rate of several hundred every second, all day long.
What about Twitter and other live feeds? We could actually follow “reports” from various self-styled correspondents about what they found during the day. The local St. Louis paper hired a few students to do just that, and you could read their Tweets here.
One of the students, Ian Darnell, summed it up this way: “This is it. This is our time. This is how history has unfolded before us.” He could have said historicity, which was one of the hot search terms for yesterday.
More Gmail Contacts annoyances
So I have either 6556 or 9788 contacts in my Gmail contacts address book. Why the difference? It has to do with the way the contacts are displayed, I kid you not.
I have tried to get an answer from Google. They sent me to this blog post, which unfortunately doesn’t really explain what I am observing. Yes, the new UI will no longer populate your “suggested contacts” list with people you reply to.
Google updated its Gmail interface substantially last year, and the new UI just doesn’t work for me: it takes forever to load, it crashes my browser frequently, and it has a big bug that for the life of me I don’t understand why it isn’t fixed. If you have more than 20 or so groups of contacts (I have about 50, and I use this feature a lot), you can only display the first 20 in the new UI. Google could easily fix this by adding a scroll bar to the groups listing when you want to add someone to a particular group; right now, you can’t scroll down past the first 20 group names.
Here are screenshots showing you before (below) and after the new UI is turned on (at left), and the relevant numbers of contacts displayed.
I asked Google to explain this. Still waiting for a response.