More than 20 years ago, the Web was just getting started. People were experimenting with all kinds of web servers, both as publishing mechanisms and as user interfaces for various devices. Back then, I thought this was a neat idea: a web interface was a great way to demonstrate a product across the Internet, unify the user experience across different browsers and end-user platforms without having to develop separate programs for each, and perhaps simplify end-user training too. It was a brave new world.
Back then, there were some dissenting voices. Having more Web UIs would “set computer programming back 30 years and is about the worst technology I’ve laid eyes on,” said one UI consultant I interviewed at the time. Another pointed out that the Windows graphical interface (which was just taking off back then) was far superior to anything the Web could produce in terms of interactive controls. That distinction has largely disappeared over the decades. And having the cloud handle various tasks (think calendar sync or database queries) makes the Web UI superior to a local Windows app under certain circumstances.
I wrote about these issues for Computerworld in the summer of 1996. Back then, Netscape (remember them?) and Microsoft were duking it out over whose HTML extensions would become more popular (we know how that fight went down). At the time, I said, “having all software go to the Web UI might hasten to have an all-Windows world: since multi-platform apps can be supported by web servers, developers have moved away from Everything Else and concentrated on Everything Windows.” I don’t think that has come true, and let’s not forget smartphone apps, which have their own distinct interfaces and their own screen real estate limitations.
I asked my favorite UX consultant, Danielle Cooley, what she thought about my comments from 1996. “Things have changed dramatically, of course, both on the technology side and the design side,” she told me in a recent email. “Speaking as the user advocate, I would say consumers’ standards are much higher across the board than they were 21 years ago. Thanks to the user-centered approach taken by large organizations like Amazon, Apple, and Google, laypeople have less patience for digital products that force them to contort their thinking and behavior. Now, they have more and more access to tools that fit the way they already think and behave. Many organizations still suffer from serious UX immaturity. Lack of investment and integration here has resulted in the confusing and frustrating interfaces we’ve all come to hate. The fact that there are still SO MANY of these, 21 years after your Computerworld article, is telling and alarming.”
But the Web UI is here to stay, one way or another. At least we now have responsive design, so smaller and larger screens automatically get pages laid out appropriately. And hopefully, developers will finally learn what makes for a better UI experience.
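If you're curious how that works under the hood, responsive design boils down to letting the page query the display it's running on. Here is a minimal sketch from the scripting side, using the browser's standard matchMedia API; the 600px breakpoint and the "compact-layout" CSS class are hypothetical examples, and plain CSS media queries achieve the same thing for styling alone:

    // A minimal sketch of responsive design from the scripting side (TypeScript).
    // The 600px breakpoint and the "compact-layout" class are made-up examples;
    // pure CSS media queries do the same job when only styling needs to change.
    const smallScreen: MediaQueryList = window.matchMedia("(max-width: 600px)");

    function applyLayout(isSmall: boolean): void {
      // Toggle a class that the stylesheet can key off of.
      document.body.classList.toggle("compact-layout", isSmall);
    }

    applyLayout(smallScreen.matches);                       // set the initial layout
    smallScreen.addEventListener("change", (e) => applyLayout(e.matches));

The same idea, expressed entirely in CSS, is what lets one page serve a phone and a desktop monitor without separate versions.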
Haven’t sent you a comment/message in a long time. This just takes me back to 1992, when I started the university’s web site. Most people had no idea what I was talking about. Those who had an inkling of what the WWW was thought I was way off. Getting content took phone calls and personal meetings. Today we get emails and calls asking us to fix anything that people can get to in their browsers, even if it is not on the official web site that we manage.
Dale