The day the (analog) music died: five mega-trends

Many of us of a certain age remember the “day the music died,” when the plane carrying Buddy Holly and the Big Bopper crashed. Or if not the actual day, we at least get the reference, most notably chronicled in Don McLean’s song “American Pie.” But there is another day that is harder to pin down: the day digital music finally took over and we never looked back at CDs, cassettes, 8-track tapes, and vinyl. I put that date somewhere around 1999-2000, depending on how old you were and how much analog music you had collected by then.

It has certainly been an amazing period when you recall what music was like back in the day, before all the iThings came along. For the most part it has been a mixed bag. Here are five mega-trends to consider.

1. Music is more mobile. Back then we had separate rooms of our homes where we could listen to music, and only in those rooms. The notion of carrying most of your music collection around in your pocket was about as absurd as Maxwell Smart’s shoe-phone. We had separate radio stations with different music formats that helped with discovering new music, and we would carry recordings to our friends’ homes to play on their expensive stereos. Stereos were so named because they had two speakers the size of major pieces of furniture.

Just about everything in that paragraph has changed in 20 years. Having only two speakers to listen to your music now seems limiting, and you can buy a multichannel system for a couple hundred bucks these days. My iTunes music library has more music than I can listen to continuously for two weeks and close to 30 GB of files, and I am sure yours is equally vast. Songs that I ripped to digital format are still intact 10 years later (I know I have some of the CDs around here someplace), and I can listen to music whenever and wherever I want.

2. The whole music discovery process has also been transformed. You can listen to any of thousands of tracks before you buy them in the major digital music stores. And then there are sites such as NoiseTrade, which offers thousands of tracks free for the downloading (tipping is suggested but not required): it has become one of my favorite places to find new artists. Playlists make it easy to set up groups of tracks for every activity, something that once required being a love-struck teenager willing to spend the effort on a mix tape, or being a DJ at your college radio station.

3. Sharing is caring. At the beginning of this digital music transformation was Napster. It was the undoing of the music industry, making it easy for anyone to share digital copies of thousands of songs across the Internet. While it was the most infamous service, there were dozens of other products, some legit and some fairly shadowy, which I describe in a story I wrote back in November 2001 that shows some of the interfaces of these forgotten programs.

Certainly, the notion of peer-to-peer file sharing has never recovered its bad-boy cred from then. Napster had a lot of lessons for us back at the turn of the century, some good and some bad. Sadly, of the several suggestions I made about how we could learn to build better networks, none panned out. One of my high school students back then had a modest proposal for the music industry: to “create their own network of P2P servers and charge a nominal monthly fee.” That didn’t happen either.

4. Movie studios haven’t learned much from the digital music era. Now the movie industry appears to be headed down a similar path, albeit with bigger criminal penalties for their customers who want to share their digital copies. But we’ll leave that for another discussion, since I want to stick to music.

The early days of digital music weren’t easy for anyone, unless you were a teenager then and didn’t mind stealing your songs. I wrote extensively about the processes of ripping and cataloging your tunes for a couple of Web Informant essays. My former student and most of his generation didn’t view P2P file sharing as a criminal act; instead, “it is a new way of dealing with an outdated corporate power structure.”

Since the rise and fall of the peer-sharing services, Amazon, Apple, and to some extent a few other digital music storefronts have taken hold for selling tracks and albums. I used to buy most of my music from Amazon: they were the first to eliminate digital rights management (aka copy protection) from their tracks. But they have since tricked out their player and their purchase process, and it is just easier to use the iTunes Store now anyway.

5. Streaming is taking hold. But owning your music is going the way of the dodo too: several streaming services have taken hold, and it is only a matter of time before their user interfaces and bandwidth requirements get perfected to the point where it is easy to listen to anything at any time. There are numerous subscription streaming services, and they are getting better in terms of song availability and software quality.

The digital era has been a mixed blessing for artists as well: most don’t make much money from selling their songs outright to consumers, since their cut from the digital music stores is minimal. This has driven many of them to hike their performance fees. While digital technologies have made it easier to sell music to the public, it has become more of a “long tail” kind of business, with just a few mega-groups that can actually support themselves on song revenues.

Would I turn back the clock to analog music? Nope. It has been a great 15 years, and I don’t mourn its death. But it has been a fascinating time to see how things have evolved.

The trials and tribulations of eCommerce: a look back

I have been a keen observer and sometime participant in the eCommerce field since its very early days back in the late 1990s. Then the websites were wacky, the software shaky, and the tools touchy and troublesome. But somehow we managed to buy stuff online, and Amazon and others have been raking in the dough ever since.

In the beginning, IBM had its own NT-based eCommerce product that I reviewed back in 1999 for Windows Sources magazine. These suites of products required a lot of custom configuration and really weren’t very good. Since then, IBM has built quite a business around WebSphere and other tools. Another article that I wrote back in 1999, about evaluating payment systems for eCommerce, described the sad state of affairs back then.

In those early days, I had fun assignments like trying to figure out how long it took the staff of an online storefront to respond to my email queries. That seems fairly obvious now, yet there are still storefronts that don’t respond quickly enough to their potential customers.

But one area where we have come the furthest has been online payments. A good example is the Apple Pay announcements last month. As the NY Times points out, even though nary a dollar has been spent with this new system, vendors are jumping on board Just Because It Is Apple. Even eBay has gotten worried enough that it is in the process of spinning off PayPal, something it resisted for years. Here is my analysis of Apple Pay, published on Ricoh’s blog.

If you are looking for some historical context of how payments have evolved, check out the following pieces that I wrote over the years:

From that last piece, I wrote:

Imagine how hard life with physical wallets would be if they acted like e-wallets. You would have to carry several different kinds of wallets around with you, since each store would accept different payment systems. You couldn’t convert your dollars from one system to another without a great deal of work. And if you lost your wallet, you would be out of luck.

Today we have a lot of payment choices, including a little-known service from MasterCard called Simplify, a web payment gateway that offers 2% rates (but only through software; there is no card reader yet). We’ll see if my predictions come true once again.

A short history of wireless messaging

As part of my tripping down memory lane and reading my archive, I naturally came across the dozens of articles that I have written over my career on wireless messaging. It made me think about how the industry has evolved so quickly that many of us don’t even give this technology a second thought — we just expect it to be part of our communications package.

Now our smartphones have multiple messaging apps: email, SMS, WhatsApp, Skype, AOL IM, and Apple’s iMessage, just to name a few that are on my phone. We flip back and forth from one to the other easily. When you add in the social networks’ messaging features, there are tons more.

My first brush with wireless messaging was when Bill Frezza stopped by my office and gave me one of his first prototypes of what would eventually turn into the BlackBerry. It was called the Viking Express, and it weighed two pounds and was a clumsy collection of spare parts: a wireless modem, a small HP palmtop computer running DOS, and a nice leather portfolio to carry the whole thing around in.

The HP ran software from Radiomail Corp. The company was one of the first to understand how to push email to wireless devices. Its innovations were never patented, owing to the philosophy of its founder, Geoff Goodfellow. Ironically, Research in Motion, the company behind the BlackBerry, went on to become one of the more litigious computer vendors, and it eventually had to pay $615 million to obtain rights to the patents covering its device.

Now wireless has gotten so fast that it can outpace wired connections. Cisco’s latest networking report predicts that the Internet will carry more wireless than wired packets within a few years. And we have come full circle: with the new desktop Macs, Apple has gone a bit retro on us. Now you can access iMessage from your desktop, which is great for those of us who want to type on regular keyboards instead of with our thumbs, or use Siri to compose messages.

So for those of you who don’t recall where we have been, I have posted on my blog the original articles that I wrote about some of the early wireless messaging apps, for some perspective. Here are some links to them.

Review of smart pagers for Computerworld (1998)

Back then, BlackBerries weren’t available and Motorola ruled the roost. Pagers were in transition from simple one-way devices that would just display numerals to more interactive messaging devices. However, the newer ones were pretty unsatisfying: the one-way pagers worked because they were tiny, their batteries lasted forever, and they could be used by anyone, including my nine-year-old. The smarter devices were harder to use, they ate batteries for lunch, and they didn’t always work without some specialized knowledge.

Evaluating wireless web technologies (2000):

Back then, I gave up my laptop and tried to use just a smart(er) phone and borrowed PCs when I traveled. There were some early apps back then that could actually work.

Supporting PDAs and wireless devices on your corporate network (2001):

The first messaging device to gain traction was the Palm Pilot, but we had the Pocket PC too. This article was a piece of custom content for CDW that reviewed all the options available back then. For cellular data, you needed add-in radio cards.

The joys of wireless messaging (2003)

How about AOL’s IM running on a Palm i705? That was a pretty slick device, as this article attests.

Don’t buy a Treo 700w (2006):

Remember the Treo? I still have one somewhere in my closet. They were a combination of a phone and a Palm Pilot. Back then I wrote: “the Treo isn’t as cool as the Sidekick, doesn’t do iTunes like the Rokr, and isn’t as addicting as a CrackBerry.”

Sidebar conversations are here to stay (2009):

It isn’t just texting while driving, but texting while doing something else that is at issue. By 2009, the notion of having a side conversation using a wireless device was very common.

What is your favorite wireless messaging device or app from the past?



A look back with Web Informant (1996): Lessons Learned From Web Publishing

Nearly 19 years ago, I began writing a weekly column called Web Informant that was first distributed exclusively via email, then via various other technologies including a blog, push technology, and syndication to a Japanese print newspaper. It has been a wonderful journey, and it is hard to believe that it has lasted all this time. First, I want to thank all of you readers who have stuck with me and sent me comments and encouragement over the years.

Over the next year and leading up to the big 20th birthday celebration, I thought I would resurrect a few of my favorite stories and see how well they have held up over time. This first piece was published by John December in a journal called Computer Mediated Communications back in May of 1996. My current commentary is in brackets so you can distinguish between the original me and the current me.

After writing and editing print publications, I threw caution to the winds last fall and put up my own website. I’m glad I did and have learned a few lessons along the way that I’d like to share with you. Here goes.

  1. Print still matters: it has the vast majority of advertising and is where the attention in our industry still lies. The industry still defines itself by, and pays attention to, what the trade publications print. [Back in 1996, I mentioned one story that the online press did a better job than print in covering; that is still true today.]
  2. You may think otherwise, but the best way to get the word out about your site is for others to provide links on their Web sites back to yours, what I call inbound links. [With all the SEO expertise out there, this is still true today.]
  3. It is a good idea to review your access logs regularly to determine frequently-accessed pages, broken links, who is visiting, and when you have your peak periods. These logs are your best sources for measurements of success and a good way to figure out who your audience is.
  4. Community counts. If you are going to start a successful Web publishing venture, make sure you have a good idea of who your community is. By community I don’t just mean reader/viewers–I mean the entire life-cycle of information consumers, providers, and relay points along the way. Who creates the information? Who sends/interprets/messes it up? Who needs this information? The better you know this cycle, the better a Web publisher you’ll be. The more focused your publication, the better off you are.
  5. Just like running a “real” print magazine, you need to develop a production system and stick to it, and resist any temptations to fiddle with it. Online, the best feedback loop you have is when your reader/viewers drop you a note on email saying something doesn’t look right or a link is broken.
  6. Don’t get too enamored with the graphical look and feel of your publication: many reader/viewers will never see these efforts and they ultimately don’t matter as much as you think. While you are developing your production systems, don’t forget that many reader/viewers are running text-based browsers or turn their images off because they are coming in from dial-up connections. [Well, that has changed since 1996, but still lots of sites are filled with useless graphical junk and pop-ups that are annoying at any bandwidth.]
  7. The best Web publications make use of email as an effective marketing tool for the Web content, notifying reader/viewers when something is new on a regular basis. [This was in the days before blogs, RSS, social media, Twitter, and other notification mechanisms, all of which are great tools to complement the web.]
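Lesson 3 above (mining your access logs for popular pages and broken links) is still easy to act on. As a minimal sketch, and assuming logs in the common Apache/NGINX "combined" format, a few lines of Python can tally top pages and 404s; the `summarize` helper and the sample entries here are hypothetical illustrations, not from the original column:

```python
import re
from collections import Counter

# Combined Log Format: host ident user [time] "request" status size "referer" "agent"
LOG_RE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3}) \S+')

def summarize(lines):
    """Tally page hits and 404 responses from access-log lines."""
    hits, missing = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed or unrecognized entries
        path, status = m.group(1), m.group(2)
        hits[path] += 1
        if status == "404":
            missing[path] += 1
    return hits, missing

# Hypothetical sample entries, stand-ins for a real log file:
sample = [
    '203.0.113.9 - - [02/Mar/1995:13:11:00 -0500] "GET /index.html HTTP/1.0" 200 1043 "-" "Mosaic"',
    '203.0.113.9 - - [02/Mar/1995:13:11:05 -0500] "GET /old-link.html HTTP/1.0" 404 0 "-" "Mosaic"',
    '198.51.100.4 - - [02/Mar/1995:13:12:00 -0500] "GET /index.html HTTP/1.0" 200 1043 "-" "Lynx"',
]
hits, missing = summarize(sample)
print(hits.most_common(1))  # → [('/index.html', 2)]
print(list(missing))        # → ['/old-link.html']
```

The 404 tally points at broken inbound or internal links, and the hit counter shows where your peak interest lies, which is exactly the kind of audience measurement the original lesson recommended.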

Overall, am I glad I am in the Web-publishing business? Yes, most definitely: it has given me a greater feel for my community, it has helped increase my understanding of the technologies involved, and I have had a great deal of fun too.

Has it been easy? Nope: Web technologies are changing so fast sometimes you can’t keep up no matter how hard you try. Setting up a Web publication will take more time and energy than you’ve planned, and keeping it fresh and alive is almost a daily responsibility. You need lots of skills: programming, publishing, library science, graphic design, and on top of this a good dose of understanding the nature and structure and culture of the Internet helps too. And a sense of humor and a thick skin come in handy from time to time too.

IBM and Akers, then and now

With the passing of John Akers last week, I was reminded how very different the IBM of today is from the company that I began covering when I first entered tech journalism back in the mid-1980s. Back then, IBM was a hardware powerhouse: key innovations in chip design, the first hard disk drives, PC networking, and more all came out of IBM’s research labs. Back in the 1980s, IBMers were frequent Nobel prize winners. They also wore white shirts and dark suits with rep ties to work. How many of you even know what a rep tie is, let alone seen one lately on anyone in your IT departments?

Almost none of that effort remains in the IBM of today. Hardly anyone gives token ring networks a second thought: Ethernet and its descendants won that battle long ago. Indeed, IBM got out of the PC business years ago when it sold off those assets to Lenovo. The chip designs that IBM invented are now found in almost every other vendor’s products, including Intel’s. IBM mainframes now run Linux, in addition to the languages and programs that I was familiar with back in the 1980s.

Today’s IBM is all about software. Its WebSphere and SoftLayer groups are growth areas for the company: both came out of acquisitions mixed with lots of internal development.

Akers presided over IBM at the peak of its headcount: some 400,000 people worked for him at one point, a size that proved to be a major issue and unsustainable. By the time he was forced into retirement in 1993, about a quarter of that workforce was gone, and thus began the great transition into a software company.

Despite those layoffs, and despite running IBM during some turbulent times, Akers was the head of IBM when it was leading the PC revolution in corporate America. While he wasn’t in charge when the IBM PC was introduced back in 1981, he did oversee the expansion of the PC’s first decade. At Transamerica Life Insurance, where I worked in the end-user support department, those were heady times as we bought thousands of PCs for our knowledge workers. While IBM didn’t have the most innovative PCs, it had solid brand awareness that made corporations initially comfortable with purchasing them.

IBM had some spectacular failures in the PC business back then too: the PCjr, the PS/2 with its proprietary hardware bus, the aforementioned token ring networks, and a misguided attempt to unify PCs with the first digital phone systems. But its presence in the PC business ultimately led to the success of Microsoft, Intel, Apple, Sun, Oracle and thousands of other companies that make up the industry today. The IBM PC was revolutionary at the time, not for what it had inside, but for what it didn’t have: any proprietary IBM technology. It was the first piece of hardware from IBM that could be built by anyone out of common parts, and many of its competitors did exactly that. Had IBM come out with its PS/2 or some other proprietary system in 1981, the PC industry would have had to come about differently. Now look at how cheaply you can buy a Raspberry Pi.

Akers’ missteps had another important legacy: he began IBM’s transition to a software company. Since those early days, IBM has made dozens of acquisitions including Cognos, Lotus, Tivoli, Rational, FileNet, Internet Security Systems, SPSS, CastIron, and BigFix — many of these were billion-dollar companies that are probably not familiar to you now. Software is now about a third of IBM’s overall revenues, and its hardware business continues to decline.

But while Akers ultimately couldn’t change IBM by himself, he did a great job mentoring many people who did, including his eventual successor Sam Palmisano and Steve Mills, who runs its software group. And now IBM’s head is Ginni Rometty, its first female CEO. I don’t think she wears a rep tie either.

The Mac and Me: Remembering Quark and AppleTalk, Netware and Gopher

The Apple Mac has played an important part in my professional journalism career for at least 20 of the years that I have been a writer. One Mac or another has been my main writing machine since 1990: in daily use, traveling around the world several times, and serving as my more-or-less constant work companion. It is a tool, not a religion, yet I have been quite fond of the various machines that I have used.

You can read more of my reflections on using a Mac for nearly 30 years over at Network World where they have put together a nice package of articles commemorating the event.

Remembering Ed Iacobucci

Another great tech manager left our ranks this week: Ed Iacobucci, who lost a 16-month battle with pancreatic cancer. I last saw him two years ago when I was transiting Miami, and he was good enough to meet me at the airport on a weekend to brief me on his latest desktop virtualization venture, Virtual Works. That is the kind of guy he was: coming out to the airport for a quick press meeting on the weekend. There aren’t too many folks who would do that, and it shows the mutual respect we had for each other.

Ed was one of the originals of the PC industry. By that I mean that many of his ideas turned into products that we are still using today, or into companies that have gone on to become giants. He worked for many years as part of the IBM PC brain trust, first in their mainframe communications area and later as one of the leads for the misguided OS/2 operating system. Both were big interests of mine, and I followed his career ever since.

You have to realize what a study in contrasts working for the PC division of IBM was back in the day. You had all these upstarts (such as Apple, Kaypro, Columbia, Osborne, and the like) that were building machines to run DOS. These companies were for the most part populated by people in their 30s. Not at IBM: you had older folks who had come up through the ranks in the mainframe world and who were taking things in a new direction for IBM: using commodity parts that could be assembled quickly for very low cost. Ed was part of that revolutionary guard at IBM. Now IBM doesn’t even make PCs anymore.

You also have to realize what things were like in the early PC days for the trade press. Aside from the fact that our publications used dead trees instead of electrons, we had tremendous access to the guiding lights of the industry. We could call up anyone and get anything. We would fly somewhere at a moment’s notice to meet someone or attend a briefing to see a new product.

Back in the early PC era, I just loved people like Ed: smart, articulate, open, funny, and did I mention smart? Tech reporters soaked up information about their products, their worldview, their “vision” (although that term is overused now). We could always count on the likes of Ed to ’splain somethin’ and give us a pithy quote that actually shed some light on a tricky tech topic. I have forgotten more about operating system design that I learned from Ed than most reporters even know today.

When OS/2 was still a project that combined the best and brightest of IBM and Microsoft, I was writing my first book with Mike Edelhart, who was my mentor and editor at PC Week (now eWeek). The book, like the operating system, went through several revisions as we waited for it to take off and become the corporate standard. Sadly for us (and them), that never happened and the book was never published. Mike and I did have some cool and memorable experiences: holing up at a hotel on Coronado Island to finish the first draft, scheduling a press briefing in Austin where 60 IBMers came to brief a few PC Week reporters on the secrets and inner workings of OS/2, and getting to meet the cast of Star Trek: The Next Generation at another press briefing (as one version of OS/2 was called Warp).

Ed left IBM in 1989 to found Citrix, which remained a very small company for several years until it became the software behemoth it is today. That began his next career in virtualization, something he was still working on at his death.

After Citrix he left the tech field temporarily to found DayJet, a time-sharing company for business aircraft. Just like his other startups, he was way ahead of his time: now there are many jet-sharing companies around. I always regret that I didn’t get in touch with him during that era and get a chance to ride on one of his jets (a guy can dream, right?).

In the release announcing his death, he is quoted as saying “Every human being has his own vision of what’s happening in the future. I was lucky in that what I thought would happen did happen. When we know we can do it and the rest of the world doesn’t – that’s when things get interesting.” It sure does. It was an honor to know him.

So long Ed, and thanks for the wonderful memories and terrific times and great products over the years.

How to turn off your mainframe, c.1995

So you want to get rid of your mainframe?

In the spring of 1995, I had an opportunity to witness a historic occasion: the removal of a firm’s IBM mainframe computer. I wrote a couple of columns for Infoworld back then and was recently reminded of the moment. Here is what I wrote:

The notion of turning off your mainframe computers is somewhere between romantic and impossible, depending upon whom you talk to. Our esteemed editor-in-chief gave an actual date (this month) when the last mainframe at Infoworld would be shut off — that is the romance part — while I am convinced that there will always be room for mainframes in corporate America, somewhere.

The company that I visited is Edison Electric Institute, the trade association for the nation’s several hundred investor-owned electric utilities. They are located in downtown Washington DC and had — until recently — a single IBM mainframe computer running their business.

At the core of any trade association are lists: “We are constantly putting on conferences, mailing things, and updating records of people in our members’ companies,” said Jon Arnold, the IS director at Edison. Edison publishes over 200 different handbooks, conducts training seminars, and even tests potential operators of nuclear plants.

Edison had originally purchased a 4331 about 12 years ago. “At the time, we were spending hundreds of thousands of dollars with an outsourcing company — only back then we called them remote job entry sites,” said Jeff Kirstein, one of Arnold’s managers. “We had written all sorts of applications in COBOL and everything was charged back to the end user.”

Edison decided to buy their own mainframe and began to write what would end up being a core of six different applications, most of them written in SAS: a list management routine that had over 70,000 names on over a thousand different lists; a meeting registration package to keep track of hotels, courses, and other details needed for the various conferences the trade association put on each year; a financial tracking package; a package to keep track of the several hundred publications and periodicals that Edison subscribed to; and a committee membership tracking system.

SAS was flexible, to be sure. “But it was also a hog, and as a statistical package it was poor at generating business reports, but once we got the hang of it, it worked fine for our purposes,” said Kirstein.

The mainframe-based applications “got the job done,” said Jim Coughlin, a programmer analyst at Edison. “We had set up our mainframe to make it easier for each user to maintain their own lists — every mainframe logon ID was tied to a particular virtual machine and memory space. However, the mainframe systems were fairly crude: each time a user opened a list it would take time to re-sort, and there was minimal error trapping during data entry.” One plus for the mainframe was that “we didn’t have to spend a lot of time running the machine,” said Arnold. Using IBM’s VM operating system, Arnold had one system operator and an assistant to do routine maintenance and backups.

But the annual fixed costs were high, and when Edison moved four years ago into a new building in downtown DC they began to deploy LAN-based technologies. At the time of the move they began thinking about turning off their mainframe. However, they had to manage the transition slowly:

“First off, my predecessor signed a five-year lease with IBM shortly before I came on board,” said Arnold. That lease, along with the monthly software maintenance charges for the various applications and system software, ran to about a quarter of a million dollars a year. Breaking the lease early would have been costly, so Edison decided to start planning on turning the mainframe off when the lease expired in February, 1995.

Secondly, they needed to find another tool besides SAS that would run their applications on the PC. More on what they picked next week.

Third, they had to populate their desktops with PCs rather than terminals. They began to do this when they moved into their new building, first buying IBM PS/2 Model 50s. Now Edison buys Dell Pentiums exclusively. All of the desktops are networked together with two NetWare 3.12 servers and run a standard suite of office applications, including WordPerfect for word processing, GroupWise, Lotus 1-2-3, and Windows.

Fourth, they had to retrain their end-users. This was perhaps the most difficult part of the process, and is still on-going.

I’ve seen lots of training efforts, but my hat is off to Edison’s IS crew: they have the right mix of pluck, enthusiasm, and end-user motivation to make things work.

The IS staff raffles off a used Model 50 for $50 at a series of informal seminars (“You have to be present to win,” said Arnold). “We also kept a list of people who didn’t get trained, and they don’t get access to the new applications on the LAN,” said Jeanny Shu Lu, another IS manager at Edison. Finally, the applications developers in IS held a series of weekly brown-bag lunches with their end-users and solicited suggestions for changes, which would be implemented if there was a group consensus.

I turned off my first mainframe at 1:11 pm on March 2nd, 1995, in the shadow of the building where the original Declaration of Independence is kept. Somewhat fitting, but somewhat sad. I was at the offices of the Edison Electric Institute, which happens to be located across the street from the National Archives in downtown DC.

When I first spoke to Jon Arnold (the IS manager at Edison) last year about doing this, I was somewhat psyched: after all, I had been using mainframes for about 15 years and had never been in the position of actually being able to flip the big red switch (a little bigger than the ones on the back of your PCs, but not by much) off before. However, when the time came to do the deed on March 2nd, my feelings had changed somewhat: more of a mixture of bathos and regret.

IBM mainframes were perhaps responsible for my first job in trade computer publishing in 1986: back then I was deeply involved in DISOSS (now called OfficeVision and almost forgotten), 3270 gateways, and products from IBM such as Enhanced Connectivity Facility and IND$FILE file transfer.

The mainframe I was retiring was a small one: an IBM 4381 model 13. It had, at the end of its lifetime, a whopping 16 megabytes of memory and 7.5 gigabytes of disk storage — DASD to you old-line IBMers. Since this may be the last time you see this acronym, it stands for Direct Access Storage Device. It was running VM, which stands for Virtual Machine, as its operating system. And since it cost over a quarter of a million dollars annually to run, it had reached the end of its cost-effective life. It was replaced by two NetWare file servers, one of which had more RAM and disk than the 4381. The network was token ring, reflecting the IBM heritage of Edison and the utility industry it represents. Edison took its existing SAS and COBOL applications running on VM and rewrote them in Magic and Btrieve on the LAN.

Magic is an Israeli software development company that provides tools to build screens and assemble database applications that can run on a variety of back-end database servers. One of the reasons Edison began using it was that a contractor had started building applications with it. “However, they never finished the work and we had to take it over,” said Coughlin. “It is a wonderful rapid application development language, and easy to make changes.”

The combination of the two products is very solid, according to Edison’s developers and end-users. “We’ve had no data corruption issues, no performance problems at all,” said Kirstein. “We had tried other products, such as writing Xbase applications in Clipper, but that was a mess.”

The Magic-based applications took less time to develop than their SAS counterparts. For example, rewriting the meeting registration package took two months, about half the time the equivalent SAS application had required.

And the new LAN-based applications allowed Edison to improve service to their members, avoid duplicate mailings, and increase the quality of their databases. “We forced more cooperation among our end-users since now one person owns the name while another owns the address,” said Kirstein. “We’ve added lots of user-requested capabilities to our list management software, but we’ve also managed to keep it centralized,” said Arnold.

Getting rid of a mainframe, even a small one like this 4381, isn’t simple of course. It took Edison several years to migrate towards this LAN-centric computing environment and to rewrite their applications on PC platforms. And then they still had to pay several thousand dollars to have someone cart the 4381 off their premises. (Their original lease, written five years ago, stipulated that Edison would be responsible for shipping charges.)

So in the end, the machine had a negative salvage value to Edison. They managed to make a few hundred bucks, though, on a pair of 3174/3274 controllers that someone wanted to purchase. Ironic, though, that the communications hardware (which was even older technology than the 4381 itself) would prove to be worth more and outlive the actual beast itself.

Turning it off was relatively easy: I typed “SHUTDOWN” at the systems console (forgetting for a moment where the “ENTER” key sits on a 3270-style keyboard), waited a few minutes, and then hit the off switch on several components, including a tape drive and the CPU unit. Ironically, the cabinet of this mainframe was red: perhaps Edison knew long ago that they were going to replace the mainframe with a NetWare network? The hardest part was remembering the command to turn off their 3270 gateway forever, a Novell product so old that it was no longer supported or sold (why upgrade something you will eventually retire?).

What made the moment of shutdown a sad one was that I was standing amidst people who had seen the 4381 first come in the door at Edison, and who had spent a large portion of their careers in the care and feeding of the machine.
“It is going to be pretty quiet in our computer room,” said Kirstein.

After we shut down the mainframe, the Tricord running NetWare was still running. And it was a lot quieter.

Where are they now? Arnold is now Managing Director, Worldwide Utilities Industry, at Microsoft.

20 years of thanks

It has been 20 years since I set out to start my own business, and this column is both a look back and a way of saying thanks to all of you who are still reading my work.

I thought about this milestone when I had the chance this past month to meet a reader I had never met in person, an IT manager with a large non-profit organization. He was excited to finally meet me. We reminisced about one of my reviews of a now-defunct product that he ended up purchasing and using for many years. That interaction brought home the kind of influence I have had over time, and made me feel proud of the body of work I have created. So for all of you whom I have met, and those I haven’t, I just want to say thanks for your attention all these years. My work is a partnership among the readers and vendors who keep innovation alive in the world of IT. It has been a terrific run.

In 1992, I was just coming off a very successful launch of Network Computing magazine for CMP (now United Business Media). I had hired the staff, worked with our designers, and built one of the first digital content management systems that ran on desktop Apple computers. In the first year of publication we had turned a profit. The publication still exists today, and many of the folks I hired are still working journalists. Back then our PCs had clock speeds measured in megahertz and RAM and disk storage measured in megabytes; today even our phones have dual-core processors and gigabytes of storage. Speaking of cell phones, they were anything but smart: the first BlackBerry precursor was still on the drawing boards. Wifi hadn’t yet happened, and networking still meant wires; the Internet was still the province of academics and the military; and IBM still thought its minicomputer line was part of the picture for most businesses. Microsoft Windows was at version 3.1 and just beginning to catch on. Computer CD drives were still new and were read-only.

Back in 1992, many of today’s tech influencers had yet to hit their marks. Sergey and Larry were still undergrads who hadn’t yet even met each other at Stanford, let alone come up with the idea of Google. Zuck was barely out of kindergarten. Steve was still fooling around with NeXT and Pixar and hadn’t yet come back to guide Apple. Chambers was still running Cisco’s sales and not yet the entire company, and Linus’ fledgling kernel was known to just a few tech-heads.

They were certainly different times.

In those 20 years I have written thousands of magazine articles. My second book, on computer networking, came out the week after 9/11, much to my disadvantage. I have had the opportunity and honor to work with some of the most exceptional people in our industry. I have enjoyed staying in touch with many IT managers as they have grown their careers, and I continue to correspond with many entrepreneurs as they have moved from one startup to another.

Yes, those publications at the dawn of the PC era are mere shadows of themselves today: PC Week (now eWeek), Infoworld (where I wrote a weekly column in the mid-1990s), Datamation, Computerworld, and even my baby Network Computing. Print has been replaced by the Web, and tech advertising has migrated elsewhere. The Monday-afternoon stack of paper on my desk has been replaced by the initials http.

Will I still be at my computer 20 years from now? I have no idea. But I hope you continue to read what I compose each week, and I am proud to have you as my loyal readers. Thanks for a great first 20 years together. And as Natalie Merchant has written,

You’ve been so kind and generous
I don’t know why you keep on giving
For your kindness I’m in debt to you
For your selflessness–my admiration
For everything you’ve done
You know I’m bound–I’m bound to thank you for it.

Remembering Garry Betty

Another industry luminary has been taken from us. Garry Betty, the former CEO of Earthlink (he stepped down last fall because of his health) and a long-time industry veteran, died yesterday of liver cancer.

His tribute blog can be found here.

I first met Garry in the mid-1980s, when he was moving up the corporate ladder at Hayes. Back then the company was the leading modem communications vendor. Garry went on to become the CEO at DCA and did good things there.

DCA was one of those companies like Novell that incubated a lot of talented people who went on to run their own companies and have a significant influence in our industry. One of my IT colleagues went to work for him at DCA, and I had lots of ties with the company when I began my journalism career at PC Week, since I covered those products and was very familiar with them.

My favorite Garry story was when a bunch of us were flown up to Remote, Oregon for a DCA/Hayes product launch. At the time, DCA had a rather flamboyant PR manager, Bill Marks, who went on to run Atlanta Olympics PR. Bill was always coming up with gimmicks to get the trades to write about his products, and since he was launching a “remote” product line, it made sense to fly us to this rather, um, remote town. They rented jets to fly us from San Jose, and then we were bussed to this one-half-horse town in the mountains, not too far from where the Kim family got lost.

Well, the product launch went well. Garry was his usual charming self. It was actually a fun trip, because we all did some bonding on the long bus ride through the mountains. There was just one fly in this plan: it was Black Monday, the day the stock market lost more than 20% of its value in one day.

Here we all were in Remote, and this was pretty much the era before cell phones, not that you could get coverage there anyway. There was a single phone line going into the Remote General Store (which was run by a woman who had a sister named Erma, as I recall, a nice coincidence since IRMA was the name of the mainstream DCA product line). The executives were desperately trying to unload their stock positions as the market continued to tumble. Garry used to joke that that launch cost him a bunch of money personally.

One of my DCA colleagues writes this about Garry:

We used to joke at DCA about the “revolving door on the President’s office”. After a series of relatively ineffective presidents, during which much of the growth success of the company was due to strong middle management, Garry Betty hit the scene and actually made a positive difference at the CEO level. He quickly won favor among nearly everyone. He showed a lot of personal interest in employees and went out of his way to joke around with them and do a lot of little personal things that won over the hearts of many. He also spent more time with customers than his predecessors, which is important for any company that wishes to grow their customer franchise and revenue.

He also knew how to have a good time. I remember the day that he invited the product management and marketing team for a bonding day out on his big cruiser power boat, drinking beer, swimming, and sun-bathing on a gorgeous day in which we managed to throw him off the boat into the water; as was so typical of Garry, he got a laugh out of it.
Cheers, Stephen Kangas