Peter Coffee enters his next career

I had a chance to catch up with Peter Coffee, who recently ended his 18 years at Salesforce to focus on philanthropy and pro bono consulting. I first met Peter in the mid-1980s, when he was working in IT for a defense contractor and I had just left an insurance company’s IT department. Both of us were living in LA, and both of us were part of the advance guard installing PCs around our companies. I had taken a job with PC Week, writing my little corporate IT heart out, and I had just hired Peter to be part of a team of product reviewers and in-house analysts.

Back in those days, there were many different PC makers, each running a slightly different collection of hardware and operating system. MS-DOS, the Microsoft version, hadn’t yet become a standard, and there were other operating systems that have since either died (like CP/M) or morphed into major big deals (like the early Unix variants that paved the way for Linux). Peter recalls one debate that he had in person with Bill Gates in those early years, where he argued that MS-DOS might be the technically superior product, but other DOS versions put more tools in the box. Those were the days when you could buttonhole Gates in person.

Before we came to PC Week, Peter and I would examine these products and make recommendations to our corporate user base and management about which ones would become the company standard. Given that both of our companies were huge IBM customers, you might think that IBM had the PC world locked up, but this wasn’t always the case.

Peter and the rest of my team at PC Week Labs were among the first to do product reviews and write about the issues we saw in a corporate context. “We created an entire new way of breaking news by doing tech investigations and analysis. We would write short pieces that were published the following week, originating this content from our technical backgrounds,” he said, giving me credit for creating this journalistic model that has since flourished and now seems in decline. We also did numerous stunts, such as testing which network topologies were actually faster (Ethernet), explaining why early Windows was a bust (it ran on top of DOS rather than replacing it), and digging into the 386 CPU. They were heady times, to be sure. It was a model that I brought over to Network Computing magazine, which I began in the summer of 1990.

Peter reminded me that many tech pubs (including most of the overseas ones) had a pay-to-play model, where the writers would offer up glowing reviews of the major advertisers’ products. What we did was have strong opinions, plus the technical chops to back them up.

But times have changed. Now everyone is familiar with PCs, and takes them for granted. You don’t need a degree in Computer Science to be able to program, “because computer literacy is more about thinking about a problem than learning how to write code,” as Peter told me. “It is about finding the right tool to do the job, and assembling connections and anticipating the questions and problems that lie in the future. That has changed the whole notion of technical expertise into tying data sources and algorithms and understanding what the ultimate user wants to know.”

Several years ago, Peter and his wife started a non-profit foundation that will now occupy their full-time attention. The foundation focuses on funding local efforts around climate, STEM education and other causes. His goal is to bootstrap these efforts into a better position to obtain national or international support. He said, “These are problems that could exponentially bloom into major issues, but they need help when they are still small and solvable.” I wish them well.

CSOonline: 12 Attack Surface Management tools reviewed

Potential attack surface management (ASM) buyers need to understand how changes to their networks and other infrastructure happen, and how to detect and neutralize the risky ones.

Periodic scans of the network are no longer sufficient for maintaining a hardened attack surface. Continuous monitoring for new assets and configuration drift is critical to ensuring the security of corporate resources and customer data.

New assets need to be identified and incorporated into the monitoring solution, as they could be part of a brand attack or shadow IT. Configuration drift could be benign and part of a design change, but it can also be the result of human error or the early stages of an attack. Identifying these changes early allows the cybersecurity team to react appropriately and mitigate any further damage. At its core, drift detection is a diff against a known-good baseline, as the sketch below shows.
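To make that concrete, here is a minimal Python sketch of the idea: diff the latest inventory snapshot against a baseline and flag anything new or changed. The host names, ports, and fields are made up for illustration, and this is not any vendor’s actual implementation; real ASM products populate their inventories from network scans, cloud provider APIs, and certificate logs, but the core comparison looks like this.

    # A toy inventory diff: all hosts and fields here are hypothetical.
    baseline = {
        "web-01": {"open_ports": [443], "tls": "1.3"},
        "mail-01": {"open_ports": [25, 587], "tls": "1.2"},
    }
    latest_scan = {
        "web-01": {"open_ports": [443, 8080], "tls": "1.3"},  # drift: new open port
        "mail-01": {"open_ports": [25, 587], "tls": "1.2"},   # unchanged
        "dev-03": {"open_ports": [22], "tls": None},          # new asset: shadow IT?
    }

    def diff_attack_surface(baseline, latest):
        """Flag newly discovered assets and assets whose configuration drifted."""
        new_assets = {host: cfg for host, cfg in latest.items() if host not in baseline}
        drifted = {
            host: {"was": baseline[host], "now": cfg}
            for host, cfg in latest.items()
            if host in baseline and cfg != baseline[host]
        }
        return new_assets, drifted

    new_assets, drifted = diff_attack_surface(baseline, latest_scan)
    print("New assets to triage:", new_assets)   # dev-03
    print("Configuration drift:", drifted)       # web-01's extra port

Everything else an ASM product offers (scheduling, alerting, risk scoring, discovery feeds) is layered on top of exactly this kind of comparison, run continuously instead of quarterly.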

In this updated article for CSOonline, I review 12 different ASM tools and provide some questions to ask your team and the vendors about their offerings.


Red Cross: Mizzou makes running a large blood drive look easy


Setting up a mammoth blood drive is akin to building a 100-bed hospital emergency department from scratch and then taking it down a few days later. I got to see this in person at what is reported to be the largest student-run blood drive in the nation. Columbia is home to the University of Missouri, popularly called Mizzou, and its more than 30,000 students. For more than 40 years, the school has hosted blood drives in partnership with the American Red Cross. This year they broke their own record, collecting over 5,000 units of blood. You can read my post about the blood drive last month here on the chapter blog.

(Photo: Red Cross phlebotomist Jenise McKee readies Mizzou student donor Jake McCarthy for his Power Red blood donation.)

What becomes a bottom feeder most?

I ask this question with serious intent, and my focus is on vetting the best tech review websites. I have written about this problem in the past, but thanks to spending some time with my colleague Sam Whitmore, I have some new things to say. You can find links to my past posts in the coda below.

With some modesty, I can claim familiarity with this particular market, having written reviews for dozens of publications, both online and print, over the decades. When I began at PC Week (now sadly called eWeek) in the mid-1980s, we didn’t have the web, just the dead-trees version. About a third of our pages were devoted to reviewing technologies and analyzing trends. These articles were written by people who actually touched the products and understood how enterprise IT folks would use them.

PC Week (and many others at the time) had a terrific business model, which was to charge a lot of money for print advertising on the promise that our pub would control its circulation among what we would now call influencers. The web was the first big challenge: once content was posted online, those controls and promises went out the window. So began the fall of the Holy Roman Tech Empire.

In the late 1990s, we got the first wave of bottom-feeder websites, such as those created by Newsfactor and others. Instead of paying experienced writers and analysts to produce articles, they were “pay to play” operations that ran pieces submitted by vendors anxious to get their names into print, or electrons. You could easily spot these sites because they have three things in common:

  1. Most articles quote no sources, or if they do, they don’t actually use quotations,
  2. Most articles have no external links to any supporting materials, and
  3. Most articles have either no byline or no dateline, and as such aren’t tied to a particular news moment or product introduction or something else that would indicate timeliness.

What bugs me the most about these sites is that they are filled with posts that promise an actual review of a product or category. However, they usually don’t deliver any insight or evidence that an author actually handled the product. It bugs me because these kinds of articles devalue my own expertise in handling products and translating that experience into actionable insights for my readers.

Now these three things can happen in legit articles that professional writers create. But taken together they illustrate the pay-for-play milieu.

With the new millennium, we got a different tech publishing model, best typified by TechTarget, now part of the Holy Informa Empire. These sites combined organic search with lead generation as their business model, resulting in domains such as searchsecurity.com and searchcloudcomputing.com. They were combined with print pubs in the beginning and eventually tied to conferences too. In those early years, I was proud to work for TechTarget because it emphasized high-quality information.

With the advent of AI and LLMs, we now have a new era of tech publishing. Organic search has become a bottom-feeder operation, because queries are now asked and answered in natural language and stay within the confines of the chatbots. Because AI can spin up batches of words and pictures easily and programmatically, there is no need to go any further. This means people like me have become buggy whips. Or hood ornaments. Or something that you put on a shelf.

Let’s examine one website as an example. It is tied to a print publication, so my guess is that many of its pieces were paid for by specific vendors or else generated by AI tools. No datelines. Bylines are suspect: I wasn’t able to identify anyone whom I could independently verify as an actual human, and the authors’ pictures seem anodyne. There is a page of conferences with odd mistakes in it, such as shows held in “Detroit City” and “Seattle City,” along with broken links. A human proofreader would catch these in about three seconds. Articles are copies of those on other sites in this vendor’s “network.” The most curious thing is that if you try to cut and paste some of the content, you get a popup preventing you from doing so, saying that the work is copyrighted.

It is clearly the work of AI. The same company that owns this site runs about a dozen other websites, many with the word “review” in their domain names. These sites have a boring sameness about them, with articles that don’t reflect any news moments or tie to current events. These are not reviews.

Welcome to the new bottom-feeders of tech.

Coda: references

CSOonline: 5 steps for deploying agentic AI red teaming

Building secure agentic systems requires more than just securing individual components; it demands a holistic approach where security is embedded within the architecture itself. In my latest article for CSOonline, I delve into the world of using agentic AI for red teaming exercises. It is very much a work in progress: many vendors of defensive AI solutions are still in their infancy when it comes to protecting the entirety of a generative AI model, and the attack space is enormous.

CSOonline: Seven ASPM products compared

Having a central platform to protect your applications requires a deep understanding of the issues and of each product’s capabilities. Protecting your enterprise application collection requires near-constant vigilance and a careful choice of the right collection of defensive tools. As threats become more complex and harder to discover, applications have also become more complex, bridging the worlds of cloud, containers, and on-premises infrastructure. This presents all sorts of challenges for tools that have struggled to keep pace.

The latest category of products goes by the moniker of application security posture management, or ASPM. I review seven different tools from these vendors in my latest post for CSOonline:

  • ArmorCode
  • CrowdStrike
  • Cycode
  • Ivanti
  • Legit Security
  • Nucleus Security
  • Wiz


Sam Whitmore podcast: AI strategies for PR folks

Last week I had a chat with Sam Whitmore on his pod about how I am using AI and working with a couple of developers. I have some thoughts about how PR folks should incorporate this technology into their daily workflows, and also point out this CJR piece where they interview several reporters on how they are using various AI tools.

My 14-minute discussion covers what a deep AI scan of my published work reveals about the unique differentiators of my writing style, how PR can exploit LLMs and agents, and why reading someone’s clips prior to a pitch is more important than ever. You might also want to read my post from a few months back about how 10Fold got on board the agentic AI train.

CSO: How to make your multicloud security more effective

The days of debating whether cloud or on-premises is the best location for your servers are thankfully far behind us. But lately, more enterprises are shifting their workloads as they realize that security and simplicity matter. This movement isn’t uniform because of the richness and complexity of multicloud computing in the modern era.

In this piece for CSOonline, I look at ways to be more purposeful about cloud security and focus on containing and managing tool sprawl with recommended courses of action that you can take.

My love affair with MS-DOS

I wrote this in 2011, when I was running a piece of the ReadWrite editorial operation, and recently rediscovered it. Other than making a few corrections and updating the dates, I still share the sentiment.

Can it be that DOS and I have been involved with each other for more than 40 years? That sounds about right. DOS has been a hard romance, to be sure.

Back then, I was a lowly worker for a Congressional research agency that no longer exists. I was going to write “a lowly IT worker,” until I remembered that we didn’t have IT workers 40+ years ago: Information Centers didn’t really come into vogue for several more years, when the IBM PC caught on and corporations were scrambling to put them in place of their 3270 mainframe terminals. Back in 1981, we used NBI and Xerox word processors. These were behemoths so unwieldy they came installed with their own furniture. We had impact printers and eight-inch floppy discs. The first hard drives were a whopping 5 MB and the size of a big dictionary. But that came a few years later.

At the agency, one of the things that I figured out was how to hook up these word processors to a high-speed Xerox printer that was also the size of a small car. We had to use modems, as I recall: you know, those beeping things that used to be included in every PC? When was the last time you used a PC with an internal modem, or a floppy disc? I can’t remember, but it has probably been more than a decade for both. Remember the hullabaloo when Apple came out with a laptop without a floppy? Now we have them without any removable storage whatsoever: they are called iPads. Steve Jobs was always ahead of the curve.


Anyway, back to DOS. I used to pride myself on knowing my mistress’ every command, every optional parameter. And we had EDLIN, a very primitive command-line editor. It wasn’t all that hard: there weren’t more than a dozen different commands. (Of course they are preserved by Wikipedia.) When a new version came out, I studied the new manuals to ferret out tricks and hidden things that would help me slap the wrists of end users who would blithely type format c: /s and erase their hard drives.

And new versions of DOS were a big deal to our industry, except for DOS 4, which was a total dog. One of my fondest memories of that era was going to the DOS 5 launch party in the early 1990s: Steve Ballmer was doing his hyperkinetic dance and sharing the stage with Dave Brubeck. To make a point of how bad DOS 4 was, Brubeck played “Take Five” in 4/4 time before switching to 5/4 time, as it was intended to be played. Those were fun times.

But DOS wasn’t enough for our computers, and in the late 1980s Microsoft began work on Windows. By 1990, we had Windows 3.0, really the first usable version. By then the Mac OS had been around for several years, and graphical OSes were here to stay. DOS went into decline. It didn’t help that a family feud with DR DOS kept many lawyers engaged over its origins, either. As the 1990s wore on, we used DOS less and less, until Windows 95 finally sealed its fate as the first version of Windows that didn’t require a separate DOS installation to boot.

I won’t even get into OS/2, which had a troubled birth coming from both IBM and Microsoft, and has since disappeared. My first book, which was never published, was on OS/2 and was rewritten several times as the OS lurched from one version to another, never catching on with the business public.

Once PC networks caught on, DOS wasn’t a very good partner. You had 640 kilobytes of memory (yes, KB!), and network drivers stole part of that away for their own needs. Multitasking and graphical windows also made us more productive, and we never looked back. For a great ten-minute video tour and trip down memory lane, see this effort by Andrew Tait showing successive upgrades of the Windows OS.

But DOS was always my first love, my one and only. I still use the command line to ping and test network connectivity and to list files in a directory. There is something comforting about seeing white text on a mostly black screen.
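For instance, these two still work in any Windows command prompt (the host name is just a placeholder):

    C:\> ping example.com     (test network connectivity)
    C:\> dir                  (list the files in the current directory)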

Yes, we haven’t been in touch in many years, and now when I need a new OS I just bring up a VM and within a few minutes can have whatever I need, without the hassle of CONFIG.SYS or AUTOEXEC.BAT. (Here is a column that I wrote a few years ago about getting Windows NT to work in a VM.) But happy birthday, DOS, and thanks for the memories. It’s been lots of fun, all in all.