Sam Whitmore podcast: The presence of analytics in the online newsroom

I caught up with Sam Whitmore recently. Sam and I worked together at PC Week back in the 1980s. We had a ten-minute discussion about the presence of analytics in the online newsroom, and their importance and utility to reporters and editors. The conversation came about after we both reviewed a presentation entitled “Audience insights for the newsroom,” given at last year’s Online News Association annual conference by Tess Jeffers, the director of Newsroom Data and AI for the Wall Street Journal, and Fernanda Brackenrich, the US Audience Engagement editor for the Financial Times.

 

Sam and I spoke about the role analytics plays in helping editors assign stories and shape coverage, drawing on my decades of experience freelancing for dozens of publications. The ONA presentation is filled with helpful hints and suggested best practices, all in the name of improving content and increasing influence and reach within Tier 1 newsrooms.

This topic has long been an interest of mine. As I wrote back in 2014, for many years I dutifully kept track of how my blog posts were doing, who was commenting, where backlinks were coming from, and so forth. That post mentions how influence can be found in odd places, and how it can come from long-tail content that has been around for years, both things that Sam and I touched on during our talk.

This wasn’t the first time I have had a discussion about the relevance of analytics to publishing. Back in 2018, Paul Gillin and I did a podcast interview with Adam Jones of the publisher Springer Nature. He spoke about the role of marketing analytics and how he creates stronger calls to action from these insights.

In 2012, I wrote about two Boeing data analysts who spoke at a Gartner conference about their various cloud computing and business intelligence projects. One of my insights from that era was to keep your data local and have consistent security controls, advice that is still relevant today (thanks, DeepSeek).

Part of increasing the utility of data analytics is using appropriate data visualization tools, such as data dashboards. The more patterns you can see graphically, the easier it is to glean something from the parade of numbers on the screen. I wrote about this topic back in 2015, reviewing several municipal applications. During that era, I attended several Tableau user conferences (the company is now part of Salesforce) where I learned of numerous analytics success stories.

PR people should get to know audience development and data analytics managers such as Jeffers and Brackenrich, because they have their fingers on the pulse of who is reading their pubs and posts.

As all my years writing about tech have taught me, the basics are still important, whether you are dealing with the first IBM PC or the latest AI whizbang model. If you can posit what will build engagement and gather interest, you are already ahead of the game when it comes to pitching a story that can resonate with the right audience.

Time to pare down your mobile app portfolio

When iPhones and Android devices were first introduced, I recall the excitement. We would download apps willy-nilly, and many of them we would use maybe twice before souring on their bad or frustrating UX. The excitement was everywhere, and back in 2009 I attended the final presentations of a Washington University computer science class on how to develop new iOS apps. The class is still being taught today, and while 15 years may seem like a lifetime, we are still dealing with basic issues of app security and data privacy. With all the buzz surrounding DeepSeek this week comes the inevitable analysis by NowSecure of the major security and privacy flaws in its iOS app.

Ruh-roh. Danger, Will Robinson! (Insert your favorite meme here.)


So much for app excitement. I have come full circle. When I got my latest iPhone last year, I spent some time doing the opposite: paring down my apps to the barest minimum.

It is time to take another, closer look at your app portfolio, and I suggest you spend part of your weekend doing some careful home screen editing. Now, I wasn’t one of the many millions (or so it seems) of folks who downloaded DeepSeek, or who freaked out when TikTok went down for a few hours and rushed to download Another Chinese Social Media App in its place.

But still. We should use the privacy abuses found in DeepSeek’s app as a teachable moment.

Your phone is the gateway to your life, to your electronic soul. It is also a security sinkhole, and has become a major gateway for phishing attacks, because often we are scrolling around and not paying attention to what we are doing, especially when we get an “emergency” text or email.

But let’s talk about our apps. If you read the entire NowSecure report, you will see that you should run away from using the DeepSeek app. It sends your data across the intertubes unencrypted. When it does use encryption, it uses older methods that are easily compromised, with its keys hardcoded in the app, making your data easy to read. It also hoovers up enough device fingerprinting info to track your movements. And its terms of service say quite plainly that all this information is sent to Chinese servers. Thanks, but no thanks.
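To see why hardcoded keys are such a problem, here is a minimal Python sketch of the pattern (my own illustration with made-up key values, not DeepSeek’s actual code): once the key and IV are baked into the shipped app, anyone who extracts them from the binary can decrypt everything the app sends.

```python
# Minimal sketch: why a hardcoded symmetric key in a shipped app is worthless.
# Illustration only -- hypothetical key/IV values, not taken from any real app.
from Crypto.Cipher import DES3          # pip install pycryptodome
from Crypto.Util.Padding import pad, unpad

HARDCODED_KEY = b"ThisKeyIsHardcoded123456"  # 24-byte 3DES key baked into the binary
HARDCODED_IV = b"initvect"                   # 8-byte IV, also baked in

def app_encrypt(plaintext: bytes) -> bytes:
    """What the app does before sending data over the wire."""
    cipher = DES3.new(HARDCODED_KEY, DES3.MODE_CBC, HARDCODED_IV)
    return cipher.encrypt(pad(plaintext, DES3.block_size))

def attacker_decrypt(ciphertext: bytes) -> bytes:
    """Anyone sniffing the traffic can reuse the same constants pulled from the app."""
    cipher = DES3.new(HARDCODED_KEY, DES3.MODE_CBC, HARDCODED_IV)
    return unpad(cipher.decrypt(ciphertext), DES3.block_size)

if __name__ == "__main__":
    intercepted = app_encrypt(b"device fingerprint + chat history")
    print(attacker_decrypt(intercepted))  # the "encrypted" data reads right back
```

The attacker never has to break the cipher; the secret ships alongside the lock.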

Why did I initially pare down my apps last year? I did it for a combination of reasons. First, it seemed like a good time to review all those cute icons and cut the ones that were clogging my home screens. I really wanted to get to a single screen, but accepted two screens full of apps. Also, I wasn’t comfortable with the level of private details that the bad apps were sending to their corporate overlords, or to data brokers, or to both.

To make it easier for your Great App Cull, I suggest the divide and conquer approach. I divided my apps into four categories:

Type 1 apps were those I knew had major privacy issues, such as Facebook’s Messenger, Twitter, Google Meet and Google Maps. I am sure there were others that don’t immediately come to mind. You can debate whether the privacy concerns are real or not, but I think most of us would agree that DeepSeek would definitely fall into this bucket.

Type 2 were apps that were so poorly designed that I would be better off just using the web versions, such as the T-Mobile and Instacart apps and several banking apps.

Type 3 were apps that I had to download for some specific task, such as attending a conference, or that I used maybe once or twice, such as the Bluesky app or the Ring camera app. These were also poorly designed.

Type 4 were apps that were no longer relevant to my life, such as the one controlling my Ecobee thermostat in a place where I no longer lived, or the bunch of VPN apps I had been testing for CNN and no longer used.

I am sure that years from now DeepSeek’s app will be a case study in how not to write secure mobile apps. This is why many countries and agencies have already banned its use on government-owned devices, and why there is a bill before our Congress to do so.

Red Cross profile: The Life of a Blood Donor Ambassador Starts with Bob Hergert’s First Donation

Bob Hergert and Jason Ramlow holding an award

Like many American Red Cross volunteers, Robert Hergert first got involved by donating blood at a local blood drive back in 2019. That led to his becoming a Blood Donor Ambassador where he lives in Leavenworth, Kansas, which is part of the Greater Kansas City and Northwest Missouri chapter. “It started to grow on me, and I was interested in stepping up to doing more than just donating my blood,” he said. I tell his story, his history with the Red Cross, and other details on their blog here.

 

The new world of hybrid warfare: cutting undersea cables

The song from The Little Mermaid goes “Under the sea, darling it’s better down where it is wetter.” Well, not quite.

This week the hybrid war between Russia and the rest of the world took a new turn, with the seizure of the bulk carrier Vezhen by the Swedish Coast Guard. The details are provided in Sal Mercogliano’s video – it is now the fourth ship suspected of dragging its anchor in the Baltic Sea and cutting an undersea cable; this time it was a cable that runs between Latvia and Sweden. Let’s unpack this situation.

There are hundreds of thousands of miles of cables that run across the ocean seabed around the world, divided mostly into those carrying communications and others that move electrical power to remote locations. Laying these cables is a tricky operation, and there are specialized ships that do this. By way of reference, you might enjoy this Fluctus video on how undersea cables are made, laid and repaired.

The videos show how the cables can deteriorate over time as various sea life attaches itself (think a thick layer of barnacles and lots of corrosion) – getting through that junk at underwater depths isn’t easy. Divers have had to develop new tools and repair methods.

Last summer we had a cable cut in front of my home as contractors installed new streetlights. They ended up severing a major AT&T fiber cable that had hundreds of individual strands. While working a few feet below the street wasn’t a picnic, it is nothing like the conditions faced to do these repairs on the seabed.

Much easier is dropping a ship’s anchor and dragging it along the bottom. The cables are clearly shown on marine maps – this is a feature, not a bug, because prior to the Russian sabotage events, the marine transportation community wanted mariners to know where the cables ran so that ships would steer clear of them. Here is a more stylized map showing how many cables are laid in the Baltic region.

Sal’s analysis (I feel like we should be on a first-name basis because I have been a fan of his videos since the Baltimore bridge accident last year) concludes that having four cable cuts in a few months in nearly the same area is suspicious. An update found that the ship’s anchor had accidentally broken loose. Yup.

As a mariner himself, he shows how anchors on these large ships are controlled and how easy it would be to lower one undetected by the ship’s navigators. He calls this part of a hybrid war, meaning that it combines traditional “kinetic” war fighting (with guns, tanks, and missiles) with more technology-based tactics such as drones and the careful targeting of digital infrastructure, such as seabed cables and satellite internet access. The hybrid nature also combines military targets with civilian ones, such as the communication cables that connect countries.

Ukraine has deployed hybrid techniques in its war with Russia. And I am sure that we could find instances of US and our allies using similar tactics. What it means is that life under the sea has become the new battleground. Sorry, Sebastian!

MSFT @ 50

Microsoft was founded 50 years ago this coming April. Most of you are somewhat familiar with its origin story, which began with Bill Gates and Paul Allen in a small office in Albuquerque. And thanks to this series in Geekwire, you can read about things from their perspective. This series has inspired my own POV.

Back in the spring of 1975, I was finishing up college in New York. The only computers I had access to were mainframes. When I got to grad school a few years later, we had a time-sharing system. That meant getting up at the crack of dawn or waiting until the middle of the night to go across campus and sit in front of a monochrome character-mode terminal; the odd hours were necessary because the terminals were so popular during the day. When I got my first post-grad-school job in DC, I used a downtown “remote job entry” storefront (I think it was on K Street, but don’t hold me to that) where I could submit my decks of punched cards and come back the next day to see if my programs had run without errors. (They usually took a few tries, just so you know.)

My first actual PC was an HP-85 running CP/M, somewhere around 1979. I was using it to build math models for various consulting clients, and the models were built using VisiCalc, the original digital spreadsheet. It had all of 8K of RAM, an amount of memory so small you can’t even buy it in a basic digital watch today.

My first interaction with Microsoft was IBM PC DOS in 1981. It would take several years before I joined PC Week (now known as eWeek) in the mid-1980s, my first breakout job. There I began using Microsoft’s local area network software, called LAN Manager, which it built to run on 3Com’s servers. The LAN Man era accounted for one of my favorite PC Week cover stories back then: we wrote about how anyone with physical access to a server could take it over with a simple boot floppy. Ah, those were the days!

It was at PC Week that I began to develop relationships with many of the MSFT execs, including Ballmer and Gates, as we went around the country to various events and covered major product launches. It was a heady time for a former anonymous corporate user, now blessed with a huge expense account.

I got sidetracked during this time period with OS/2, the failed IBM and Microsoft operating system project, which resulted in a book deal for me (the book remains unpublished) and a new server operating system for Microsoft called Windows NT. NT had an enduring and somewhat troubling legacy, which I first wrote about in 2003. I still have a soft spot in my keyboard for it.

NT is the OS that keeps on giving, as I recently updated a post for CSOonline about its infamous and enduring NTLM protocol, a favorite of hackers through the ages because its credentials can be captured and relayed or replayed, letting attackers authenticate without ever cracking a password.

During the 1990s, Microsoft (along with many of us) discovered the web, or as we called it then, The Web. Microsoft stumbled here as well, trying to make the web its own proprietary playground, as I wrote about in 1998. This would be a common theme, one that I called attention to when ActiveX, its proprietary approach to dynamic content, was on its way out in the mid-2000s. Microsoft tried to squash the upstart browser innovator Netscape with its own Internet Explorer; IE eventually faded too, and now Microsoft’s Edge browser is based on Google’s Chromium.

It was during this decade that Microsoft began to understand the open-source community. Some of this understanding was the result of court judgments (at one point, the company had 130 lawsuits to deal with), and some was due to a transformation of its collective engineering mindset. I went to one of its 2007 conferences where it was clear that it still had a love/hate relationship with the internet and viewed many OSS projects as competitors to its own commercial products. You can see how that attitude has changed somewhat on its current splash page, where the company claims to use thousands of OSS projects every day.

It was also in 2008 that Gates announced his retirement, and when I developed a clever speaking gig giving thanks to him for making my career so interesting. The speech is somewhat tongue-in-cheek: if Microsoft had made better products or gotten on board trends sooner, I would have had a more boring arc in writing my stories, not to mention fewer support issues. Remember Bob and Clippy? Windows 8 and ME?

Thanks to all of you for reading my work over the years, and sharing your own MSFT memories along the way.

CSOonline: Python administrator moves to improve software security

The administrators of the Python Package Index (PyPI) have begun an effort to improve the security of the hundreds of thousands of software packages listed there. The effort, which began last year, aims to identify malware-laced packages and stop them from proliferating across the open-source community that contributes and consumes Python software.

The effort, called Project Quarantine, is described in a blog post by Mike Fiedler, who is the sole administrator responsible for PyPI security. The project allows PyPI administrators and a select group of developers to mark a project as potentially harmful and prevent it from being easily installed by users, avoiding further harm.

In my blog post for CSOonline, I describe this effort and how it came about.

CSOonline: SOAR buyer’s guide, 11 products compared

The class of products called SOAR, for Security Orchestration, Automation and Response, has undergone a major transformation in the past few years. The capabilities behind each of the four words in its name, once exclusive to SOAR, have bled into other tools. For example, response features can now be found in endpoint detection and response tools, and orchestration is now a joint effort with SIEM tools. Many of these features are also found in managed security products that go by other names, such as threat and incident response or cloud security posture management (CSPM). And many of the SOAR tools are no longer focused solely on security but have expanded to cover the wider context of how an enterprise infrastructure operates.

In this review for CSOonline, I cover some of the major issues for enterprises that are looking for a SOAR tool and briefly mention 11 vendors (out of dozens that offer such products). Be warned that these products are pricey, and finding true price transparency is almost impossible without engaging the vendors’ sales teams.

Book review: The Perfect Home

The novel The Perfect Home by Daniel Kenitz centers on a power couple behind a leading shelter reality TV show, and what could go wrong. If you are a fan of such shows you might enjoy the novel, which chronicles the decline of their relationship as the husband plans on getting rid of his family in the quest to garner more fame, more power, and more money on his own. Twin babies are involved, an affair happens, and the wife reunites with her long-estranged father, all in the quest to figure out the shifting reality — in this case, their actual lives — rather than what is depicted on screen as they renovate various homes around the country while the cameras and scripted witty banter roll. Having been through a divorce from my own cheating spouse, I still found this novel interesting and engaging, and the exploration of the shifting understanding of marital trust worthy of the author’s treatment.

How to best disconnect from Twitter

Last month, I suggested that it is time to remove ourselves from Twitter. There are several ways to do this. You’ll see my process and you can make your own decisions. The TL;DR is:

  • I stopped posting more than a year ago, but still kept my account to protect my brand as a placeholder.
  • I downloaded an archive of my tweets in multiple ways, and will tell you why this is necessary.
  • I then deleted all my tweets, using the Windows software from Cyd.social.
  • I also deleted the other digital effluvia of my account, including retweets, likes, and follows.

Before we go down this road, I urge you all to read how to back up all of your social data, a post that I wrote many years ago and have tried to keep current. If you haven’t ever done this, now is the time to create and download these archives. This should be part of your regular data backup processes.


That post will direct you to the particular link where you can prepare and then download an archive of each of your accounts. For Twitter, it might take a day or two to gather everything together, but Twitter will send you an email when it is ready, and then you have a few days to download it. The others have somewhat different processes and schedules. You can pick and choose various options and data types to include in your archive: for example, Google has dozens of services that may or may not be meaningful to save periodically.

But, and this is important: the archive of your tweets depends on having a working Twitter account. There is an HTML page that will bring up a summary of your archive, but the tweets and follows and so forth have to exist online. It is only half a solution. I had a small hiccup with my archive that I will get to in a moment.

A better solution is to use one of the dedicated archive/deletion tools, and as I said, I ran Cyd.social, which logs into your Twitter account and then creates a complete offline archive. The hiccup was that Cyd didn’t like the very long file name that Twitter created, so I renamed it and that passed muster. Cyd uses this archive as a starting point to seek out and delete your content history.

There are two versions, as you can see from the screenshot: the free one will archive and delete all of your tweets. The paid version (an annual subscription, initially $36) will also let you be more selective and keep some of your tweets, and will delete other aspects of your account, such as followers, likes, and DMs.

I upgraded to the premium version so I could delete everything. I liked the design of the software, which tells you in advance what it is about to do to your account. Because Twitter has put rate limits in place to prevent these mass deletion operations, Cyd has to work around them, sometimes pausing during its housekeeping to stay under the limits.
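As a rough idea of how such a tool can cope with those limits, here is a hypothetical Python sketch (my own illustration, not Cyd’s actual code): watch for the HTTP 429 response and sleep for whatever interval the server suggests before retrying the deletion.

```python
# Hypothetical sketch of rate-limit-aware deletion (not Cyd's actual code).
import time
import requests

def delete_with_backoff(session: requests.Session, url: str, default_pause: int = 60) -> None:
    """Issue a DELETE request, pausing and retrying whenever the server answers 429."""
    while True:
        resp = session.delete(url)
        if resp.status_code != 429:
            resp.raise_for_status()   # surface any other error
            return
        # Rate limited: honor the Retry-After header if present, else wait a default.
        time.sleep(int(resp.headers.get("Retry-After", default_pause)))
```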

One content type that is not covered by Cyd is list management. I have quite a few lists, and ideally would like to convert them to followers on LinkedIn before I delete them, but I haven’t found a tool to do that.

Another thing that I noticed browsing my archive is how few of my words of wisdom were retweeted or liked. Almost all of them had no engagement whatsoever. You would think with all the years of using Twitter and various analysis tools I would have noticed this before now. Sigh.
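If you want to run the same check on your own archive, it is easy to script. Here is a short Python sketch, assuming the archive still stores your tweets in data/tweets.js as a JavaScript assignment of a JSON array in which each entry carries retweet_count and favorite_count fields; treat those names as assumptions that may change.

```python
# Sketch: tally engagement from a downloaded Twitter/X archive.
# Assumes tweets live in data/tweets.js as "window.YTD.tweets.part0 = [ ... ]".
import json
from pathlib import Path

def load_tweets(archive_dir: str) -> list:
    raw = Path(archive_dir, "data", "tweets.js").read_text(encoding="utf-8")
    raw = raw[raw.index("["):]          # strip the JavaScript assignment prefix
    return [entry["tweet"] for entry in json.loads(raw)]

def engagement_report(tweets: list) -> None:
    liked = sum(1 for t in tweets if int(t.get("favorite_count", 0)) > 0)
    retweeted = sum(1 for t in tweets if int(t.get("retweet_count", 0)) > 0)
    print(f"{len(tweets)} tweets, {liked} with likes, {retweeted} with retweets")

if __name__ == "__main__":
    engagement_report(load_tweets("twitter-archive"))   # path to the unzipped archive
```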

I came across a free analysis tool from Cleve.AI that summarizes my LinkedIn activity. You can see an excerpt from my report below, which includes a nice summary of my words of wisdom.

Best wishes and happy new year to you!

How IT can learn from Target and Walmart

With all the holiday shopping happening around now, you have probably visited the websites of Target and Walmart, and maybe that prime Seattle company too. What you probably haven’t visited are two subsidiary sites of the first two companies that aren’t selling anything, but are packed with useful knowledge that can help IT operations and application developers. This comes as a surprise because:

  • they both contain a surprising amount of solid IT information that, while focused on the retail sector, has broader implications for a number of other business contexts
  • they deal with many issues at the forefront of innovation (such as open source and AI), not something normally associated with either company
  • both sites are a curious mixture of open source tool walkthroughs, management insights, and software architecture and design
  • many of the posts on both sites are very technical deep dives into how these companies actually use the software tools, again not something you would ordinarily expect to find from these two sources

Let’s take a closer look. One post on Target’s site is by Adam Hollenbeck, an engineering manager. He wrote about their IT culture: “If creating an inclusive environment as a leader is easy for you, please share your magic with others. The perfect environment is a challenge to create but should always be our north star as leaders.” Mark Cuban often opines on this subject. Another post goes into detail about a file analysis tool that was developed internally and released as open source. It has a user-friendly interface specifically designed to visualize files, their characteristics, and how they interconnect.

Walmart’s Global Tech blog site leans heavily into its AI usage. “AI is eliminating silos that developed over time as our dev teams grew,” Andrew Budd wrote in one post, and GenAI chatbot solutions have been rolled out to optimize Walmart’s Developer Experience, a central tool repository. There are also posts about other AI and open source projects, along with a regular cyber report about recent developments in that arena. This is the sort of thing you might find on news sites such as FOSSForce.com or TheNewStack.

Another Walmart article, posted on LinkedIn, addresses how AI is changing the online shopping experience this season with more personalized suggestions and predictive content (does this sound familiar from another online site?), and mentions how all Sam’s Club stores have the “just walk out” technology that was first pioneered by Amazon. (I wrote about my 2021 experience here.)

One other point: these two tech sub-sites are not easily found. Neither tech.target.com (not to be confused with techtarget.com) nor tech.walmart.com is linked from its company’s home page. “I’m not sure these pages should be linked from the home pages,” said Danielle Cooley, a UX expert whom I have known for decades. “As cool as this stuff is for people like you and me and your readers, it’s not going to rise to home page level importance for a company with millions of ecommerce visitors per day.” But she cautions that finding these sites could be an issue. “I did a quick google of ‘programming jobs target’ and ‘cybersecurity jobs target’ and still didn’t get a direct link to tech.target.com, so they aren’t aiming at job openings. But also, the person interested in cybersecurity will not also be the person interested in an AI shopping assistant, for example.” Given how specific a visitor’s interest might be, even someone who lands on these sites could go away frustrated, because the content is pretty broad.

You’ll notice that I haven’t said much about Amazon here. It really isn’t fair to compare these two tech sites to what Amazon is doing, given Amazon’s depth in all sorts of tech knowledge. And to be honest, in my extended family, we tend to shop more at Amazon than at either Target or Walmart. But it is nice to know that both Target and Walmart are putting this content out there. I welcome your own thoughts about their efforts.