Why we need more 15-minute neighborhoods

I have split my years between suburbs and urban areas, not counting two stints in the LA area, which I don’t quite know how to classify. I have learned that I like living in what urbanist researchers (as they are called) classify as a “15-minute neighborhood” — meaning you can walk or bike to many of the things you need for daily life within that time frame, which works out to about a mile on foot or perhaps three miles by bike. I also define my neighborhood in St. Louis, somewhat tongue-in-cheek, as walk-to-Whole Foods and walk-to-hospital.

Why is this important? Several reasons. First, I don’t like being in a car. During my last stint in LA, I had a 35-mile commute, which could take anywhere from 40 minutes to several hours, depending on traffic and the inevitable accidents. At my wife’s suggestion, I turned that commute into a 27-mile car ride and got on my bike for the last (or first) leg. While that lengthened the commute, it got me riding each day. Now my commute is from one bedroom (the one I sleep in) to another (the one I work in). Some weeks go by without my using the car at all.

Second, I like being able to walk to many city services, even apart from Whole Foods and the doctors. When the weather is better, I bike in Forest Park, which is about half a mile away and is a real joy for reasons beyond its road and path network.

A research paper that came out last summer, “A universal framework for inclusive 15-minute cities,” talks about ways to quantify these things across cities and takes a deep dive into specifics. It comes with an interactive map of the world’s urban areas that I could spend a lot of time exploring. The cities are mostly red (if you live here in the States) or mostly blue (if you live in Europe and a few other places). The colors aren’t an indication of political bent but of how close most of a city’s neighborhoods come to that 15-minute ideal. Here is a screencap of the Long Island neighborhood where I spent many years: the area shown includes both my home and office locations, and is for the most part a typical suburban slice.


The cells (which in this view represent the walkable area from a center point) are mostly red. Many commuters who worked in the city would take issue with the scores for this part of Long Island, which has one of the fastest travel times into Manhattan; in my case, I could walk to the train in 15 minutes or so.

The paper brings up an important issue: to be useful and equitable, cities have to be inclusive, with services spread across their footprints. Most don’t come close to this ideal. For the 15-minute figure to apply, you need density high enough that people don’t have to drive. The academics write, “the very notion of the 15-minute city can not be a one-size-fits-all solution and is not a viable option in areas with a too-low density and a pronounced sprawl.”

Ray Delahanty makes this point in his latest video, which focuses on Hoboken, New Jersey. (You should subscribe to his videos, where he covers other urban transportation planning issues. They have a nice mix of entertaining travelogue and acerbic wit.)

Maybe what we need aren’t just more 15-minute neighborhoods, but better distribution of city services.

Sam Whitmore podcast: The presence of analytics in the online newsroom

I caught up with Sam Whitmore recently. Sam and I worked together at PC Week back in the 1980s. We had a ten-minute discussion about the presence of analytics in the online newsroom and their importance and utility to reporters and editors. The conversation came about after we both reviewed a presentation entitled “Audience insights for the newsroom,” given at last year’s Online News Association annual conference by Tess Jeffers, director of Newsroom Data and AI at the Wall Street Journal, and Fernanda Brackenrich, US Audience Engagement editor at the Financial Times.


Sam and I spoke about the role analytics plays in helping editors assign stories and shape coverage, drawing on my decades of experience freelancing for dozens of publications. The ONA presentation is filled with helpful hints and suggested best practices, all in the name of improving content and increasing influence and reach within Tier 1 newsrooms.

This topic has long been an interest of mine. As I wrote back in 2014, for many years I dutifully kept track of how my blog posts were doing, who was commenting, where backlinks were coming from, and so forth. That post mentions how influence can be found in odd places and can come from long-tail content that has been around for years, both things that Sam and I touched on during our talk.

This wasn’t the first time I have discussed the relevance of analytics to publishing. Back in 2018, Paul Gillin and I did a podcast interview with Adam Jones of the publisher Springer Nature. He spoke about the role of marketing analytics and how he crafts stronger calls to action from these insights.

In 2012, I wrote about a Gartner conference presentation by two Boeing data analysts covering their cloud computing and business intelligence projects. One of my takeaways from that era was to keep your data local and maintain consistent security controls, advice that is still relevant today (thanks, DeepSeek).

Part of increasing the utility of data analytics is using appropriate data visualization tools, such as data dashboards. The more patterns you can see graphically, the easier it is to glean something from the parade of numbers on the screen. I wrote about this topic back in 2015, reviewing several municipal applications. During that era, I attended several Tableau user conferences (the company is now part of Salesforce), where I learned of numerous analytics success stories.

PR people should get to know audience development and data analytics managers such as Jeffers and Brackenrich, because they have their fingers on the pulse of who is reading their pubs and posts.

As all my years writing about tech have taught me, the basics are still important, whether you are dealing with the first IBM PC or the latest AI whizbang model. If you can posit what will build engagement and gather interest, you are already ahead of the game when it comes to pitching a story that can resonate with the right audience.

Time to pare down your mobile app portfolio

When iPhones and Android devices were first introduced, I recall the excitement. We would download apps willy-nilly, and many of them we would use maybe twice before souring on their bad or frustrating UX. Back in 2009, I attended the final presentations of a Washington University computer science class on how to develop new iOS apps. The class is still being taught today, and while 15 years may seem like a lifetime, we are still dealing with basic issues of app security and data privacy. With all the buzz surrounding DeepSeek this week comes the inevitable analysis by NowSecure of the major security and privacy flaws in its iOS app.

Ruh-roh. Danger, Will Robinson! (Insert your favorite meme here.)


So much for app excitement. I have come full circle: when I got my latest iPhone last year, I spent some time paring down my apps to the barest minimum.

It is time to take another, closer look at your app portfolio, and I suggest you spend part of your weekend doing some careful home screen editing. Now, I wasn’t one of the many millions (or so it seems) of folks who downloaded DeepSeek, or who freaked out when TikTok went down for a few hours and rushed to download Another Chinese Social Media App in its place.

But still. We should use the privacy abuses found in DeepSeek’s app as a teachable moment.

Your phone is the gateway to your life, to your electronic soul. It is also a major security sinkhole, and it has become a prime vector for phishing attacks, because often we are scrolling around without paying attention to what we are doing, especially when an “emergency” text or email arrives.

But let’s talk about our apps. If you read the entire NowSecure report, you will see that you should run away from the DeepSeek app. It sends your data across the intertubes unencrypted. When it does use encryption, it relies on older methods that are easily compromised, with its keys hardcoded in the app, making your data easy to read. It also hoovers up enough device fingerprinting information to track your movements. And its terms of service say quite plainly that all this information is sent to Chinese servers. Thanks, but no thanks.
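To make the hardcoded-key problem concrete, here is a minimal sketch in Python of the anti-pattern NowSecure describes. This is my own illustration, not DeepSeek’s actual code; the key, IV, and function names are all hypothetical.

```python
# A minimal illustration of the anti-pattern: a symmetric cipher whose key
# ships inside the app binary. Requires: pip install pycryptodome
from Crypto.Cipher import DES3
from Crypto.Util.Padding import pad, unpad

HARDCODED_KEY = b"0123456789abcdef01234567"  # 24 bytes, baked into the binary
HARDCODED_IV = b"12345678"                   # 8 bytes, also baked in

def app_encrypt(plaintext: bytes) -> bytes:
    """What a careless app does before sending data over the wire."""
    cipher = DES3.new(HARDCODED_KEY, DES3.MODE_CBC, HARDCODED_IV)
    return cipher.encrypt(pad(plaintext, DES3.block_size))

def attacker_decrypt(ciphertext: bytes) -> bytes:
    """Anyone who unpacks the app recovers the same key and IV, so the
    'encrypted' traffic is readable to any observer on the network."""
    cipher = DES3.new(HARDCODED_KEY, DES3.MODE_CBC, HARDCODED_IV)
    return unpad(cipher.decrypt(ciphertext), DES3.block_size)

if __name__ == "__main__":
    wire_data = app_encrypt(b"device fingerprint + chat history")
    print(attacker_decrypt(wire_data))
```

The point of the sketch: once the key travels with the app, the encryption is ceremony, not protection.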

Why did I pare down my apps last year? For a combination of reasons. First, it seemed like a good time to review all those cute icons and cut the ones that were clogging my home screens. I really wanted to get down to a single screen, but settled for two screens full of apps. Also, I wasn’t comfortable with the level of private detail that the bad apps were sending to their corporate overlords, or to data brokers, or to both.

To make your Great App Cull easier, I suggest a divide-and-conquer approach. I divided my apps into four categories:

Type 1 apps were those that I had major privacy concerns about, such as Facebook’s Messenger, Twitter, and Google Meet and Maps. I am sure there were others that don’t immediately come to mind. You can debate whether the privacy concerns are real or not, but I think most of us would agree that DeepSeek definitely falls into this bucket.

Type 2 were apps so poorly designed that I would be better off using just the web versions, such as the T-Mobile and Instacart apps and several banking apps.

Type 3 were apps that I had downloaded for a specific task, such as attending a conference, or that I used maybe once or twice, such as the Bluesky app or the Ring camera app. These were also poorly designed.

Type 4 were apps that were no longer relevant to my life, such as the one controlling an Ecobee thermostat in a place where I no longer lived, or the batch of VPN apps that I had tested for CNN and no longer used.

I am sure that years from now, DeepSeek’s app will be a case study in how not to write secure mobile apps. That is why many countries and agencies have already banned its use on government-owned devices, and why there is a bill before our Congress to do the same.

Red Cross profile: The Life of a Blood Donor Ambassador Starts with Bob Hergert’s First Donation

[Photo: Bob Hergert and Jason Ramlow holding an award]

Like many volunteers with the American Red Cross, Robert Hergert’s first contact was donating blood at a local blood drive back in 2019. That led to his becoming a Blood Donor Ambassador in Leavenworth, Kansas, which is part of the Greater Kansas City and Northwest Missouri chapter. “It started to grow on me, and I was interested in stepping up to doing more than just donating my blood.” I tell his story, his history with the Red Cross, and other details on the Red Cross blog here.


The new world of hybrid warfare: cutting undersea cables

The song from The Little Mermaid goes, “Under the sea, darling it’s better, down where it’s wetter.” Well, not quite.

This week the hybrid war between Russia and the rest of the world took a new turn, with the seizure of the bulk carrier Vezhen by the Swedish Coast Guard. The details are provided in Sal Mercogliano’s video: it is now the fourth ship suspected of dragging its anchor across the Baltic seabed and cutting an undersea cable, this time one that runs between Latvia and Sweden. Let’s unpack this situation.

There are hundreds of thousands of miles of cables that run across the ocean seabed around the world, divided mostly into those carrying communications and others that move electrical power to remote locations. Laying these cables is a tricky operation, and there are specialized ships that do this. By way of reference, you might enjoy this Fluctus video on how undersea cables are made, laid and repaired.

The videos show how the cables can deteriorate over time as various sea life attaches itself (think a thick layer of barnacles and lots of corrosion) – getting through that junk at underwater depths isn’t easy. Divers have had to develop new tools and repair methods.

Last summer we had a cable cut in front of my home as contractors installed new streetlights. They ended up severing a major AT&T fiber cable that had hundreds of individual strands. While working a few feet below the street was no picnic, it was nothing like the conditions repair crews face on the seabed.

Dropping a ship’s anchor and dragging it along the bottom is much easier. The cables are clearly shown on marine maps – this is a feature and not a bug, because prior to the Russian sabotage events, the marine transportation community wanted mariners to know where the cables ran so that ships would steer clear of them. Here is a more stylized map showing how many cables are laid in the Baltic region.

Sal’s analysis (I feel like we should be on a first-name basis, since I have been a fan of his videos going back to the Baltimore bridge accident last year) concludes that four cable cuts in a few months in nearly the same area is suspicious. An update claims the ship’s anchor broke accidentally. Yup.

As a mariner himself, he shows how anchors on these large ships are controlled and how easy it would be to lower one undetected by the ship’s navigators. He calls this part of a hybrid war, meaning that it combines traditional “kinetic” war fighting (with guns, tanks, and missiles) with technology-based tactics such as drones and the careful targeting of digital infrastructure, such as seabed cables and satellite internet access. The hybrid approach also mixes military targets with civilian ones, such as the communication cables that connect countries.

Ukraine has deployed hybrid techniques in its war with Russia. And I am sure that we could find instances of the US and our allies using similar tactics. What it means is that life under the sea has become the new battleground. Sorry, Sebastian!

MSFT @ 50

Microsoft was founded 50 years ago this coming April. Most of you are somewhat familiar with its origin story, which began with Bill Gates and Paul Allen in a small office in Albuquerque. And thanks to this series in Geekwire, you can read about those years from their perspective. The series has inspired my own POV.

Back in the spring of 1975, I was finishing up college in New York. The only computers I had access to were mainframes. When I got to grad school a few years later, we had a time-sharing system. That meant getting up at the crack of dawn or waiting until the middle of the night to go across campus and sit in front of a monochrome character-mode terminal; the odd hours were necessary because the terminals were so popular during the day. When I got my first post-grad-school job in DC, I used a downtown “remote job entry” storefront (I think it was on K Street, but don’t hold me to that) where I could submit my decks of punched cards and come back the next day to see if my programs had run without errors. (They usually took a few tries, just so you know.)

My first actual PC was an HP 85, running its built-in ROM BASIC, somewhere around 1979. I was using it to build math models for various consulting clients, and the models were built with VisiCalc, the original digital spreadsheet. It had all of 8K of RAM, an amount of memory so small you can’t even buy it in a basic digital watch today.

My first interaction with Microsoft was IBM PC DOS in 1981. It would take several years before I joined PC Week (now known as eWeek) in the mid-1980s, my first breakout job. There I began using Microsoft’s local area network software, called LAN Manager, which it built to run on 3Com’s servers. The LAN Man era accounted for one of my favorite PC Week cover stories: we wrote about how anyone with physical access to a server could take it over with a simple boot floppy. Ah, those were the days!

It was at PC Week that I began to develop relationships with many of the MSFT execs, including Ballmer and Gates, as we went around the country to various events and covered major product launches. It was a heady time for a former anonymous corporate user, now blessed with a huge expense account.

I got sidetracked during this period by OS/2, the failed IBM and Microsoft operating system project, which resulted in a book deal for me (the book remains unpublished) and, for Microsoft, in a new server operating system called Windows NT. NT has had an enduring and somewhat troubling legacy, which I first wrote about in 2003. I still have a soft spot in my keyboard for it.

NT is the OS that keeps on giving, as I recently updated a post for CSOonline about its infamous and enduring NTLM protocol, a favorite of hackers through the ages because its aging challenge-response design lets attackers authenticate without ever cracking a password (think relay and pass-the-hash attacks).

During the 1990s, Microsoft (along with many of us) discovered the web, or as we called it then, The Web. Microsoft stumbled here as well, trying to make the web its own proprietary playground, as I wrote about in 1998. This would be a common theme, one that I called attention to when ActiveX, its proprietary technology for dynamic content, was on its way out in the mid-2000s. Microsoft tried to squash the upstart browser innovator Netscape with its own Internet Explorer. That strategy eventually failed, and now Microsoft’s Edge browser is based on Google’s Chromium.

It was during this decade that Microsoft began to understand the open-source community. Some of this understanding was the result of court judgments (at one point, the company had 130 lawsuits to deal with), and some came from a transformation of its collective engineering mindset. I went to one of its 2007 conferences where it was clear that it still had a love/hate relationship with the internet and viewed many OSS projects as competitors to its own commercial products. You can see how that attitude has changed somewhat on its current splash page, where the company claims to use thousands of OSS projects every day.

It was also in 2008 that Gates retired from full-time work at Microsoft, and when I developed a clever speaking gig giving thanks to him for making my career so interesting. The speech is somewhat tongue-in-cheek: if Microsoft had made better products or gotten on board trends sooner, my writing would have had a more boring arc, not to mention fewer support issues. Remember Bob and Clippy? Windows 8 and ME?

Thanks to all of you for reading my work over the years, and sharing your own MSFT memories along the way.

CSOonline: Python administrator moves to improve software security

The administrators of the Python Package Index (PyPI) have begun an effort to improve the security of the hundreds of thousands of software packages listed there. The effort, which began last year, aims to identify and stop malware-laced packages from proliferating across the open-source community that contributes and consumes Python software.

The effort, called Project Quarantine, is described in a blog post by Mike Fiedler, who is the sole administrator responsible for Python security. The project allows PyPI administrators and a select group of developers to mark a project as potentially harmful and prevent it from being easily installed by users, avoiding further harm.

In my blog post for CSOonline, I describe this effort and how it came about.
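As a sketch of the kind of pre-install due diligence this encourages, the snippet below queries PyPI’s public JSON API for a package’s metadata before you install it. This is my own illustration, not part of PyPI’s or pip’s tooling, and the assumption that a quarantined project shows up with no installable files is mine.

```python
# A due-diligence sketch (not PyPI's own tooling): look up a package on
# PyPI's public JSON API before installing it. Whether quarantine shows up
# as "no installable files" is an assumption here.
import json
import sys
from urllib.error import HTTPError
from urllib.request import urlopen

def check_package(name: str) -> None:
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urlopen(url) as resp:
            meta = json.load(resp)
    except HTTPError as err:
        print(f"{name}: not available on PyPI ({err.code})")
        return
    files = meta.get("urls", [])  # distribution files for the latest release
    print(f"{name} {meta['info']['version']}: {len(files)} release file(s)")
    if not files:
        print("  no installable files -- possibly quarantined or yanked")

if __name__ == "__main__":
    check_package(sys.argv[1] if len(sys.argv) > 1 else "requests")
```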

CSOonline: SOAR buyer’s guide, 11 products compared

The class of products called SOAR, for Security Orchestration, Automation and Response, has undergone a major transformation in the past few years. Capabilities behind each of the four words in its name, once exclusive to SOAR, have bled into other tools. For example, response features can now be found in endpoint detection and response tools. Orchestration is now a joint effort with SIEM tools. Many of these features show up in managed security products that go by other names, such as threat and incident response or cloud security posture management (CSPM). And many SOAR tools are no longer focused solely on security but have expanded to cover the wider context of how an enterprise infrastructure operates.

In this review for CSOonline, I cover some of the major issues for enterprises that are looking for a SOAR tool and briefly mention 11 vendors (out of dozens that offer such products). Be warned that these products are pricey, and finding true price transparency is almost impossible without engaging the vendors’ sales teams.

Book review: The Perfect Home

The novel The Perfect Home by Daniel Kenitz centers on a power couple behind a leading shelter reality TV show, and what could go wrong. If you are a fan of such shows, you might enjoy the novel, which chronicles the decline of the couple’s relationship as the husband plots to get rid of his family in a quest for more fame, more power, and more money on his own. Twin babies are involved, an affair happens, and the wife reunites with her long-estranged father, all while the characters sort out the shifting reality — in this case, their actual lives — rather than what is depicted on screen as they renovate homes around the country while the cameras and scripted witty banter roll. Having been through a divorce from my own cheating spouse, I still found this novel interesting and engaging, and its exploration of the shifting nature of marital trust worthy of the author’s treatment.

How to best disconnect from Twitter

Last month, I suggested that it is time to remove ourselves from Twitter. There are several ways to do this. Below is my process; you can make your own decisions. The TL;DR:

  • I stopped posting more than a year ago, but still kept my account to protect my brand as a placeholder.
  • I downloaded an archive of my tweets in multiple ways, and will tell you why this is necessary.
  • I then deleted all my tweets, using the Windows software from Cyd.social.
  • I also deleted the other digital effluvia of my account, including retweets, likes, and follows.

Before we set out on this journey, I urge you all to read how to back up all of your social data, a post that I wrote many years ago and have tried to keep current. If you haven’t ever done this, now is the time to create and download these archives. This should be part of your regular data backup processes.

Backing up your social network data

That post will direct you to the particular link where you can prepare and then download an archive of each of your accounts. For Twitter, it might take a day or two to gather everything together, but Twitter will send you an email when the archive is ready, and then you have a few days to download it. The other services have somewhat different processes and schedules. You can pick and choose various options and data types to include in your archive: for example, Google has dozens of services that may or may not be meaningful to save periodically.

But, and this is important: the archive of your tweets depends on having a working Twitter account. There is an HTML page that will bring up a summary of your archive, but the tweets and follows and so forth have to exist online. It is only half a solution. I had a small hiccup with my archive that I will get to in a moment.

A better solution is to use one of the dedicated archive/deletion tools, and as I said, I ran Cyd.social, which logs into your Twitter account and then creates a complete offline archive. The hiccup was that Cyd didn’t like the very long file name that Twitter created, so I renamed it and that passed muster. Cyd uses this archive as a starting point to seek out and delete your content history.

There are two versions, as you can see from the screenshot: the free one will archive and delete all of your tweets. The paid version (an annual subscription, initially $36) also allows you to be more selective and keep some of your tweets, and to delete other aspects of your account, such as followers, likes, and DMs.

I upgraded to the premium version so I could delete everything. I liked the design of the software, which tells you in advance what it is about to do to your account. Because Twitter has put rate limits in place to discourage these mass deletion operations, Cyd works around them, sometimes pausing during its housekeeping to stay under the limits.
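For the curious, that pause-and-retry behavior looks something like the following minimal sketch. The names are hypothetical and this is not Cyd’s actual code, just the general shape any bulk-deletion tool has to adopt.

```python
# A generic sketch of the pause-and-retry loop a bulk-deletion tool needs
# when a platform rate-limits it. Hypothetical names; not Cyd's actual code.
import time

class RateLimited(Exception):
    """Raised when the platform answers 'too many requests' (HTTP 429)."""

def delete_all(items, delete_one, max_wait: float = 900.0) -> None:
    wait = 5.0
    for item in items:
        while True:
            try:
                delete_one(item)
                wait = 5.0  # success: reset the backoff
                break
            except RateLimited:
                time.sleep(wait)                # pause during housekeeping
                wait = min(wait * 2, max_wait)  # exponential backoff, capped
```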

One content type not covered by Cyd is list management. I have quite a few lists, and ideally I would like to convert them to follows on LinkedIn before I delete them, but I haven’t found a tool to do that.

Another thing that I noticed browsing my archive is how few of my words of wisdom were retweeted or liked. Almost all of them had no engagement whatsoever. You would think that, after all those years of using Twitter and various analysis tools, I would have noticed this before now. Sigh.
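If you want to quantify the silence in your own archive, a short script can tally engagement from the download. This is a minimal sketch that assumes the standard archive layout, where data/tweets.js wraps a JSON array in a JavaScript assignment; adjust the directory name to match your download.

```python
# A minimal sketch: tally engagement from a downloaded Twitter/X archive.
# Assumes the standard layout, where data/tweets.js holds the tweet list.
import json
from pathlib import Path

def load_tweets(archive_dir: str) -> list:
    raw = Path(archive_dir, "data", "tweets.js").read_text(encoding="utf-8")
    # Strip the "window.YTD.tweets.part0 = " prefix to leave plain JSON.
    return json.loads(raw[raw.index("["):])

def engagement_report(archive_dir: str) -> None:
    tweets = load_tweets(archive_dir)
    silent = sum(
        1 for t in tweets
        if int(t["tweet"]["retweet_count"]) == 0
        and int(t["tweet"]["favorite_count"]) == 0
    )
    print(f"{len(tweets)} tweets, {silent} with no retweets or likes")

if __name__ == "__main__":
    engagement_report("twitter-archive")  # hypothetical folder name
```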

I came across a free analysis tool from Cleve.AI that summarizes my LinkedIn activity. You can see an excerpt from my report, with a nice summary of my words of wisdom, below.

Best wishes and happy new year to you!