Time to pare down your mobile app portfolio

When iPhones and Android devices were first introduced, I recall the excitement. We would download apps willy-nilly, and many of them we would use maybe twice before souring on their bad or frustrating UX. The excitement was everywhere: back in 2009, I attended the final presentations of a Washington University computer science class on how to develop new iOS apps. The class is still taught today, and while 15 years may seem like a lifetime, we are still dealing with basic issues of app security and data privacy. With all the buzz surrounding DeepSeek this week comes the inevitable analysis by NowSecure of the major security and privacy flaws in its iOS app.

Ruh-roh. Danger, Will Robinson! (Insert your favorite meme here.)


So much for app excitement. I have come full circle: when I got my latest iPhone last year, I spent some time doing the opposite of those early downloading sprees, paring down my apps to the barest minimum.

It is time to take another, closer look at your app portfolio, and I suggest you spend part of your weekend doing some careful home screen editing. Now, I wasn’t one of the many millions (or so it seems) of folks who downloaded DeepSeek, or who freaked out when TikTok went down for a few hours and rushed to download Another Chinese Social Media App in its place.

But still. We should use the privacy abuses found in DeepSeek’s app as a teachable moment.

Your phone is the gateway to your life, to your electronic soul. It is also a major security sinkhole. It has become a major gateway for phishing attacks, because often we are scrolling around and not paying attention to what we are doing, especially when we get an “emergency” text or email.

But let’s talk about our apps. If you read the entire NowSecure report, you will see that you should run away from the DeepSeek app. It sends your data across the intertubes unencrypted. When it does use encryption, it uses older methods that are easily compromised, and its keys are hardcoded in the app itself, making your data easy to read. It also hoovers up enough device fingerprinting info to track your movements. And its terms of service say quite plainly that all this information is sent to Chinese servers. Thanks, but no thanks.
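To see why hardcoded keys are such a problem, here is a minimal, hypothetical sketch. The toy XOR cipher below stands in for any symmetric algorithm and is emphatically not what the DeepSeek app (or any real app) uses; the point is that once the key ships inside the binary, anyone who extracts it can decrypt every user’s traffic.

```python
# Toy illustration of why hardcoding a key defeats encryption.
# The XOR "cipher" stands in for any symmetric algorithm; it is
# NOT the scheme any real app uses.

HARDCODED_KEY = b"app-secret-key"  # shipped identically in every copy of the app

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Repeating-key XOR; encryption and decryption are the same operation."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The app "encrypts" user data before sending it over the network...
ciphertext = xor_cipher(b"user=alice;device=iPhone15", HARDCODED_KEY)

# ...but an attacker who pulls the key out of the binary (e.g. with a
# strings/disassembly pass) recovers the plaintext of anyone's traffic.
print(xor_cipher(ciphertext, HARDCODED_KEY))  # prints b'user=alice;device=iPhone15'
```

The same logic applies regardless of the cipher’s strength: a key that every copy of the app contains is, for practical purposes, public.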

Why did I initially pare down my apps last year? I did it for a combination of reasons. First, it seemed like a good time to review all those cute icons and cut the ones that were clogging my home screens. I really wanted to get down to a single screen, but settled for two screens full of apps. Also, I wasn’t comfortable with the level of private detail that the bad apps were sending to their corporate overlords, or to data brokers, or to both.

To make it easier for your Great App Cull, I suggest the divide and conquer approach. I divided my apps into four categories:

Type 1 apps were those I had major privacy concerns about, such as Facebook’s Messenger, Twitter, and Google Meet and Maps. I am sure there were others that don’t immediately come to mind. You can debate whether the privacy concerns are real or not, but I think most of us would agree that DeepSeek definitely falls into this bucket.

Type 2 apps were so poorly designed that I was better off using just the web versions, such as the T-Mobile and Instacart apps and several banking apps.

Type 3 apps were ones I had downloaded for some specific task, such as attending a conference, or had used maybe once or twice, such as the Bluesky and Ring camera apps. These were also poorly designed.

Type 4 apps were no longer relevant to my life, such as the one controlling an Ecobee thermostat in a place where I no longer lived, or the batch of VPN apps I had been testing for CNN and no longer used.

I am sure that years from now DeepSeek’s app will be a case study in how not to write secure mobile apps. This is why many countries and agencies have already banned its use on government-owned devices, and why there is a bill before Congress to do the same.

CSOonline: Python administrator moves to improve software security

The administrators of the Python Package Index (PyPI) have begun an effort to improve the security of the hundreds of thousands of software packages listed there. The effort, which began early last year, aims to identify and stop malware-laced packages from proliferating across the open-source community that contributes and consumes Python software.

The effort, called Project Quarantine, is described in a blog post by Mike Fiedler, the sole administrator responsible for PyPI’s security. The project allows PyPI administrators and a select group of developers to mark a project as potentially harmful and prevent it from being easily installed by users, avoiding further harm.
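As a rough illustration of the concept (this is a hypothetical sketch of the workflow, not PyPI’s actual implementation), a quarantine state sits between “published” and “removed”: the project stays on the index for review, but installers can no longer fetch it, and it can be restored if it is cleared.

```python
# Hypothetical sketch of an index-side quarantine state (illustration only;
# not PyPI's actual code or API).
from enum import Enum

class State(Enum):
    ACTIVE = "active"
    QUARANTINED = "quarantined"

class ToyIndex:
    """A toy package index that can quarantine suspect projects."""

    def __init__(self):
        self._projects = {}

    def publish(self, name: str):
        self._projects[name] = State.ACTIVE

    def quarantine(self, name: str):
        # Admins flag a suspect project; it is hidden from installers
        # but not deleted, so it can be restored if the report is wrong.
        self._projects[name] = State.QUARANTINED

    def restore(self, name: str):
        self._projects[name] = State.ACTIVE

    def download(self, name: str) -> str:
        if self._projects.get(name) is State.QUARANTINED:
            raise PermissionError(f"{name} is quarantined pending review")
        return f"{name}-1.0.tar.gz"

index = ToyIndex()
index.publish("shady-package")
index.quarantine("shady-package")
# index.download("shady-package") would now raise PermissionError
```

The key design choice is reversibility: unlike outright deletion, quarantine limits harm while a human review is still in progress.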

In my blog post for CSOonline, I describe this effort and how it came about.

How IT can learn from Target and Walmart

With all the holiday shopping happening around now, you have probably visited the websites of Target and Walmart, and maybe that prime Seattle company too. What you probably haven’t visited are two subsidiary sites of the first two companies that aren’t selling anything, but are packed with useful knowledge for IT operations staff and application developers. This comes as a surprise because:

  • both contain a surprising amount of solid IT information that, while focused on the retail sector, has broader implications for a number of other business contexts
  • they deal with many issues at the forefront of innovation (such as open source and AI), not something normally associated with either company
  • both sites are a curious mixture of open source tool walkthroughs, management insights, and software architecture and design
  • many of the posts on both sites are very technical deep dives into how the companies actually use the software tools, again not something you would ordinarily expect from these two sources

Let’s take a closer look. One post on Target’s site is by Adam Hollenbeck, an engineering manager, writing about Target’s IT culture: “If creating an inclusive environment as a leader is easy for you, please share your magic with others. The perfect environment is a challenge to create but should always be our north star as leaders.” Mark Cuban often opines on this subject. Another post goes into detail about a file analysis tool that was developed internally and released as open source. It has a user-friendly interface specifically designed to visualize files, their characteristics, and how they interconnect.

Walmart’s Global Tech blog goes very heavy on its AI usage. “AI is eliminating silos that developed over time as our dev teams grew,” Andrew Budd wrote in one post, and GenAI chatbot solutions have been rolled out to optimize Walmart’s Developer Experience, a central tool repository. There are also posts about other AI and open source projects, along with a regular cyber report on recent developments in that arena. This is the sort of thing you might find on news sites such as FOSSForce.com or TheNewStack.

Another Walmart article, posted on LinkedIn, addresses how AI is changing the online shopping experience this season with more personalized suggestions and predictive content (does this sound familiar from another online site?), and mentions how all Sam’s Club stores have the “just walk out” technology first pioneered by Amazon. (I wrote about my 2021 experience here.)

One other point: both of these tech sub-sites are not easily found. Neither tech.target.com (not to be confused with techtarget.com) nor tech.walmart.com has a link from its company’s home page. “I’m not sure these pages should be linked from the home pages,” said Danielle Cooley, a UX expert whom I have known for decades. “As cool as this stuff is for people like you and me and your readers, it’s not going to rise to home page level importance for a company with millions of ecommerce visitors per day.” But she cautions that finding these sites could be an issue. “I did a quick google of ‘programming jobs target’ and ‘cybersecurity jobs target’ and still didn’t get a direct link to tech.target.com, so they aren’t aiming at job openings. But also, the person interested in cybersecurity will not also be the person interested in an AI shopping assistant, for example.” Even if visitors do land on these sites, they might still go away frustrated, because the content is pretty broad.

You’ll notice that I haven’t said much about Amazon here. It really isn’t fair to compare the two tech sites to what Amazon is doing, given Amazon’s depth in all sorts of tech knowledge. And to be honest, in my extended family, we tend to shop more at Amazon than at either Target or Walmart. But it is nice to know that both Target and Walmart are putting this content out there. I welcome your thoughts about their efforts.

CSOonline: Top 5 security mistakes software developers make

Creating and enforcing the best security practices for application development teams isn’t easy. Software developers don’t necessarily write their code with security in mind, and as the appdev landscape becomes more complex, securing apps becomes more of a challenge, spanning cloud computing, containers, and API connections. It is a big problem: security flaws were found in 80% of the applications scanned by Veracode in a recent analysis.

As attacks continue to plague cybersecurity leaders, I compiled a list of five common mistakes software developers make and how they can be prevented, in a piece for CSOonline.

The Cloud-Ready Mainframe: Extending Your Data’s Reach and Impact

(This post is sponsored by VirtualZ Computing)

Some of the largest enterprises are finding new uses for their mainframes. Instead of competing with cloud and distributed computing, the mainframe has become a complementary asset that adds new productivity and cost-effective scale to existing data and applications.

While the cloud does quite well at elastically scaling up resources as application and data demands increase, the mainframe is purpose-built for the largest-scale digital applications. More importantly, it has kept pace as these demands have mushroomed over its 60-year reign, which is why so many large enterprises continue to use it. Having mainframes as part of a distributed enterprise application portfolio can be a significant and savvy use case, and a reason for their increasing future role and importance.

Estimates suggest that there are about 10,000 mainframes in use today, which may not seem like a lot, except that they can be found across the board in more than two-thirds of Fortune 500 companies. In the past, they used proprietary protocols such as Systems Network Architecture, ran applications written in now-obsolete languages such as COBOL, and ran on custom CPU hardware. Those days are behind us: the latest mainframes run Linux and TCP/IP across hundreds of multi-core microprocessors.

But even speaking cloud-friendly Linux and TCP/IP doesn’t remove two main problems for mainframe-based data. First, many mainframe COBOL apps are islands unto themselves, isolated from the end-user Java experience, coding pipelines, and programming tools. Breaking this isolation usually means an expensive effort to convert and audit the code.

A second issue has to do with data lakes and data warehouses. These have become popular ways for businesses to spot trends quickly and adjust IT solutions as their customers’ data needs evolve. But the underlying applications typically require near real-time access to existing mainframe data, such as financial transactions, sales and inventory levels, or airline reservations. At the core of any lake or warehouse is a series of “extract, transform and load” (ETL) operations that move data back and forth between the mainframe and the cloud. These efforts only capture data at a particular moment in time, and they require custom programming to accomplish.
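The batch pattern described above can be sketched in a few lines (illustrative only; the record layout and field names are invented): each run copies a point-in-time snapshot, so anything that changes on the mainframe afterward is invisible until the next scheduled job.

```python
# Minimal sketch of a batch ETL job (illustrative; invented field names).

def extract(source_rows):
    """Pull a point-in-time snapshot of mainframe records."""
    return list(source_rows)

def transform(rows):
    """Reshape records for the warehouse schema, e.g. cents -> dollars."""
    return [{"id": r["id"], "amount_usd": r["amount_cents"] / 100} for r in rows]

def load(warehouse, rows):
    """Append the transformed rows to the warehouse table."""
    warehouse.extend(rows)

mainframe = [{"id": 1, "amount_cents": 1999}]
warehouse = []
load(warehouse, transform(extract(mainframe)))

# A transaction arriving after the run is not in the warehouse until the
# next scheduled batch job re-extracts everything.
mainframe.append({"id": 2, "amount_cents": 500})
print(len(warehouse))  # prints 1
```

This staleness between runs is exactly the gap that real-time access approaches aim to close.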

What was needed was an additional step to make mainframes easier for IT managers to integrate with other cloud and distributed computing resources, and that means a new set of software tools. The first step came with initiatives such as IBM’s z/OS Connect, which enabled distributed applications to access mainframe data. But it continued the mindset of a custom programming effort and didn’t really provide direct access for distributed applications.

To fully realize the vision of mainframe data as an equal cloud node required a major makeover, thanks to companies such as VirtualZ Computing. They latched on to the OpenAPI effort, which grew up in the cloud and distributed world. Using this protocol, they created connectors that make it easier for vendors to access real-time data and integrate with a variety of distributed data products, such as MuleSoft, Tableau, TIBCO, Dell Boomi, Microsoft Power BI, Snowflake and Salesforce. Instead of complex, single-use data transformations, VirtualZ enables real-time read and write access to business applications. This means the mainframe can now become a full-fledged and efficient cloud computer.
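In practice, an OpenAPI-described connector boils down to ordinary REST calls against the mainframe data. Here is a hedged sketch (the host, path, and record ID are invented for illustration; a real deployment would follow the vendor’s published spec) of building a single live read, which is what replaces a bulk extract-and-copy job:

```python
# Hypothetical example: one live read through an OpenAPI-described REST
# connector. The endpoint names below are invented for illustration.
import urllib.request

def build_read_request(base_url: str, record_id: str) -> urllib.request.Request:
    """Build (but don't send) a GET for one record, as an OpenAPI spec
    would describe it. One live read replaces copying the whole dataset."""
    return urllib.request.Request(
        url=f"{base_url}/records/{record_id}",
        headers={"Accept": "application/json"},
        method="GET",
    )

req = build_read_request("https://mainframe-connector.example.com/api/v1", "INV-42")
print(req.full_url)      # the record is fetched at the moment it's needed
print(req.get_method())  # prints GET
```

Because the data is read in place, there is no transformed copy to fall out of date between batch runs.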

VirtualZ CEO Jeanne Glass says, “Because data stays securely and safely on the mainframe, it is a single source of truth for the customer and still leverages existing mainframe security protocols.” There isn’t any need to convert COBOL code, and no need to do any cumbersome data transformations and extractions.

The net effect is an overall cost reduction, since an enterprise isn’t paying for expensive high-resource cloud instances. It makes the business operation more agile, since data stays in one place and is available the moment it is needed by a particular application. This extends the effective life of a mainframe without costly data or process conversions, while reducing risk and complexity. It also helps solve complex data access and report migration challenges efficiently and at scale, which is key for organizations transitioning to hybrid cloud architectures. And the ultimate result is that one of these hybrid architectures includes the mainframe itself.

CSOonline: Third-party software supply chain threats continue to plague CISOs

The latest software library compromise, of an obscure but popular file compression utility called XZ Utils, shows how critical these third-party components can be to keeping enterprises safe and secure. The supply chain issue is now forever baked into the way modern software is written and revised. Apps are refined daily or even hourly with new code, which makes it more of a challenge for security software to identify and fix any coding errors quickly. It means older, more manual error-checking methods are doomed to fall behind and let vulnerabilities slip through.

These library compromises represent a new front for security managers, especially since they combine three separate trends: a rise in third-party supply-chain attacks, malware hidden inside the complexity of open-source software tools, and third-party libraries as another potential exploit vector for generative AI software models and tools. I unpack these issues in my latest post for CSOonline here.

Using Fortnite for actual warfare

What do B-52s and a Chinese soccer stadium have in common? Both are using Epic Games’ Unreal Engine to create digital twins to help with their designs. Now, you might think a software gaming engine would be a stretch for retrofitting the real engines on a 60-plus-year-old bomber, but that is exactly what Boeing is doing. The 3D visualization environment makes it easier to design the upgrades and provide faster feedback to the next generation of military pilots.

This being the military, the notion of “faster” is a matter of degree. The goal is for Boeing to replace the eight Pratt and Whitney engines on each of 60-some planes, as well as update cockpit controls, displays and other avionics. And the target date? Sometime in 2037. So check back with me then.

Speaking of schedules, let’s look at what is happening with that Xi’an stadium. I wrote about the soccer stadium back in July 2022, and how the architects were able to create a digital twin to visualize seating sight lines and how various building elements would be constructed. It is still under construction, but you can see a fantastic building taking shape in this video. However slowly the thing is being built, it will probably be finished before 2037, or even before 2027.

Usually, when we talk about building digital twins, we mean taking a company’s data and making it accessible to all sorts of analytical tools. Think of companies like Snowflake, for example, and what they do. But the gaming engines offer another way to duplicate all the various systems digitally, and then test different configurations by literally putting a real bomber pilot in a virtual cockpit to see if the controls are in the right place, or whether the fancy new hardware and software systems can provide the right information to a pilot. If you look at the cockpit of another Boeing plane, the iconic and now mostly retired 747, you see a lot of analog gauges and physical levers and switches.

Now look at the 777 cockpit — see the difference? Everything is on a screen.


It is ironic in a way: we are using video gaming software to reproduce the real world by placing more screens in front of the people depicted in the games. A true Ender’s Game scenario, if you will.

SiliconANGLE: Security threats of AI large language models are mounting, spurring efforts to fix them

A new report on the security of artificial intelligence large language models, including OpenAI LP’s ChatGPT, shows a series of poor application development decisions that create weaknesses in protecting enterprise data privacy and security. The report is just one of many recent examples of mounting evidence of security problems with LLMs, demonstrating the difficulty of mitigating these threats. I take a deeper dive into a few different sources and suggest ways to mitigate the threats of these tools in my post for SiliconANGLE here.

 

SiliconANGLE: Google’s Web Environment Integrity project raises a lot of concerns

Earlier last month, four engineers from Google LLC posted a new open-source project on GitHub called “Web Environment Integrity.” The WEI project ignited all sorts of criticism about privacy implications, along with concerns that Google wasn’t being upfront about its real purpose.

Remember the problems with web cookies? WEI takes them to a new level. I tell you why in my latest piece here.

 

SiliconANGLE: Apps under attack: New federal report suggests ways to improve software code pipeline security

The National Security Agency and the Cybersecurity and Infrastructure Security Agency late last month issued an advisory memo to help improve defenses in application development software supply chains — and there’s a lot of room for improvement.

Called Defending Continuous Integration/Continuous Delivery (CI/CD) Pipelines, the joint memo describes the various deployment risks and ways attackers can leverage these pipelines. I describe their recommendations and the issues with defending these pipelines in my latest blog for SiliconANGLE.