How IT can learn from Target and Walmart

With all the holiday shopping happening around now, you have probably visited the websites of Target and Walmart, and maybe that prime Seattle company too. What you probably haven’t visited are two subsidiary sites of the first two companies that aren’t selling anything, but are packed with useful knowledge that can help IT operations and application developers. This comes as a surprise because:

  • both contain a surprising amount of solid IT information that, while focused on the retail sector, has broader implications for a number of other business contexts
  • they deal with many issues at the forefront of innovation (such as open source and AI), not something normally associated with either company
  • both sites are a curious mixture of open source tool walkthroughs, management insights, and software architecture and design
  • many of the posts are very technical deep dives into how these companies actually use their software tools, again not something you would ordinarily expect to find from these two sources

Let’s take a closer look. One post on Target’s site is by Adam Hollenbeck, an engineering manager, who wrote about the company’s IT culture: “If creating an inclusive environment as a leader is easy for you, please share your magic with others. The perfect environment is a challenge to create but should always be our north star as leaders.” Mark Cuban often opines on this subject. Another post details a file analysis tool that was developed internally and released as open source; it has a user-friendly interface specifically designed to visualize files, their characteristics, and how they interconnect.

Walmart’s Global Tech blog leans heavily into the company’s AI usage. “AI is eliminating silos that developed over time as our dev teams grew,” Andrew Budd wrote in one post, and GenAI chatbot solutions have been rolled out to optimize Walmart’s Developer Experience, a central tool repository. There are also posts about other AI and open source projects, along with a regular cyber report on recent developments in that arena. This is the sort of thing you might find on news sites such as FOSSForce.com or TheNewStack.

Another Walmart article, posted on LinkedIn, addresses how AI is changing the online shopping experience this season with more personalized suggestions and predictive content (does this sound familiar from another online site?), and mentions that all Sam’s Club stores have the “just walk out” technology first pioneered by Amazon. (I wrote about my 2021 experience here.)

One other point: these tech sub-sites are not easy to find. Neither tech.target.com (not to be confused with techtarget.com) nor tech.walmart.com is linked from its company’s home page. “I’m not sure these pages should be linked from the home pages,” said Danielle Cooley, a UX expert whom I have known for decades. “As cool as this stuff is for people like you and me and your readers, it’s not going to rise to home page level importance for a company with millions of ecommerce visitors per day.” But she cautions that finding these sites could be an issue. “I did a quick google of ‘programming jobs target’ and ‘cybersecurity jobs target’ and still didn’t get a direct link to tech.target.com, so they aren’t aiming at job openings. But also, the person interested in cybersecurity will not also be the person interested in an AI shopping assistant, for example.” Given the specificity of what visitors are looking for, even if they land on these sites they might still go away frustrated, because the content is pretty broad.

You’ll notice that I haven’t said much about Amazon here. It really isn’t fair to compare these two tech sites to what Amazon is doing, given Amazon’s depth in all sorts of tech knowledge. And to be honest, in my extended family, we tend to shop more at Amazon than at either Target or Walmart. But it is nice to know that both Target and Walmart are putting this content out there. I welcome your own thoughts about their efforts.

CSOonline: Top 5 security mistakes software developers make

Creating and enforcing the best security practices for application development teams isn’t easy. Software developers don’t necessarily write their code with security in mind, and as the appdev landscape becomes more complex, securing apps that span cloud computing, containers, and API connections becomes more of a challenge. It is a big problem: security flaws were found in 80% of the applications scanned by Veracode in a recent analysis.

As attacks continue to plague cybersecurity leaders, I compiled a list of five common mistakes software developers make and how they can be prevented in a piece for CSOonline.

The Cloud-Ready Mainframe: Extending Your Data’s Reach and Impact

(This post is sponsored by VirtualZ Computing)

Some of the largest enterprises are finding new uses for their mainframes. And instead of competing with cloud and distributed computing, the mainframe has become a complementary asset that adds new productivity and a level of cost-effective scale to existing data and applications. 

While the cloud does quite well at elastically scaling up resources as application and data demands increase, mainframes are purpose-built for the largest-scale digital applications. More importantly, they have kept pace as these demands have mushroomed over their 60-year reign, which is why so many large enterprises continue to use them. Having them as part of a distributed enterprise application portfolio can be a significant and savvy use case, and a reason for increasing their future role and importance.

Estimates suggest that there are about 10,000 mainframes in use today, which may not seem like a lot, except that they can be found across the board in more than two-thirds of Fortune 500 companies. In the past, they used proprietary protocols such as Systems Network Architecture, ran applications written in now-obsolete languages such as COBOL, and ran on custom CPU hardware. Those days are behind us: the latest mainframes run Linux and TCP/IP across hundreds of multi-core microprocessors.

But even speaking cloud-friendly Linux and TCP/IP doesn’t remove two main problems for mainframe-based data. First, many mainframe COBOL apps are islands unto themselves, isolated from the end-user Java experience and from modern coding pipelines and programming tools. Breaking this isolation usually means an expensive effort to convert and audit the code.

A second issue has to do with data lakes and data warehouses. These have become popular ways for businesses to spot trends quickly and adjust IT solutions as their customers’ data needs evolve. But the underlying applications typically require near real-time access to existing mainframe data, such as financial transactions, sales and inventory levels, or airline reservations. At the core of any lake or warehouse is a series of “extract, transform and load” (ETL) operations that move data back and forth between the mainframe and the cloud. These jobs capture data only at a particular moment in time, and they require custom programming efforts to accomplish.
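To make that pattern concrete, here is a minimal sketch of a nightly ETL job, assuming a flat-file extract produced on the mainframe and a local SQLite database standing in for a cloud warehouse. The file name, field layout and table are hypothetical illustrations, not any particular vendor’s tooling.

```python
# Minimal sketch of the nightly "extract, transform, load" pattern.
# The extract file, its columns, and the SQLite "warehouse" are hypothetical
# stand-ins for a real mainframe unload and a real cloud warehouse.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    """Read last night's flat-file extract produced on the mainframe."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Normalize field types so the warehouse can query the data."""
    return [(r["order_id"], r["sku"], int(r["qty"]), float(r["amount"]))
            for r in rows]

def load(rows: list[tuple], db_path: str) -> None:
    """Append the transformed rows to the analytics store."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales"
                "(order_id TEXT, sku TEXT, qty INTEGER, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    # Runs once per night, so the warehouse only ever reflects yesterday's data.
    load(transform(extract("nightly_mainframe_extract.csv")), "warehouse.db")
```

The limitation shows up in the flow itself: the warehouse only ever sees the data as of the last time the job ran, and every change to the mainframe record layout means reworking the extract and transform code.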

What was needed was an additional step to make mainframes easier for IT managers to integrate with other cloud and distributed computing resources, and that means a new set of software tools. The first step came thanks to initiatives such as IBM’s z/OS Connect, which enabled distributed applications to access mainframe data. But it continued the mindset of a custom programming effort and didn’t really give distributed applications direct access to the data.

To fully realize the vision of mainframe data as an equal cloud node required a major makeover, thanks to companies such as VirtualZ Computing. They latched on to the OpenAPI effort, which was previously part of the cloud and distributed world. Using this specification, they created connectors that make it easier for vendors to access real-time data and integrate with a variety of distributed data products, such as MuleSoft, Tableau, TIBCO, Dell Boomi, Microsoft Power BI, Snowflake and Salesforce. Instead of complex, single-use data transformations, VirtualZ enables real-time read and write access to business applications. This means the mainframe can now become a full-fledged and efficient cloud node.
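For contrast, here is a rough sketch of what that real-time pattern looks like from a client’s point of view, assuming a generic REST endpoint described by an OpenAPI document. The URL, dataset and field names are hypothetical; VirtualZ’s actual connector interfaces are not documented here and will differ.

```python
# Sketch of on-demand, real-time access to mainframe-resident data over HTTPS.
# The endpoint, path and field names below are hypothetical illustrations.
import json
import urllib.request

BASE_URL = "https://mainframe-connector.example.com/api/v1"  # hypothetical endpoint

def read_inventory(sku: str) -> dict:
    """Fetch the current inventory record for one SKU from the system of record."""
    with urllib.request.urlopen(f"{BASE_URL}/inventory/{sku}") as resp:
        return json.load(resp)

if __name__ == "__main__":
    record = read_inventory("SKU-12345")
    print(record["on_hand"], record["as_of"])  # current values, not last night's snapshot
```

Because the client reads (and can write) the record where it lives, there is no overnight copy to go stale, which is the crux of the approach described above.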

VirtualZ CEO Jeanne Glass says, “Because data stays securely and safely on the mainframe, it is a single source of truth for the customer and still leverages existing mainframe security protocols.” There isn’t any need to convert COBOL code, and no need to do any cumbersome data transformations and extractions.

The net effect is an overall cost reduction, since an enterprise isn’t paying for expensive, high-resource cloud instances. It also makes the business more agile, since data stays in one place and is available the moment it is needed by a particular application. These uses extend the effective life of a mainframe without costly data or process conversions, while reducing risk and complexity. They also help solve complex data access and report migration challenges efficiently and at scale, which is key for organizations transitioning to hybrid cloud architectures. The ultimate result is a hybrid architecture that includes the mainframe itself.

CSOonline: Third-party software supply chain threats continue to plague CISOs

The latest software library compromise, of an obscure but popular file compression utility called XZ Utils, shows how critical these third-party components can be in keeping enterprises safe and secure. The supply chain issue is now forever baked into the way modern software is written and revised. Apps are refined daily or even hourly with new code, which makes it more of a challenge for security software to identify and fix any coding errors quickly. It means older, more manual error-checking methods are doomed to fall behind and let vulnerabilities slip through.

These library compromises represent a new front for security managers, especially since they combine three separate trends: a rise in third-party supply-chain attacks, hiding malware inside the complexity of open-source software tools, and using third-party libraries as another potential exploit vector for generative AI software models and tools. I unpack these issues in my latest post for CSOonline here.

Using Fortnite for actual warfare

What do B-52s and a Chinese soccer stadium have in common? Both are using Epic Games’ Unreal Engine to create digital twins to help with their designs. Now, you might think a software gaming engine would be a stretch for retrofitting the real engines on a 60-plus-year-old bomber, but that is exactly what Boeing is doing. The 3D visualization environment makes it easier to design the upgrades and provide faster feedback to meet the needs of the next generation of military pilots.

This being the military, the notion of “faster” is a matter of degree. The goal is for Boeing to replace the eight Pratt and Whitney engines on each of 60-some planes, as well as update cockpit controls, displays and other avionics. And the target date? Sometime in 2037. So check back with me then.

Speaking of schedules, let’s look at what is happening with that Xi’an stadium. I wrote about the soccer stadium back in July 2022 and how the architects were able to create a digital twin of the stadium to visualize seating sight lines and how various building elements would be constructed. It is still under construction, but you can see a fantastic building taking shape in this video. However slowly the thing is being built, it will probably be finished before 2037, or even before 2027.

Usually, when we talk about building digital twins, we mean taking a company’s data and making it accessible to all sorts of analytical tools. Think of companies like Snowflake, for example, and what they do. But the gaming engines offer another way to duplicate all the various systems digitally, and then test different configurations by literally putting a real bomber pilot in a virtual cockpit to see if the controls are in the right place, or whether the fancy new hardware and software systems provide the right information to the pilot. If you look at the cockpit of another Boeing plane, the iconic 747 (now mostly retired), you see a lot of analog gauges and physical levers and switches.

Now look at the 777 cockpit — see the difference? Everything is on a screen.


It is ironic in a way: we are using video gaming software to reproduce the real world by placing more screens in front of the people who are depicted in the games. A true Ender’s Game scenario, if you will.

SiliconANGLE: Security threats of AI large language models are mounting, spurring efforts to fix them

A new report on the security of artificial intelligence large language models, including OpenAI LP’s ChatGPT, reveals a series of poor application development decisions that create weaknesses in protecting enterprise data privacy and security. The report is just one of many recent examples of mounting evidence of security problems with LLMs, demonstrating the difficulty of mitigating these threats. I take a deeper dive into a few different sources and suggest ways to mitigate the threats of these tools in my post for SiliconANGLE here.


SiliconANGLE: Google’s Web Environment Integrity project raises a lot of concerns

Last month, four engineers from Google LLC posted a new open-source project on GitHub called “Web Environment Integrity.” The WEI project ignited all sorts of criticism about its privacy implications and concerns that Google wasn’t directly addressing its real purpose.

Remember the problems with web cookies? WEI takes them to a new level. I tell you why in my latest piece here.


SiliconANGLE: Apps under attack: New federal report suggests ways to improve software code pipeline security

The National Security Agency and the Cybersecurity and Infrastructure Security Agency late last month issued an advisory memo to help improve defenses in application development software supply chains — and there’s a lot of room for improvement.

Called Defending Continuous Integration/Continuous Delivery (CI/CD) Pipelines, the joint memo describes the various deployment risks and ways attackers can leverage these pipelines. I describe their recommendations and the issues with defending these pipelines in my latest blog for SiliconANGLE.

SiliconANGLE: Databases then and now: the rise of the digital twin

When I first started in IT, back in the Mainframe Dark Ages, we had hulking big databases that ran on IBM’s Customer Information Control System and were written in COBOL. These mainframes ran on a complex collection of hardware and operating systems that was owned lock, stock, and bus-and-tag barrel by IBM. The average age of the code was measured in decades, and code changes were measured in months. The databases contained millions of transactions, and the data was always out of date because these were batch systems: new data was uploaded only overnight.

Contrast that to today’s typical database setup. Data is current to the second, code is changed hourly, and the nature of what constitutes a transaction has changed significantly to something that is now called a “digital twin,” which I explain in my latest post for SiliconANGLE here.

Code is written in dozens of higher-level languages with odd names you may never have heard of, and it runs on a combination of cloud and on-premises equipment that uses loads of microprocessors and open source products purchased from hundreds of suppliers.

It really is remarkable that these changes have all happened within the span of a little more than 35 years. You can read more in my post.