The revelations last week that Chinese hackers had breached a number of U.S. government email accounts indicate the problem is a lot worse than initially thought, according to new research published today by Wiz Inc. Indeed, this hack could turn out to be as damaging and as far-reaching as the SolarWinds supply chain compromise of 2020.
In my post for SiliconANGLE, I summarize what Wiz learned about the attack, what you have to do to scan and fix any potential problems, and why people who choose “login with Microsoft” are playing with fire.
I posted two stories on SiliconANGLE about a raft of news on new security services from Google Cloud and similar news from AWS. Both show we are at a watershed: AWS is making architectural changes and adding new depth with programming languages such as Cedar, and Google is finally building some solid tools into its Chronicle platform, which has been available for four or so years now. Both are also paying attention to LLMs and generative AI methods to provide threat intelligence.
Both vendors are trying to consolidate their services with their channel partners large and small.
Yesterday Google announced that they will completely eliminate third-party browser cookies. Calling it a move toward a more privacy-first web, as their director of product management claimed in the post, is a bit of a stretch. Yes, they will phase out these tracking cookies in their Chrome browser. But they will still track what you do on your mobile phone, especially an Android phone, and track what you do on their own websites, including YouTube and the main search page. And they will still target the ads you see based on these activities.
The announcement was expected: last year they announced their plan to de-cookiefy their browser. They basically had to — Safari and Firefox have blocked these cookies for years, so it was high time Google got on board this train. They have come up with a variety of technologies and tools that sound good at first blush, but I am not sure that these replacements are better, especially for preserving privacy. One of them is called the Privacy Sandbox. Now, sandboxes have certain implications, especially for security researchers. The goal is to limit who can view what is going on inside the sandbox, and more importantly, who can’t. It seems that smaller advertisers will have to find some other place to play, but the big guys will still have the means to figure out who you are and more importantly, what you are interested in, to target their advertising. Vox’s Recode says that “Google will still technically deliver targeted ads to you, but it will do so in a more anonymous and less creepy way.”
Firefox has a better idea: to limit the reach of cookies to just the website that places them on your hard drive. They call it Total Cookie Protection and you can follow the links on their blog to understand more of the details. It does seem to eliminate web tracking cookies, but we’ll see as they roll it out across their browsers.
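To make the idea concrete, here is a toy sketch in Python of cookie partitioning, the concept behind Firefox's Total Cookie Protection. This is not Firefox's actual implementation; the class and method names are invented for illustration. A traditional cookie jar keys cookies only by the site that set them, so a tracker embedded on many sites sees one shared jar. A partitioned jar keys them by (top-level site, setting site), so the tracker gets a separate, isolated jar for each site you visit.

```python
class PartitionedCookieJar:
    """Toy model: cookies are isolated per top-level site (illustrative only)."""

    def __init__(self):
        # (top_level_site, cookie_site) -> {cookie_name: value}
        self._jars = {}

    def set_cookie(self, top_level_site, cookie_site, name, value):
        self._jars.setdefault((top_level_site, cookie_site), {})[name] = value

    def get_cookies(self, top_level_site, cookie_site):
        return self._jars.get((top_level_site, cookie_site), {})


jar = PartitionedCookieJar()
# tracker.example sets a cookie while you browse news.example...
jar.set_cookie("news.example", "tracker.example", "uid", "alice-123")
# ...but when the same tracker is embedded on shop.example, it sees an empty jar:
print(jar.get_cookies("shop.example", "tracker.example"))  # {}
print(jar.get_cookies("news.example", "tracker.example"))  # {'uid': 'alice-123'}
```

The tracker can no longer correlate your visits across sites, which is exactly the cross-site tracking these cookies were built for.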
In the meantime, if you use any Google products, go to your Google Account and review the numerous personalization settings at your disposal to rid yourself of tracking, including the activity controls, ad personalization, and recorded activity history. And if you are using an iOS phone or tablet, make sure you update to iOS 14 and enable the setting that blocks cross-app tracking.
As teaching methods advance, and especially during the pandemic, online learning is starting to approach the physical classroom experience, and it's great for conceptual learning. A good online learning experience should include not only content but also practice drills, real-world case studies, and a social component to make learning more effective. I cover some of the things to look for in selecting the right professional IT certifications to increase your potential value to your company.
You can read my blog for Network Solutions here for more about this topic.
My latest blog for Network Solutions is about identity and access management. Our email accounts have become our identity, for better and worse, and hackers exploit this dependency with ever more clever phishing lures. Until recently, enterprises have employed very complex and sophisticated mechanisms to manage and protect our corporate identities and control access to our files and other network resources. What has changed are two programs from Microsoft and Google designed to help combat phishing. They are aimed at higher-risk users who want enterprise-grade identity and access management security without the added cost and effort of maintaining it. The two programs are called AccountGuard (Microsoft) and the Advanced Protection Program (Google). In my blog post, I explain what these two programs are all about.
Last month I caught this news item about Microsoft building a new command-line feature into Windows that is commonly called a network protocol sniffer. It is now freely available in Windows 10, and the post documents how to use it. Let's talk about the evolution of the sniffer and how we came to this present-day development.
If we turn back the clock to the middle 1980s, there was a company called Network General that made the first Sniffer Network Analyzer. The company was founded by Len Shustek and Harry Saal. It went through a series of corporate acquisitions and spin-outs, and its IP is now owned by NetScout Systems.
The Sniffer was the first machine you could put on a network to trace what packets were being transmitted. It was a custom-built luggable PC typical of the "portable" PCs of that era: it weighed about 30 pounds and had a tiny screen by today's standards. It cost more than $10,000 to purchase, and then you needed to be trained in how to use it. You would connect the Sniffer to your network, record the traffic onto its hard drive, and then spend hours figuring out what was going on across your network. (Here is a typical information-dense display.) Decoding all the protocols and tracking down the individual endpoints and what they were doing was part art, part science, and a great deal of learning about the various layers of the network and understanding how applications worked. Many times Sniffer analysts would find bugs in these applications or in implementations of particular protocols, or would fix thorny network configuration issues.
My first brush with the Sniffer was in 1988, when I was an editor at PC Week (now eWeek). Barry Gerber and I were working on one of the first "network topology shootouts," where we pitted networks of PCs running three different wiring schemes against one another. In addition to Ethernet, there were Token Ring (an IBM standard) and Arcnet. We took over one of the networked classrooms at UCLA during spring break and hooked everything up to a Novell network file server that ran the tests. We needed a Sniffer to ensure that we were doing the tests properly and that it was a fair contest.
Ethernet ended up winning the shootout, but we did find implementation bugs in the Novell Token Ring drivers. Eventually Ethernet became ubiquitous, and today you use it every time you bring up a Wi-Fi connection on your laptop or phone.
Since the early Sniffer days, protocol analysis has moved into the open source realm, and Wireshark is now the standard software tool. It doesn't require a great deal of training, although you still need to know your seven-layer network protocol model. I have used sniffers on several occasions doing product reviews, and one time I helped debug a particularly thorny network problem for an office of the American Red Cross. We tracked the problem to a faulty network card in one user's PC that was just flaky enough to fail intermittently while operating correctly most of the time.
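To give a flavor of what protocol decoding involves, here is a minimal Python sketch that parses the layer-2 Ethernet header of a raw frame. The sample bytes are fabricated for illustration; real tools like the Sniffer or Wireshark decode dozens of layered protocols on top of this, which is where the art and the science come in.

```python
import struct

def parse_ethernet_header(frame: bytes):
    """Decode the 14-byte Ethernet II header: dst MAC, src MAC, EtherType."""
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    fmt_mac = lambda b: ":".join(f"{x:02x}" for x in b)
    return {
        "dst": fmt_mac(dst),
        "src": fmt_mac(src),
        # Common EtherType values: 0x0800 = IPv4, 0x0806 = ARP, 0x86dd = IPv6
        "ethertype": hex(ethertype),
        "payload": frame[14:],
    }

# A fabricated frame: broadcast destination, a made-up source MAC, EtherType 0x0806 (ARP)
frame = bytes.fromhex("ffffffffffff" "0a1b2c3d4e5f" "0806") + b"\x00\x01"
hdr = parse_ethernet_header(frame)
print(hdr["dst"])        # ff:ff:ff:ff:ff:ff
print(hdr["ethertype"])  # 0x806
```

Each decoded layer hands its payload up to the next decoder (ARP, IP, TCP, and so on), which is essentially what a protocol analyzer does thousands of times per capture.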
Today, sniffers can be found in a number of hacking tools, as this article in ComputerWorld documents. And now one is built into Windows 10 itself. How about that?
I asked Saal what he thought about the Microsoft Windows sniffer feature. “It is now almost 35 years since its creation. Seeing that some similar functionality is now hard wired into the guts of Windows 10 is amusing. Microsoft makes a first class Windows GUI tool, NetMon, available for free and of course there is WireShark. Why Microsoft would invest design, programming and test resources into creating a text-based command line tool is beyond me. What unfilled need does it satisfy? Regardless, more is better, so I say good luck to Redmond and the future of Windows.”
Many of us started out with database software like Microsoft Access. It came included as part of the Office suite, was fairly easy to get started with, and was infinitely customizable for light database programming. But despite all these advantages, it might be time to look for alternatives, especially for citizen developers who want to build more sophisticated online database applications.
You can read my post here about ways to recognize when your Access is running out of steam.
When the Internet was first getting used by ordinary commercial businesses back in the early 1990s, those businesses didn’t own any of the underlying infrastructure besides their own connection to the nearest peering point. My how times have changed. This week, Microsoft and Facebook announced they are building the next transatlantic cable to exclusively carry their own Internet traffic between their American and European data centers.
When you think about this, it isn’t that surprising. After all, it represents the next step in the evolution of how private companies have built their computing systems on top of the public Internet. It isn’t the first such private cable: Google has been partnering with telcos for years to share their bandwidth, and a new connection from the US to Japan went online earlier this year. Facebook (and others) are even building their own servers, routers and racks out of specialized spare parts, since they need so many of them to run their online businesses.
Another result of this is how many businesses are running without any data centers at all, using private clouds and co-locating their servers at peering points. What used to be all about the connection to the Internet is now about owning the entire stack, down under the sea if you are big enough to afford it.
Certainly, the cost of Internet bandwidth has plummeted in the 25 or so years that businesses have been able to buy it. In the early 1990s, if your business was big enough, you purchased a T-1 digital line that topped out at the then-amazing speed of 1.5 Mb/s. If you had a lot of demand, you could get a T-3 line that was 30 times as fast (45 Mb/s).
Of course, when you tell folks today about these early speeds, they look at you strangely and start thinking you date back to the days when there were payphones with actual dials. Given that today even the worst DSL speed from your local phone company gives you a better connection than that old T-1 line, it is pretty amazing how fast and far we have come. Today a cable Internet connection can deliver 100 Mb/s or more (at least in the download direction). Google and other fiber providers are talking about 100 and 1,000 Mb/s speeds in both directions, and there are cities (such as Chattanooga) where you can get a gigabit connection at every address. These places have realized that supplying an ultra-fast Internet is an essential part of their municipal services, just like supplying water and staffing the fire and police departments.
And that Microsoft/Facebook undersea cable? It will start out carrying 160 Tb/s, which is at least twice as fast as older cables. I can’t even think at those speeds.
This week several thousand IT managers, developers, ISVs, and others who work with Microsoft's operating systems, tools, and software products are gathered in Houston for the annual TechEd conference. I couldn't make it, but I have been talking to a number of independent software vendors and Microsoft channel partners for a custom consulting research project.
In the course of doing the research, I was reminded of why Microsoft is such a powerhouse when it comes to understanding developers, and why its ecosystem continues to be self-sustaining and an expanding universe. When you look behind the scenes, there is usually a Microsoft API lurking about that people are using to build something, which in turn becomes the basis for another developer building their thing, and so on.
As an example, take a gander at the Azure Store (it is under the "Add-ons" tab at the top-level menu of Microsoft's cloud computing site). It has add-on tools that range widely across the SaaS and IaaS spectrum and can be used to manage Azure VM collections, set up and provision Azure servers, or access particular datasets. The store is still very much a work in progress, and its search function is somewhat limited. And sure, Amazon Web Services has more activity, because it has been around longer. But the idea is catching on, ISVs are coming to sell their wares, and others are taking notice. The extensibility of Azure will be untouchable in a few years.
We saw this movie already with the first wisps of .Net and Visual Studio: there was a time that people used those tools to just build standalone apps that had nothing to do with the Internet or Web services. Those universes were eventually mind-melded together, and now no one thinks it odd when someone builds a Web app with .Net.
Look at what you can do using Excel as a query tool for all sorts of databases, some of which aren't even on your own hard drive. Again, it is all about extending the things we know and love (or at least tolerate). The same thing happened when Windows Explorer became Internet Explorer, even if IE is now a malware distribution mechanism of the first order.
I realize that you could make the same claims about building-on-top-of-the-builders with the open source movement: just look at the 57 different Hadoop-oriented projects (if not 1057 by now) that have been spawned by that Big Data database. And yes, there are other claims to the uber-ISV throne too.
But this is what Microsoft does best, and you would be wrong to sell them short in this area. Yes, their actual software dev tools aren't the best on the block. And there may be desktop environments better than Windows. And yes, more people are buying Apple's iThings because they are just cooler, no doubt. No one is going to ooh and aah over a Windows Phone anymore, no matter how honking many megapixels its camera can consume. But that is missing the point.
With Microsoft, it is all about the API-enabled ISV, who can sell to other API-enabled ISVs, who can use all these interfaces to build powerful apps with just a few lines of actual code. There are lots of other things wrong with the company, but this is one they continue to get right.