Avast blog: Back to campus means understanding your data security

As college students try to return to campus, some are being asked to give their colleges unprecedented access to their whereabouts and health information, as we posted last week. Many are learning about the personal implications of their data security for the first time, to say nothing of dealing with being quarantined. I’ve previously explored the wide-ranging methods colleges are using to try to bring students back to campus safely and how they plan to track their students (and staff). In this post, I talk about some of the infosec issues with tracking the college crowd. It all comes down to having solid IT leadership and the necessary skills on staff to do proper security vetting.

You can read more on my blog for Avast today.

RSA blog: Security Is No Longer A Binary Decision

IT security has evolved from being a completely binary operation into a more nuanced approach. Back in the days when R, S, and A first got together, it was sufficient to do security on a pass/fail basis, meaning a large part of security was deciding whether or not to let someone into your network. Or it could mean allowing them to use a particular application or not, or granting them access to a particular network resource (e.g., a printer or a server) or not.
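
To make the contrast concrete, here is a minimal Python sketch (my own illustration, not anything from the RSA post) of the difference between the old pass/fail gate and a risk-scored decision; the signals, weights, and thresholds are invented for the example:

```python
# A toy contrast between pass/fail access control and a risk-scored
# decision. All signals and thresholds here are invented for illustration.

def binary_access(user, allowed_users):
    """The classic model: you are either in or out."""
    return user in allowed_users

def nuanced_access(device_trusted, location_usual, resource_sensitivity):
    """Weigh several signals instead of a single yes/no gate."""
    risk = 0
    if not device_trusted:
        risk += 2
    if not location_usual:
        risk += 1
    risk += resource_sensitivity  # e.g., 0 = printer, 3 = finance server
    if risk <= 1:
        return "allow"
    if risk <= 3:
        return "allow, but require MFA"
    return "deny"

print(binary_access("alice", {"alice", "bob"}))  # True
print(nuanced_access(device_trusted=False, location_usual=True,
                     resource_sensitivity=1))    # allow, but require MFA
```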

One example is over-protective endpoint security. While it is great to plug as many holes as possible across your endpoint collection, if you lock down your endpoints too much, employees will shift their work to the cloud and their personal devices, which defeats the purpose.

You can read more of my examples of nuanced security here on RSA’s blog.

I remember c|net: a look back on computing in the mid-1990s

The news this week is that c|net (and ZDNet) have been sold to a private equity firm. I remember when c|net was just starting out, because I wrote one of the first hands-on reviews of web server software back in 1996. To test these products, I worked with a then-new company called Keylabs. They were the team that built one of the first huge, 1,000-PC testing labs at Novell and were spun out as a separate company, eventually spinning off their own endpoint automation software company, Altiris, which was acquired by Symantec and is now part of Broadcom. They were eager to show their bona fides and worked with me to run multiple tests involving hundreds of computers banging away on each web server product. “1996 was an exciting time for computing,” said Jan Newman, now a partner at the VC firm SageCreek and one of the Keylabs founders. “The internet was gathering steam and focus was changing from file and print servers to the web. I believe this project with David was the very first of its kind in the industry. It was exciting to watch new platforms rise to prominence.” Now we have virtual machines and other ways to stress test products. The review shows how the web was won back in those early days.

Here are some observations from re-reading that piece.

  1. The demise of NetWare as a server platform. Back in the mid-1990s, NetWare — and its associated IPX protocol — was everywhere, until Microsoft and the internet happened. Now it is about as hard to find as a COBOL developer. One advantage NetWare had was efficiency: you could run a web server on a 486 box at about the same performance as any of the Windows servers running on a faster Pentium CPU.
  2. Remember Windows NT? That was the main Microsoft server platform at the time. It came in four versions, running on Intel, DEC Alpha, MIPS, and PowerPC processors. Those latter two were RISC processors that have mostly disappeared, although Apple Macs and Microsoft’s Xbox 360 ran on PowerPC chips for years.
  3. Remember Netscape? In addition to the web browser that made Marc Andreessen rich, they also had their own web server, called FastTrack, which was in beta at the time of my review. Despite being a solid performer, it never caught on. It did support both Java and JavaScript, something the NT-only web servers didn’t initially offer.
  4. The web wasn’t the only data server game. Back in the mid-1990s, we had FTP and Gopher as popular precursors. While you can still find FTP (I mainly use it to transfer files to my web server and to get content to cloud images), Gopher (which got its name from the University of Minnesota team mascot) has disappeared down a deep, dark hole.
  5. Microsoft’s web server, IIS, was underwhelming when it was first released. It didn’t support Java, didn’t do server-side includes (an early way to serve dynamic content), didn’t have a web-based management tool, didn’t support hosting multiple domains unless you used separate network adapters, didn’t have any proxy server support, and made use of unsecured user accounts. Of course, it is now one of the top web server platforms, alongside Apache.
  6. You think your computer is slow? How about a 200 MHz Pentium? That was about as fast as you could expect back then. And 16 MB of RAM and 10/100 Ethernet networks were the norm.

Network Solutions blog: How Passwordless Authentication Works and How to Deploy It

Passwords are the bane of every IT security manager, but often it’s the way they’re used that creates the most problems. Passwords are shared and reused across numerous logins, and they can often be easily guessed because they’re based on the names of pets or children. In other cases, passwords are compromised by users who stick with the default manufacturer settings years after their hardware is installed. This has given rise to a number of solutions labeled ‘passwordless,’ even though they technically still use some form of authentication.
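
As a concrete illustration of that last point, here is a minimal sketch of one common passwordless pattern, the emailed one-time login link: the authentication hasn’t disappeared, it has just moved from a memorized password to possession of an email inbox. The names and in-memory store are my own simplifications, not anything from the Network Solutions post:

```python
# A toy emailed one-time-token ("magic link") flow. A real deployment
# would persist tokens in a database and send the link via an email
# service; this in-memory version only shows the moving parts.
import secrets
import time

PENDING = {}      # token -> (email, issued_at)
TOKEN_TTL = 600   # tokens expire after ten minutes

def request_login(email):
    token = secrets.token_urlsafe(32)
    PENDING[token] = (email, time.time())
    # In practice you would email: https://example.com/login?token=<token>
    return token

def redeem_token(token):
    entry = PENDING.pop(token, None)  # single use: pop, don't get
    if entry is None:
        return None
    email, issued_at = entry
    if time.time() - issued_at > TOKEN_TTL:
        return None  # expired
    return email     # authenticated: start a session for this user

tok = request_login("user@example.com")
print(redeem_token(tok))  # user@example.com
print(redeem_token(tok))  # None -- the token cannot be replayed
```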

You can read more with my post for Network Solutions blog here.

CSOonline: 10 common cloud security mistakes that put your data at risk

The news is regularly filled with attacks on misconfigured cloud servers and the leaked data that criminals obtain from them. The errors happen because we are all human. We might set up a cloud server with loose (or no) credentials and forget to tighten them when the server is placed into production. Or we fail to keep software up to date when exploits are discovered, or neglect to get IT involved to audit the finished production app to ensure that it is as secure as possible.
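
As one example of checking for the loose-credentials mistake, here is a short boto3 sketch that flags an S3 bucket still open to the public. The bucket name is a placeholder and the script assumes AWS credentials are already configured; it is my illustration, not something from the CSOonline piece:

```python
# Flag an S3 bucket that is open to the world. Assumes boto3 is installed
# and AWS credentials are configured; the bucket name is a placeholder.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "my-production-bucket"  # placeholder

try:
    block = s3.get_public_access_block(Bucket=bucket)
    cfg = block["PublicAccessBlockConfiguration"]
    if not all(cfg.values()):
        print(f"{bucket}: public access block not fully enabled: {cfg}")
except ClientError as err:
    if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
        print(f"{bucket}: no public access block configured at all")
    else:
        raise

# ACL grants to AllUsers or AuthenticatedUsers also mean public exposure.
acl = s3.get_bucket_acl(Bucket=bucket)
for grant in acl["Grants"]:
    uri = grant.get("Grantee", {}).get("URI", "")
    if uri.endswith("AllUsers") or uri.endswith("AuthenticatedUsers"):
        print(f"{bucket}: ACL grants {grant['Permission']} to {uri}")
```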

You can read my post for CSOonline here on the 10 most common cloud configuration mistakes.

Getting my kicks on the old Route 66

Like many of you this past Labor Day weekend, my wife and I took a drive to get out of our pandemic bubble. And as the NY Times ran this piece, we also got our kicks on Route 66. Their photographer covered the portion through Arizona and New Mexico; we stayed a lot closer to home, about an hour’s drive from St. Louis. This wasn’t our first visit to the area, but we wanted to see a few sights from a safe distance, and my wife wanted to visit an ancestral home not far off the Mother Road, as it is called.

St. Louis has a complicated relationship with Route 66: the road took many different paths through the city over the years it was active, crossing the Mississippi River at various bridges. And for those of you who want to discover the road in other parts of the country, you will quickly find that patience is perhaps the biggest virtue you’ll need. Different parts were decommissioned or rerouted after the construction of the freeways that brought about its demise. In our part of the country, that is I-44, which runs between St. Louis and Oklahoma City, where it connects with I-40.

My favorite Route 66 memory spot within the city limits has to be the old Chain of Rocks Bridge, which opened in the 1930s and was featured in the now-classic film “Escape From New York.” The bridge is now a bike/pedestrian path, and it is one of the few bridges that is deliberately bent in the middle. It lies on the riverfront bike trail that I ride often.

Once you leave the city and head west, you need to be a determined tracker. Many parts of the route appear on the map as the I-44 service road, but that doesn’t tell the entire story, because in many cases the actual original roadbed no longer exists. One of the places along the way that you might have heard of is Times Beach. The beach refers to the Meramec River, and the town is remembered because it became contaminated with dioxin. Now the streets remain but not much else, and the state has turned it into a state park. The visitor center is a former roadhouse that was built in 1936. Speaking of bygone inns, a few miles further on you’ll pass the Gardenway Motel near Gray Summit. The motel had 40 rooms, was built in 1945, and eventually closed in 2014, having been owned by the same family during its entire run. A separate advertising sign still stands down the road.

There are a lot of other classic signs nearby too, but as I said, you have to spend some time exploring to find them. If you are looking to stay in one of the period motels still in operation, you might try the Wagon Wheel in Cuba, a few miles further west.

Another example of the bygone era that Route 66 spanned is captured by this National Park Service webpage on the Green Book. This was a guide for Black motorists who couldn’t stay at the then-segregated lodgings mentioned above; Mrs. Hilliard’s Motel in St. Clair, which is in the area, operated briefly in the 1960s. The guide, published annually from 1936 to 1964 by Victor Green, also listed other recommended and safe places for Black travelers, such as restaurants and gas stations. Our history museum has an excellent explanation of its use and some sample pages here, which you can contrast with what was portrayed in the 2018 film.

One of the things I learned when traveling in Poland is that history is often what you don’t see: sometimes painfully removed, other times left to rot and decay. Finding it requires some investigation. Route 66 is a real time capsule, to be sure.

CSOonline: Securing Microsoft Teams

As more remote work from home happens, your collaboration tools need more scrutiny. A popular choice for instant messaging and video conferencing is Microsoft Teams, and securing this application is a challenge. Teams-specific exploits have been observed, for example. And even if Teams isn’t targeted directly, it could fall victim to a general DDoS or ransomware attack, which would be an issue if you depend on Teams for internal communications after an attack. And while Microsoft has published numerous suggestions on how to better secure Teams, the process is vexing and error-prone.
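
For those who want to script part of that process, here is a minimal sketch that reads one team’s settings from the Microsoft Graph API and flags a permissive guest setting. The token and team ID are placeholders, it assumes an app registration with the TeamSettings.Read.All permission, and it is my illustration rather than Microsoft’s or the article’s method:

```python
# Read one team's settings from Microsoft Graph and flag a permissive
# guest setting. TOKEN and TEAM_ID are placeholders; acquiring the OAuth
# token (e.g., with the TeamSettings.Read.All permission) is not shown.
import requests

TOKEN = "<access-token>"  # placeholder
TEAM_ID = "<team-guid>"   # placeholder

resp = requests.get(
    f"https://graph.microsoft.com/v1.0/teams/{TEAM_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
team = resp.json()

guest = team.get("guestSettings", {})
if guest.get("allowCreateUpdateChannels"):
    print("Warning: guests can create or edit channels in this team")
```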

You can read my published analysis for CSOonline here. I also compare how Teams security stacks up against Slack. Avanan has versions for both.

Avast blog: Everything you should know about social media scraping

Last month, a massive data leak exposed nearly 240 million records from social media platforms. The collection included 192 million records scraped from two different Instagram collections, along with 42 million records scraped from TikTok and an additional 4 million records scraped from YouTube.

The records include usernames, profile photos, email addresses, phone numbers, age, and gender, along with specifics about followers and other engagement for each account. The leak involved a set of three open data shares from the company Social Data; a few hours after being notified, the company properly secured the shares.

There are several things that are interesting about this leak: its source, how the data was obtained, and what this means for your own social media consumption. You can read more on the Avast blog. 
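
To see why scraping requires no break-in at all, consider this minimal sketch of my own: public profile pages already advertise fields such as title and photo in their HTML metadata. The URL is a placeholder, and note that platforms’ terms of service generally restrict automated collection:

```python
# Fetch a public page and pull the Open Graph metadata that many profile
# pages expose (title, image, description). The URL is a placeholder, and
# a real page may order attributes differently than this simple regex
# expects; it is only meant to show how little effort scraping takes.
import re
import requests

url = "https://example.com/some-public-profile"  # placeholder
html = requests.get(url, timeout=10).text

fields = dict(re.findall(r'<meta property="og:(\w+)" content="([^"]*)"', html))
print(fields.get("title"), fields.get("image"))
```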

Network Solutions blog: Understanding SSO

One of the best ways to manage your password collection is to use a single sign-on (SSO) tool. These tools centralize the administration of user authentication services by having one login credential that can be used for multiple applications. 

You might think this creates a security loophole. We have all been drilled not to share the same login across multiple apps, right? The way SSO works is somewhat different. Yes, you have a single login to gain overall access to the SSO tool. But once that is accomplished, the tool then automatically sends out separate credentials to sign you in to each of your apps. In many cases, you don’t even know the details of each credential; they could be very complex passwords that the tool creates at random. The good news is that you don’t need to remember each one, because the SSO tool does it for you. The bad news is that implementing SSO can be confounding, costly, and complex.
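
Here is a minimal sketch of that mechanism: a toy vault where one master login unlocks per-app credentials that the tool generated at random and the user never sees. Real SSO products use signed protocols such as SAML or OpenID Connect, or an encrypted vault, rather than a Python dict; everything here is illustrative:

```python
# A toy SSO vault: one master login unlocks random per-app credentials
# that the user never has to see. Real products use SAML/OIDC or an
# encrypted store, not a plain dict; everything here is illustrative.
import secrets

class TinySSO:
    def __init__(self, master_password):
        self._master = master_password
        self._vault = {}  # app name -> random credential

    def _unlock(self, password):
        return secrets.compare_digest(password, self._master)

    def enroll(self, app):
        # The tool, not the user, picks a long random secret per app.
        self._vault[app] = secrets.token_urlsafe(24)

    def sign_in(self, app, password):
        if not self._unlock(password):
            raise PermissionError("master login failed")
        return self._vault[app]  # presented to the app on the user's behalf

sso = TinySSO("one strong master passphrase")
sso.enroll("payroll")
sso.enroll("crm")
print(sso.sign_in("payroll", "one strong master passphrase"))
```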

You can read more on this topic on my blog post for Network Solutions here.

RSA blog: Why authentication still holds the key for RSA’s success after nearly 40 years

Today, RSA once again becomes an independent company, after being owned by EMC and then Dell Technologies for the past several years. I’m commemorating this milestone by looking at a few of my favorite products from the RSA portfolio and setting some context for the longevity of this iconic company. You can read my post on their blog here.