Does Your SOC Belong in the Smithsonian?

The Security Operations Center (SOC) may be going the way of the dodo as security professionals outsource their protection to managed and cloud services. While many large organizations still have SOCs, smaller enterprises are finding that new technologies and better security architectures lessen the need to assemble large teams. Together, these advances can make an IT team more proactive in protecting its infrastructure, even without a formal operations center.

Outsourcing the Security Operations Center

Many organizations are finding that they don't really need a SOC and have instead outsourced its functions to cloud or hosting providers. Running these operations centers can be costly, both in employing highly experienced staff members around the clock and in purchasing the various tools that have to be maintained and monitored.

“Mostly, we still see them in very large organizations,” said John Joyner, director of product development at Arkansas-based managed services provider Clearpointe. “A large enterprise needs a big security analysis team that can actively engage in fighting incidents and security issues. But smaller organizations can avoid this if they have implemented a cloud-based architecture and liberally employ encryption and protection technologies.” Additionally, they should rely on their hosting partners as a first line of defense against attackers.

Changing the SOC Pyramid With the Times

Joyner feels the security pyramid made popular by the SANS Institute and others isn't relevant for as many companies as it once was. “We shouldn't have to worry about this if we have built our systems correctly. While it is true that a denial-of-service attack can bring down a public website, an organization doesn't have to host that website internally. Instead, they should move it to a cloud provider and let them handle the necessary security,” he said. “It makes more sense to put [our customer-facing websites in the cloud] than to run them on our own networks.” Clearpointe does this with many of its customers' websites and, as a Microsoft partner, uses Azure as its cloud provider.

Joyner feels that today’s enterprises should harden their security infrastructure, perhaps by using network access controls or application-based security, which would make them that much more difficult to penetrate. “Why should anyone waste resources when there are so many great alternatives available?” he asked. “Certainly, for backups and disaster recovery, the cloud offers some solid and very secure solutions. But you don’t need a SOC for these functions.”

He talks about using “thoughtful applications architecture” — now there is a term that I like — and making sure that you can compartmentalize your various apps so when you do get penetrated the threat can be better contained, or better yet, alter your infrastructure so it doesn’t matter if you are penetrated. “We can replace most of our sensitive data so its capture doesn’t reveal anything.”
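The data-replacement idea Joyner describes is essentially tokenization: swap real values for random tokens and keep the mapping in a separately protected store, so a breach of the main datastore reveals nothing useful. Here is a minimal Python sketch of the concept; the `TokenVault` class and its methods are my own invention for illustration, not any particular product's API:

```python
import secrets

class TokenVault:
    """Toy tokenization vault: real values live only here, while the
    application database stores opaque tokens in their place."""

    def __init__(self):
        self._forward = {}   # real value -> token
        self._reverse = {}   # token -> real value (the protected mapping)

    def tokenize(self, value):
        # Reuse the existing token so the same value always maps the same way.
        if value not in self._forward:
            token = "tok_" + secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token):
        # Only code with vault access can recover the original value.
        return self._reverse[token]

vault = TokenVault()
record = {"name": "Jane Doe", "ssn": vault.tokenize("123-45-6789")}
print(record["ssn"])                      # an opaque token, useless if stolen
print(vault.detokenize(record["ssn"]))    # recovers 123-45-6789
```

The point of the design is the separation: an attacker who captures the application's records gets only tokens, and must also compromise the vault to learn anything sensitive.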

PC Magazine: Self-service business intelligence tools

When most people think of business intelligence (BI) tools, they first think of using a spreadsheet for their data analysis and graphing needs. While Excel has been around for decades and is used by millions in this fashion, it isn't always a suitable tool for BI tasks. Until very recently, BI was mostly for specialists: the tools were hard to operate and required knowledge akin to that of a database administrator. But that has changed, and lately the market has made the tools easier for normal folks to use, under what is now being called “self-service BI.”

I look at five of the leading BI tools for PC Magazine, and you can read my review here of these products: Domo, Qlik Sense Enterprise, Clearify's QQube, Tableau Software's Desktop (which received my Editors' Choice award), and Zoho Reports. All of them are better than using Excel for BI purposes.


CDW StateTech Review: FireEye NX 1400

As cybercriminals exploit infected web pages to launch targeted attacks on state networks, security appliances are essential to thwarting them. The FireEye Network Threat Prevention NX-1400 1U appliance can protect up to 100 users from a variety of zero-day malware and multiprotocol attacks.

You can read the full review in this month’s StateTech Magazine here.

20 years of Web Informants

Can it really have been 20 years since I had the strange idea of writing a weekly series of self-published essays and sending them out, first via email and then via a variety of Web technologies? Time flies. Last year I began the celebrations early with a column that looked at some of the lessons I have learned from online publishing all these years. More recently, I wrote about some of the influential people that I have had the opportunity to interview.

In a retrospective column that I wrote in 2006, I recalled how back in 1995 we had browsers that were just beginning to display tables and images in-line, and Netscape was still the dominant force in browsing technology. We also had PCs that still booted from floppy disks, and FTP and Gopher were the dominant Internet protocols.

When it came to broadband, there wasn't much of it in 1995. ISDN was still found in more places than DSL. In another retrospective column, I wrote that finding an Internet service provider wasn't easy. Most of us got online via dial-up modems: there was no Wi-Fi, and no iPhones or other smartphones that could do anything besides voice calls. BlackBerrys hadn't yet been invented, and many of us used pagers when we wanted someone to get in touch because mobile minutes were expensive. Most of the world still relied on land lines.

They were certainly simpler times: cybersquatting, phishing, ad banner tracking, malware exploit kits and cookie stuffing were all relatively unknown concepts. Blogs hadn’t been invented, nor podcasts, wikis, or mashups. We were still using Yahoo to search the Internet.

Back in 1995, there were no music or video streaming services, and Napster and its peer-to-peer cousins hadn't yet been invented either. Here is a column from 2000 where I offer some lessons to be learned from Napster; sadly, little of that advice took hold. In the past 20 years, as I wrote last fall, music has gotten more mobile and more discoverable, and now streaming is here to stay. One sign of this is that Kate Mulgrew is now better known for her role as an imprisoned Russian crime boss than as a starship captain, thanks to the streaming Netflix series.

The Web enabled an entire eCommerce industry. In those early days, as I wrote in this retrospective, the websites were wacky, the software shaky, and the tools touchy and troublesome. Now most of us don’t give it a second thought that we can buy something with a browser and a credit card. We have lots of new payment systems, including Square and phone-based wallets, and even bitcoin: a new form of money that is entirely online.

Certainly, the biggest change in the past 20 years has been in how we collaborate on our work. Back then, if our teams weren't all under one roof, there were painful remote access tools that slowly moved information around. We spent a lot of time sending large graphics files around on our network because there wasn't any other way to share them. In many ways, we were still in the dark ages of collaboration tools. Now I can bring together a staff from all over the world with almost free and quite capable tools. (Last week I mentioned some of the great people that I have had the honor of working with.)

Thanks to all of you loyal readers who have stuck with me all these years, and for all the kind (and even not-so-kind) words and thoughts you have sent my way after I pen another of these columns. What a long, strange trip it's been, but you know me well enough by now: I will keep on truckin'.

Giving thanks to my mentors

Next week is the 20th anniversary of these essays. I wanted to take the time today to thank the various people who have guided me, advised me and helped advance my career in tech journalism. I have to start with one of my first bosses, a man who taught me how to write and convinced me that I could become a writer: Grant Thompson. Grant and I were working at the time for a non-profit organization called the Conservation Foundation in Washington, D.C. I came there to help the organization build mathematical models for energy and environmental policy analysis, but came away from that job learning how to write. Little did I know how that would shape my career, and for all the countless hours that Grant spent marking up my drafts and teaching me my craft, I am forever grateful.

Several years later, I was working in IT for a large insurance company based in downtown LA called Transamerica Occidental Life. We had built one of the first end user computing departments to support a massive rollout of PCs. It was a great job, and I worked with Bob Zucker, Mark Will and Mike Storms there. All of them taught me to examine how people used technology in their jobs. This context would also be important to my subsequent career.

From Transamerica, I went on to work at PC Week (now eWeek) at my first editorial position. That was a dream job and I wrote about how I was hired here. I was hired by Mike Edelhart, who was a long-time Ziff Davis veteran and taught me how to become a great manager. PC Week was a massively talented organization and I learned from some of the best people in the tech journalism field, including Sam Whitmore, John Dodge, Paul Bonner, Peter Coffee, Gail Shaffer and Rob O’Regan, just to name a few.

Edelhart and I would write a book together, a book that was never published because we picked the wrong horse (OS/2) in an operating systems race that was eventually won by Microsoft Windows. But years later I was ready to write another book when Marshall Rose asked me to collaborate with him on a book on corporate email in the late 1990s. Marshall is another brilliant man who invented the core Internet email protocols while in his early 20s. Writing a book with him was another life-changing experience for me.

After PC Week I went on to build my first publication from scratch, Network Computing. I worked for Al Perlman, and came to CMP at a time when it was starting many titles in an era when tech journalism was flourishing. I had the great fortune to learn from Mike Azzara, who was running another pub called Unix Today. Perlman went on to start many publications at CMP and elsewhere, and taught me a lot about startups.

Network Computing is still around, although online rather than in print. The team that I created from that publication has gone on to accomplish some great things, and many of those people are still in the tech community. I had the great fortune to hire Barry Gerber as my technical editor: Barry wrote articles for me at PC Week, and we would go on to work together at Tom’s Hardware, where he remained when I left there ten years ago. Barry taught me how to build and operate a test lab, and also how to be a better boss.

Over the years as a freelancer, I have had the good fortune to work for some of the best editors in the tech business, including Jackie Gavron, Rachel Parker, Jodie Naze, Neal Weinberg, Jennifer Bosavage and Stewart Alsop. Before Stewart became a VC, he was the editor at Infoworld in the early 1990s and gave me the assignment of writing a column by traveling to different businesses around North America and upgrading their networks over the weekend. These editors polished my prose and made me a better writer and it has been a blast to work for them.

Thanks everyone, including many others that I haven’t mentioned here. Next week I will write about some of the significant events that I have covered over my career and link back to some memorable columns.

Time to secure your website with an SSL EV certificate

This post is going to be a bit more technical than most, but I will try to keep it as simple as I can. Last month I wrote about how domain owners can mask their identity by purchasing extra-cost private domain services. Today I want to talk about the opposite: where domain owners want to prove who they really are by making use of special digital certificates, called Secure Sockets Layer Extended Validation (SSL EV) certs. It is something whose time has finally come.

One of the many problems with the average website is that you don't necessarily know whether the server you are browsing is for real. Scammers exploit this all the time when they send you a phished email: they copy the “real” site's images and page design for, say, your local bank, and then try to trick you into logging in on their scammy page, where they capture your credentials and then steal your money. Rinse and repeat several million times, and even if just a few folks take the bait, they can grab some significant coin.

So along came the SSL certificate many years ago to try to solve this problem. It did, for a while, until the scammers figured out a way to spoof the certificates and make it look like they came from the “real” site operator. So the certificate issuers and several other interested parties got together and formed two efforts:

First was a standards body that upped the ante for how certs are vetted, to make sure that the real owner is who they say they are. This involves checking the domain ownership and making sure there actually is a Real Corporation (or some other trackable entity) behind the Internet registration. Now there are three different levels of certs available: the regular, old-school cert called domain validated (DV), a medium-grade one called organization validated (OV), and the most stringent of them all, the EV cert. Only the EV cert will turn the URL address bar of your browser green, showing you that you are connected to the real site. Steve Gibson has a nice explanation on his site of how this works under the covers and how it is tamper-proof, at least so far.
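The three validation levels differ mainly in which identity fields the certificate authority verifies and records in the cert's subject. The Python sketch below is a rough illustration of that distinction; the sample subjects are invented for this example, and real EV vetting involves far more than inspecting fields (it is governed by the CA/Browser Forum's EV guidelines):

```python
def validation_level(subject):
    """Rough heuristic for illustration only: DV certs name just a domain;
    OV certs add the organization; EV certs additionally record the
    jurisdiction of incorporation and a registration serial number."""
    if "jurisdictionCountryName" in subject or "serialNumber" in subject:
        return "EV"
    if "organizationName" in subject:
        return "OV"
    return "DV"

# Invented sample subjects showing the escalating identity detail.
dv = {"commonName": "example.com"}
ov = {"commonName": "example.com", "organizationName": "Example Corp"}
ev = {"commonName": "example.com", "organizationName": "Example Corp",
      "jurisdictionCountryName": "US", "serialNumber": "1234567"}

print(validation_level(dv))  # DV
print(validation_level(ov))  # OV
print(validation_level(ev))  # EV
```

The extra EV fields are exactly what the browser relies on when it paints the address bar green: they prove a vetted legal entity, not just control of a domain.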

That is nice and welcome, but the second effort is also interesting: a non-profit corporation is just getting ready to issue its own SSL certs for free. Called the Let's Encrypt Project, it has begun with a few test accounts and will be ramping up over the next couple of months. The price is nice: some of the issuing authorities such as Thawte and Digicert charge $300 per year for their SSL EV certs, and GoDaddy has recently discounted its SSL EV certs to $100 per year. (Wikipedia has a more complete list of the vendors that offer EV certs.) But the real issue is that installing certs is a multi-step process that requires some care. If you don't do it very often (and why would you?), it is easy to mess up. The Let's Encrypt certs are supposedly easier to install.

One downside is that the free Let's Encrypt certs aren't EV-class: they are just the old-school, low-level DV certs. So if you are serious about your certs and want that nice green label in your browser, you still have to buy one. But at least the issue has been raised, which is one of the reasons why I am writing about this arcane topic today. If you own a domain and are doing ecommerce from it, look into getting at least the free certs when they are available, or pay for one of the EV models.

Securing the nonprofit

Running an IT security department in a nonprofit or charitable agency is very different from what’s found in a typical for-profit corporation. I spoke to David Goodman, who has held CIO jobs in a variety of nonprofits and is now the CIO-in-residence for the international benefit company NetHope. In his universe, Goodman rarely sees the kinds of regulatory and compliance structures and level of security that are commonplace in the average bank or even a local business.

You can read my post here.

Five ways CIOs tackle hybrid cloud security

As CIOs adopt hybrid-cloud strategies, some quickly learn that these environments need new kinds of security models or, at least, contexts in which to apply existing controls and security technologies. Most organizations also find that their environments are not as simple as a pure private plus public cloud. Legacy on-premises systems and SaaS applications come into play.

You can read my article in SearchSecurity here, where I interview several CIOs about what they are doing to protect their hybrid cloud deployments.

How to waste time upgrading your Macs

For the past week I have not been a happy camper with my computers. An attempted upgrade of my main Mac desktop has caused a lot of heartache and pain. It all started when my daughter was visiting and was running into problems with her own Mac, which was running slowly. I suggested she upgrade to Yosemite, the latest production Mac OS. She eventually did, and although she had to restart it after an unexpected crash, her machine is now operating faster and more reliably.

So I thought I should practice what I preach and upgrade my own systems. Big mistake. In hindsight, it was my own Windows bias that was my downfall. Let me explain. When I initially set up my machine, I split the hard drive into two partitions: one for booting the system, and one for my data. That is how I have set up various Windows machines over the years, making it easier to upgrade the OS when the inevitable time comes to clean up my PC.

But Macs don't need, and as I found out don't like, partitioned system drives. Time Machine doesn't back up the second partition (as I later found out), and the online OS upgrade service doesn't know what to do with them. I went from a functioning machine to a brick in about an hour. After several phone calls and hours spent down at the local Apple store, I had my explanation and a new system running Yosemite, along with the chore of restoring my data and apps. I wasn't entirely successful. Some of this wasn't my fault: Apple has updated its photo app, and my iPhoto libraries are stuck in limbo.

I will spare you the details and jump to some lessons learned. First, don’t partition your Mac boot drives. Use external or multiple disks if you want more redundancy.

Second, you can’t have enough backups. In addition to Time Machine, I also use SuperDuper, which makes a complete and bootable copy of my drives. That is what saved my bacon when it came to restoring my data partition.

And test your backups with some regularity to make sure they contain what you expect. I did this with the SuperDuper-created ones but not with Time Machine. Oops.
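Testing a backup can be as simple as comparing checksums between the source tree and the copy. Here is a minimal Python sketch of that idea; the function names and directory layout are my own assumptions for illustration, not the workings of Time Machine or SuperDuper:

```python
import hashlib
from pathlib import Path

def file_digest(path):
    """SHA-256 of a file, read in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source_dir, backup_dir):
    """Return the relative paths of files that are missing from the backup
    or whose contents differ from the source."""
    problems = []
    source = Path(source_dir)
    backup = Path(backup_dir)
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        dst = backup / src.relative_to(source)
        if not dst.is_file() or file_digest(src) != file_digest(dst):
            problems.append(str(src.relative_to(source)))
    return problems

# Example usage with placeholder paths:
# bad = verify_backup("/Users/me/Documents", "/Volumes/Backup/Documents")
# an empty list means every source file exists, byte-for-byte, in the backup
```

Run something like this on a schedule and an empty result tells you the backup actually contains what you expect, which is the lesson I learned the hard way.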

Finally, make sure you understand the progression of software tools that you will need to migrate your iPhoto library before you move to Yosemite. If you want to examine the cloud-based photo organizing alternatives, read this article that compares what Apple offers with what Google, Microsoft and Amazon offer. My experience with the Google Photo Backup tool, which transfers photos from iPhoto to Google's own service, has been abysmal: the app has crashed multiple times and still hasn't finished copying my 7,000 or so photos from an older Mac. I realize that bulk uploading all these files isn't easy, but it shouldn't be this difficult either.

By the way, my Windows PCs upgraded to Windows 10 just fine. No show-stoppers, no grief. Of course, I can't run any of my browser plug-ins in Edge now, but that is a feature, not a bug. Sometimes I miss those simpler times when we had an OS that I could actually understand on my own.

Authentication for the next generation

The new “my way” work style and the demand for on-the-go access to any service from any device and virtually any location requires that you bring your best encryption game with you when you're on the move. This is especially true for the group of people often labeled Gen Y, or 20-somethings. Why? Because they are digitally native and used to living their lives with instant access to their money, their friends, and really anything else they do. Being so steeped in technology, they tend to forget that there are lots of folks online who want to steal their identities, empty their bank accounts, and cause other havoc with their digital lives. But Gen Y is also more likely to use mobile banking than their elders, and more likely to go elsewhere if banks do not offer the mobile services they desire.

For a white paper for Vasco, I wrote about the challenges around providing better and more native authentication technologies for Gen Y and indeed, all users.