I spoke to Yassir Abousselham, the CSO for Okta, an identity management cloud security vendor. Before joining Okta this past summer, he worked for SoFi, a fintech company where he built the company’s information security and privacy program. He also held leadership positions at Google, where he built both the corporate security for finance and legal departments and the payments infrastructure security programs, as well as at Ernst & Young, where he held a variety of technical and consultancy roles during his 11-year tenure.
When he first started at E&Y, he worked on an engagement for an entertainment company that had hired the firm to examine its security. Within the first 30 minutes of testing, he found a misconfigured web server that let his team enter the network and compromise systems. That engagement got him started in finding security gaps, and it was when he first realized that security is only as good as your weakest link. “The larger the environment and the more IT infrastructure, the harder it is to maintain these systems.” Luckily, they weren’t billing by the hour for that engagement! He went on to produce a very comprehensive look at the company’s security profile, which was what the company needed to avoid situations like the one he initially found.
“The worst case is when companies do what I call check-mark compliance assessments,” he said, referring to companies that implement security without really looking closely at what they are doing. “On the other hand, there are a few companies who do take the time to find the right expertise to actually improve their security posture.”
“To be effective, you have to design many security layers and use multiple tools to protect against today’s threats. And you know, the tools and the exploits do change over time. A few years ago, no one had heard of ransomware, for example.” He recommends looking at security tools that automate various processes, to ensure they are done properly, such as automated patching and automated application testing.
In the few months he has been at Okta, the company has yet to experience a ransomware attack. “The first line of defense is educating our employees. No matter how much you do, there is always going to be one user who will open a phishing attachment. Hackers will go to great lengths to socially engineer those users.” Okta employs a core security team with multiple functions, which works closely with the departments closer to the actual products to keep things secure. Okta also uses its own mobile management tool to secure employees’ mobile devices. “We allow BYOD, but before you can connect to our network, your device has to pass a series of checks, such as not being rooted, having a PIN lock enabled, and running the most updated OS version,” he said.
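The posture gate Abousselham describes is straightforward to express as code. Here is a minimal Python sketch, assuming a hypothetical `Device` record populated by an MDM agent; the field names and the OS-version floor are illustrative, not Okta’s actual schema:

```python
from dataclasses import dataclass

@dataclass
class Device:
    rooted: bool                 # jailbroken/rooted devices are refused
    pin_lock_enabled: bool       # a screen-lock PIN must be set
    os_version: tuple            # e.g. (17, 2)

MINIMUM_OS = (17, 0)  # illustrative floor for "most updated OS version"

def may_connect(device: Device) -> bool:
    """Gate network access on the three checks described above."""
    return (not device.rooted
            and device.pin_lock_enabled
            and device.os_version >= MINIMUM_OS)
```

Real MDM products evaluate many more attributes, but the pass/fail logic at the gate reduces to a conjunction like this one.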
How does securing the Google infrastructure compare to Okta? “They have a much more complex environment, for sure.” That’s an understatement.
Working for an identity vendor like Okta, “I was surprised that single sign-on, or SSO, is not more universally deployed,” he said. “Many people see the value of SSO but sometimes take more time to get to the point where they actually use the technology. Nevertheless, SSO and multi-factor authentication are really becoming must-have technologies these days, just like having a firewall was 20 years ago. It makes sense from a security standpoint, and it makes sense from an economics standpoint too. You have to automate access controls and harden passwords, as well as be able to monitor how accounts are being used and detect account compromises.” He compares not having SSO to putting a telnet server on the public Internet back in the day. “It is only a matter of time before your company will be compromised. Passwords aren’t enough to protect access these days.”
When you hear about an IT staff that has to build their infrastructure from scratch to support a new business, you think, “That couldn’t be that hard – they had no legacy infrastructure to support. What a dream job.” Well, it wasn’t a piece of cake for the crew at the Okada Manila resort hotel, and in an interview with Dries Scott, the SVP of IT for Okada, I got to see why.
Okada was built on a huge site and is similar to the resort-style properties found in Las Vegas and Macau. When fully built, it will house 2,300 guest rooms and employ 10,000 people. Scott’s IT department accounts for at least 100 of them full-time — plus contractors — supporting 2,000 endpoints and numerous physical and virtual servers housed in two separate datacenters on the property.
Scott worked for a few of the Macau resort hotels before coming to Manila, and he wanted to create the ideal IT environment for a five-star luxury hotel. “The biggest decision we had to make was to try to steer clear of having actual desktop PCs as our workstations,” he said when he sat down with me for an interview yesterday. “When you are starting from a clean sheet of paper, you want something that could last 10 to 20 years, and you want products that can evolve over this time period.” He chose VDI for his endpoints. “I wanted to move away from the usual desktop PC environment, although we ended up having a few PCs for our staff. PCs are a pain to manage: hard drives crash, getting updates and patches distributed isn’t easy, and there are other issues.” To support the VDI deployment they purchased a variety of products, including Citrix XenDesktop, XenApp, and NetScaler, HP thin clients, and Dell servers.
One of the key enabling technologies is FSLogix Office 365 Container. “It lets Outlook running on XenApp and XenDesktop mount users’ profiles as if they were on a local C: drive, so Windows acts normally and Outlook works like it is running on a regular PC desktop,” he said. This means you get the performance users expect along with the ease of management of the virtual workspace.
Having a VDI solution meant some initial support hurdles. “We had to have a lot of patience with our users, some of whom were using VDI workstations for the first time,” he told me. “I could have taken the easy way out and just bought desktops for everyone, but I knew VDI would eventually pay off and benefit us in the long run.”
One concern Scott had was keeping corporate data secure. Given his resort’s market, he wanted to ensure that customers’ information stayed on corporate systems. “It is one of our most critical assets,” he said. “Users don’t have the ability to remove any corporate data from the company.” His thin clients lock out USB access, for example, and he has set up appropriate data leak prevention policies too. Through ShareFile, he has other policies for how files can be shared across his staff, and he blocks access to public SaaS repositories, such as consumer file-sharing services, wherever possible. Finally, he figured out ways to keep data from his construction contractors on his own servers. “I didn’t want them to pack up their PCs and leave with my data on them,” he said.
Building a new resort’s IT infrastructure wasn’t as easy as I assumed, mainly because some IT elements needed to be put in place during the construction phase to support workers on the job site. This meant erecting temporary buildings and networks and then migrating those resources to the production environment once the hotel was built. “That migration wasn’t easy, but we are just about through the process,” he said. “We have certainly been down a bit of a bumpy road.” One of his recommendations is to use Citrix consulting services to set up the environment and help define the appropriate computing architecture. “They can help make everything stable from the beginning and figure out your app and server configurations.”
What helped him pull off this project? Executive buy-in. “Our chairman is an engineer and very much into technology. It was a massive help that he supported our decisions from day one. He bought into my vision and gave me the ability to implement it.”
I spoke to Krishnan Chellakarai about his thoughts. He is currently the Director of IT Security & Privacy at Gilead Sciences and has been a security manager at several biotech firms. One thing that concerns him is the increasing threat from IoT. He gave me a theoretical example: “What happens if you are reading your email on your Apple Watch and you click on a phishing link? This could lead to a hacker gaining access to credentials and using them to steal information from your network.” As users bring more Fitbits and other Internet-connected devices into corporations, “every company needs to worry about this threat vector because it is a foot in the door.” This is part of a bigger trend, where “we have less data stored on individual devices, but there is more access” across the corporation. What this means is that there is “less visibility for IT security pros in case of an exploit.”
Certainly, some of the responsibility for keeping a firm’s infrastructure secure has to lie with each individual user. Chellakarai asks whether people ever look at the last-account-activity notice in the bottom right corner of Gmail, or ever click on the security alert that pops up when they are signed in to their account from multiple places. This is food for thought. “IT managers need to put some common-sense controls in place so they can have better network visibility,” he says. Another example: when was the last time anyone checked their printer firmware, or other legacy devices, to ensure that they have been brought up to the latest versions? “It is time to stop thinking of security after an app is built, and start thinking about security from the beginning, when you are planning your architecture and building your apps.”
Chellakarai says, “One of the first things I do when I start working for a new company is a data analysis and network baseline, so that I can understand what is going on across my infrastructure. It is critical to do this, especially when you join a company. I look at policies that aren’t being enforced and other loopholes too. Then I can prioritize and focus on the risks that I find.”
Lenny Zeltser has been teaching security classes at SANS for more than 15 years and has earned the prestigious GIAC Security Expert professional designation. He is not some empty suit but a hands-on guy who developed REMnux, a Linux toolkit used by malware analysts throughout the world. He is frequently quoted in the security trade press and recently became VP of Products at Minerva Labs. He spoke to me about his approach to incident response, endpoint protection, and digital forensics.
“I can’t think about malware in the abstract,” he said. “I have to understand it in terms of its physical reality, such as how it injects code into a running process and uses a command and control network. This means I have to play with it to learn about it.”
“Malware has become more elaborate over the past decade,” he said. “It takes more effort to examine it now. Which is interesting, because at its core it hasn’t changed that much. A decade or so ago, bots were using IRC as their command-and-control channel. Now, of course, there are many more HTTP/HTTPS-based connections.”
One interesting trend is that “malware is becoming more defensive, as a way to protect itself from analysis and automated tools such as sandboxes. This makes sense because malware authors want to derive as much value as they can and try to hide from discovery. If a piece of malware sees that it is running on a test machine or inside a VM, it will just shut down or go to sleep.”
Why has he made the recent move to working for a security vendor? “One reason is because I want to use the current characteristics of malware to make better protective products,” he said. Minerva is working on products that try to trick malware into thinking it is running in a sandbox when it is actually sitting on a user’s PC, as a way to shut down the infection. Clever. “Adversaries are so creative these days. So two can play that game!”
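The evasion logic Zeltser describes often amounts to probing for well-known analysis artifacts. A Python sketch of such a probe, using a few commonly cited indicator names for VirtualBox, VMware, and Sandboxie (the list is illustrative, not exhaustive, and real samples also check MAC prefixes, registry keys, CPU counts, and more):

```python
# Commonly cited analysis-environment artifacts (illustrative, not exhaustive):
# VirtualBox and VMware guest services, and Sandboxie's injected DLL.
SANDBOX_ARTIFACTS = {"vboxservice.exe", "vmtoolsd.exe", "sbiedll.dll"}

def looks_like_sandbox(processes, modules):
    """Return True if any known analysis artifact is visible on the host."""
    seen = {name.lower() for name in list(processes) + list(modules)}
    return bool(seen & SANDBOX_ARTIFACTS)
```

This also shows why Minerva’s approach, as described, works: deliberately plant such artifacts on real endpoints so a check like this returns True and the malware stands down on its own.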
Another current trend for malware is what is called “fileless,” or the ability to store as little as possible in the endpoint’s file system. While the name is somewhat misleading – you still need something stored on the target, whether it be a shortcut or a registry key – the idea is to have minimal and less obvious markers that your PC has been infected. “Something ultimately has to touch the file system and has to survive a reboot. That is what we look for.”
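That reboot-surviving marker is typically an autostart entry, such as a Run-key value that launches a script interpreter with an inline payload. A hypothetical triage sketch in Python over dumped autostart entries; the launcher list and the entry format are illustrative, not any vendor’s actual detection logic:

```python
# Interpreters commonly abused for "fileless" persistence (illustrative list).
SUSPICIOUS_LAUNCHERS = ("powershell", "mshta", "wscript", "cscript", "rundll32")

def flag_autostart(entries):
    """entries maps autostart value names to command lines (e.g. dumped
    Run-key values); return those that invoke a script interpreter inline."""
    return {name: cmd for name, cmd in entries.items()
            if any(tok in cmd.lower() for tok in SUSPICIOUS_LAUNCHERS)}
```

A hunt like this surfaces the minimal, less obvious markers the paragraph above describes, even when no conventional executable file was dropped.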
Still, no matter how sophisticated a piece of malware is, there is always user error, which you can’t completely eliminate. “I still see insiders who inadvertently let malware loose – maybe they click on an email attachment or they let macros run from a Word document. Ultimately, if someone wants to run malicious code someplace, they will get it to where they want it.”
“People realize that threats are getting more sophisticated, but enterprises need more expertise too, and so we need to train people in these new skills,” he said. One challenge is being able to map out a plan post-infection. “What tasks do you perform first? Do you need to re-image an infected system? You need to see what the malware is doing, and where it has been across your network, before you can mitigate it and respond effectively,” he said. “It is more than just simple notification that you have been hit.”
I asked him to share one of his early missteps with me, and he mentioned when he worked for a startup tech company that was building web-based software. The firm wanted to make sure their systems were secure, and paid a third-party security vendor to build a very elegant and complex series of protective measures. “It was really beautiful, with all sorts of built-in redundancies. The only trouble was we designed it too well, and it ended up costing us an arm and a leg. We ended up overspending to the point where our company ran out of money. So it is great to have all these layers of protection, but you have to consider what you can afford and the business impact and your ultimate budget.”
Finally, we spoke about the progression of technology and how IT and security professionals are often unsure when it comes to the shock of the new. “First there were VLANs,” he said. “Initially, they were designed to optimize network performance and reduce broadcast domains. And they were initially resisted by security professionals, but over time they were accepted and used for security purposes. The same thing happened with VMs and cloud technologies. And we are starting to see containers become more accepted as security professionals get used to them. The trick is to stay current and make sure the tools are advancing with the technology.”
His earliest memory of a security issue was with managing people: “I have found that no matter how comprehensive our policies may be, if you don’t have the right culture among your workforce they won’t matter. Education, understanding, and inclusion are the ways to build the right security environment.”
He is drawn to tools that provide useful analytics. “With terabytes of data available to your team, finding the needle in the haystack can be a challenge. Each tool has its place in your security architecture, so picking one is difficult, but I prefer those that give me good information for analysis. That said, knowing your use cases and setting up your tools properly probably has the biggest impact on any security organization.”
His best advice for dealing with insider threats is to start with the basics. “Many companies have not taken adequate measures to protect their information or environments. At the lowest level, access provisioning, data classification, and updated antivirus and firewalls are all mandatory, but when new systems or services get introduced into your environment, the effects are often not well known. Protect against the drift.”
He sees MDM as a careful balance between protecting the employee and preventing unauthorized access. “At the core of the issue, no one wants their data put at risk and most users and organizations are willing to conform to a good policy in order to protect themselves.”
Paul Lanzi is the COO and co-founder of Remediant, an IT security startup that has created a product to protect privileged accounts. Prior to this startup, he worked for many years as an IT manager in the biotech field, managing various engineering teams for Genentech and then Roche.
Eleven years ago, when he started at Genentech, the first security problem he helped tackle was managing multiple accounts. “Everyone had multiple accounts and multiple passwords, and we built our own home-grown system to consolidate those accounts and make it easier for everyone to use a single username and password to get all of their work done. That actually improved security, since it lessened the chance that someone would have to write down their multiple passwords somewhere — but it also made it easier to ensure that every employee had the right access to do their job.”
Of course, today we have both single sign-on products to federate identities, such as Okta and Ping Identity, and identity governance products, such as SailPoint and RSA Archer. But back then this was hard work.
Lanzi’s best security tool has been multi-factor authentication. “I turn it on wherever I can; it is truly one of the most under-appreciated tools around. While it isn’t perfect, this technology sits in that rare sweet spot between simplicity and security,” he said. At his present firm he uses a combination of Google Authenticator and Yubikey Nano devices for this purpose. “I am amazed at how much crypto they can cram into that Nano form factor,” he said of the device, which is about the size of a thumbnail.
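Google Authenticator implements TOTP (RFC 6238): an HMAC over the current 30-second time-step counter, truncated to a short numeric code. The whole algorithm fits in a few lines of standard-library Python, shown here as a sketch and checked against the published RFC test vector:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, followed by the
    RFC 4226 dynamic-truncation step, reduced to `digits` decimal digits."""
    counter = struct.pack(">Q", unix_time // step)
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: this secret at T=59 yields 94287082 (8 digits, SHA-1).
assert totp(b"12345678901234567890", 59, digits=8) == "94287082"
```

Both the authenticator app and the server run this same computation; a login succeeds when the codes match, which is why the only shared state needed is the secret and a roughly synchronized clock.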
A decade or so ago, Lanzi was involved in rolling out 110,000 iPads globally at Genentech/Roche. “At the time, it was the largest non-education deployment of iPads in the world, and we used MobileIron’s MDM software to protect our data both at rest and in flight. Their MDM-based security capabilities gave us the ability to remotely wipe the fewer than 20 devices that were lost or misplaced each month, with the assurance that when those devices were lost, the data on them was still secure. We could also enforce minimum OS version standards, to ensure that users were keeping up to date with OS security updates.”
Genentech/Roche had a very unusual security staff, composed of folks from different departments. “We had separate teams for patching desktops and maintaining our network infrastructure, an IT security policy-writing group, an account-provisioning engineering group, and an overall security architect as well. They contributed to defense in depth because they were mutually supportive and worked together. That isn’t going to be possible in every enterprise, but we had terrific coverage across the various skills and potential threat areas. And given that our personnel were split across South San Francisco, Madrid, and Basel, Switzerland, it was pretty impressive.”
How has security changed among his various employers over the years? “It really depends on the level of support at the executive level. At Genentech/Roche, we had executives who understood the risks and the investment needed to minimize the security risks. Other places were behind the curve and more focused on creating policies and lagged with their investment in security infrastructure. Part of the issue is that unlike in the retail or government sectors, biotech hasn’t had the big-news breaches to motivate organizations towards security improvements.”
Bryan Doerr has been involved with tech companies for decades, most recently leaving Savvis/CenturyLink, where he was CTO, before agreeing to help bootstrap Observable Networks. I asked him to reflect on his career and on where the infosec industry is headed in general. “There is a lot of security industry maturation still to come, a lot of wood left to chop,” he told me in a phone interview last week. “While there are still some pockets of maturity here and there, they usually are only found at the largest companies, who can afford it.”
Looking back more than a decade, the biggest change has been being able to deliver security as a subscription service, he said. “First we had pre-built security appliances, but lately we have seen managed detection and response services,” such as what his company delivers. “And it isn’t just a change in how protection is delivered, but how the subscription service can be more affordable for mid-market customers.”
Another big change is how end user customers finally are getting some benefit from sharing threat intelligence. “No one wanted to talk about where or how they were attacked and share these specifics with anyone else,” he said. This intelligence sharing has made the subscription service vendors more potent and compelling and has boosted the ability to respond effectively to threats.
“Ten years ago security was built on a simple idea: that we know about our attackers and threats, and through some means we could prevent those bad guys from getting inside our networks. Back then, we had a limited number of threats, so we could more readily recognize and block them. That is so far from where we are today. The fundamental nature of what is a threat and how attacks use technology has changed completely. The idea of tracking attack signatures makes a lot less sense when every attack is unique.”
Doerr agrees that the days of the perimeter being the sole point of defense are long over. As an example, he points to the recent IoT botnet attacks.
One benefit of the last decade has been the move toward increasing virtualization. “This absolutely was a positive influence, and helped us to better design and operate more secure systems and more complex infrastructure,” he said. “Before virtualization, we had too many different fiefdoms dedicated to particular circumstances. Each one had different configurations and staffs who were maintaining them. All of that variation left us vulnerable.”
But with virtual machines, “a lot of automation has been brought to bear to keep a consistent environment running. That means we can provision VMs, kill them off, and recreate them easily. This makes it more efficient to scale up and down and we don’t have to spend our time patching systems.”
Another issue is the nature of modern network traffic. “Our networks are becoming increasingly encrypted; we can’t even see what is going on over the wire or view the payloads, and this adds another layer of difficulty. Right now less than half of all traffic is encrypted, but it won’t be long before it becomes 100%. We won’t be able to readily examine any of this traffic, which will make networks harder to defend and exploits harder to detect.”
When he was at Savvis, one memorable experience was upgrading one of their data centers. Thanks to a routing bug, the entire data center couldn’t come back online. “We tripped over it on a Saturday, and didn’t immediately understand what was happening. It was easy to miss the single use case that caused the problem. That was a humbling experience, and it gave me an appreciation of the magnitude of the business that we had running. You don’t feel it until something terrible happens and you see how significant these outages are.” The situation drove home the point that he needed to stay in touch with his technology and understand that it is not just an abstraction but a very real entity.
I asked him who had the better job, the CTO or the CIO? He was firmly behind the CTO position. “CTOs will have jobs forever, because they help organizations understand the evolution of technology and anticipate the direction of that evolution. The CIOs still have some soul searching to do.”
David O’Berry is a former CIO of a state agency with 1,000 employees; he now works for a security vendor. To give you an idea of his credentials, he holds the CISSP‑ISSAP, ISSMP, CCSP, CRISC, CSSLP, MCNE, CSPM, and CRMP!
He met his wife in college when a virus erased his senior thesis text and its backups; luckily she was both a fast typist and a good sport. “That was by far the most expensive virus of my entire career!” Later on he had to tackle another floppy-borne virus, which was difficult because he had to run around the office finding infected disks and literally destroying them. He also faced down the Nachi/Welchia worm, which infected a PC that had not been patched because its user was out on maternity leave.
“When I was a CIO, imaging software probably saved us the most time and had the strongest impact initially along with mail filtering products and endpoint management tools for remote control. Besides these products, I believe that standardization of what we did and how we did it had the single largest impact on our organization being able to progress as rapidly as we did with as limited resources as we had.”
For fighting insider threats, “you have to have contextually aware DLP and scanning products, as well as what I call ‘distributed peer review’ by the nodes that attach to the environment. Each node has to contribute to the survival of the organism by being a sensor in the larger scheme of things.” He has seen plenty of ransomware, and feels that “first and foremost it is a test of backup and recovery plans. Having a known state in that area fell out of vogue for a while, but now it is more important than ever, even if it seems like boring blocking and tackling.”
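One concrete way to maintain the “known state” O’Berry describes is to record a cryptographic digest of each backup artifact when it is taken and verify it before restoring, so a ransomware-corrupted or tampered copy is caught before it goes back into production. A minimal Python sketch:

```python
import hashlib
import hmac

def file_digest(path: str, algo: str = "sha256", chunk: int = 1 << 16) -> str:
    """Hash a backup artifact in chunks so large files need not fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verify_backup(path: str, expected_digest: str) -> bool:
    """Compare against the digest recorded when the backup was taken."""
    return hmac.compare_digest(file_digest(path), expected_digest)
```

The recorded digests should themselves live somewhere the ransomware can’t reach, such as offline or append-only storage, or the check proves nothing.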
At his current employer, “we do use MDM and allow BYOD. At my former agency, we had not adopted BYOD when I left, but we had made the entire workforce mobile and managed it accordingly. We also implemented Imprivata for its single sign-on package.”
When it comes to securing the cloud and his cloud-based servers, “there are similar challenges to what we have been pursuing since the dawn of time. Visibility is king. Constructs that give you real-time visibility give you the edge over any other type of product when coupled with real-time mitigation and resilience.”
Now that he is on the vendor side, “I would say that the state of cybersecurity has gotten a lot worse since I made the jump, because the pace of innovation and change has gone vertical and never stops. Malware creators have become more and more adept at attacking the exploding number of devices. I believe we have a chance to get out in front of the next phase of this, but to do so we have to share information in real time and allow companies to participate without artificial barriers to entry. However, our window of opportunity is closing rapidly.”
Being the CIO of a non-profit gives you an entirely different perspective in terms of managing people, resources, and technologies.
David Goodman would know. He has been involved with managing IT operations for different non-profits for most of his professional career. He used to be the CIO of the International Rescue Committee and currently is the CIO-in-Residence at NetHope, an umbrella organization that serves as a resource for some of the world’s largest non-profit aid organizations.
“The biggest challenge for non-profits with IT is that few people understand it in that context. We usually don’t have a roadmap for implementing any new technology, or a sizable staff to do it. Many organizations don’t have any dedicated infosec staff, or if they do, they have only one person for the task.”
Often, IT takes a hit from unplanned consequences that have more to do with where the non-profit is located than with the technology itself. For example, he tells the story of a non-profit that opened an office in a very insecure country. “We opened an office there to help refugees, which is our mission. We made connections with the local militia to make sure that we were permitted to do this, and we didn’t have any issues until one day our office was overrun by the militia and our people were taken hostage. They didn’t like what we were doing. While that doesn’t happen too often, it was pretty scary for our staff and volunteers. They took all of our computing equipment. Eventually, we were able to get them to release everyone, although two Americans were held in a hotel for a few extra weeks.”
Planning for this situation is a challenge, as you might expect. But the office had no incident response framework and no security policies. “There were passwords written on whiteboards. There were staffers using personal Skype accounts to communicate with headquarters. Because all the laptops were stolen, the rebels were able to use the staff’s personal Skype accounts, which were set to auto-login, and send messages impersonating the staff. We couldn’t easily shut down those personal accounts.” Eventually all personnel returned safely and everyone was accounted for. But they lost all their equipment: “that was never seen again.”
Few IT managers or CIOs have to deal with this kind of situation. “It is pretty nasty stuff, and it is because of the nature of how many international nonprofits operate and the places they have their offices are often in conflict areas. This means we don’t just worry about IT security, but the safety of the staff too.”
Here is another example. At one international non-profit, he wanted to improve the organization’s password policies. The issue was that many of the staffers were scattered around the world and didn’t regularly log in to the enterprise Active Directory domain controller, which meant that they didn’t get regular notifications of expiring passwords. “So for the field staff, we set their domain passwords not to expire. As you might imagine, this wasn’t great infosec policy, so I tried to implement a better one that had complexity and change management built in. I got buy-in from senior management and approval from the CEO. We were ready to implement it, and I sent a reminder email to some of the affected parties, including the CEO.”
Then the CEO scuttled the whole idea. “He told me that he had been using the same password for more than 30 years and wasn’t about to change it now. So the very straightforward and approved password policy was shelved, and there are probably still hundreds of people using non-expiring passwords around the organization.” Goodman couldn’t get him to understand why a better password policy matters.
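The complexity side of a policy like the one Goodman proposed is easy to state in code. A Python sketch with hypothetical rules (the minimum length and the four required character classes are illustrative, not the policy his organization drafted):

```python
import string

def meets_policy(password: str, min_length: int = 12) -> bool:
    """Require a minimum length plus at least one character from each of
    four classes: lowercase, uppercase, digits, and punctuation."""
    classes = (string.ascii_lowercase, string.ascii_uppercase,
               string.digits, string.punctuation)
    return (len(password) >= min_length
            and all(any(ch in cls for ch in password) for cls in classes))
```

The change-management half of the policy, forcing periodic rotation even for rarely connected field staff, is the part that requires the directory-server configuration Goodman never got to ship.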
All is not gloom and doom however. At NetHope, he is working with a number of major donors, including the Gates Foundation and MasterCard International, to create non-profit specific security controls that can be used for guiding IT auditing and compliance. “We will have a set of best practices on how to appropriately secure critical data, all based on existing standards like ISO, NIST, and PCI. We will also provide implementation guidance so that nonprofits without dedicated info sec staff — which is nearly all of them — will know how to implement these controls.”
Ravi Ravishanker is the CIO and Associate Provost at Wellesley College in Massachusetts. He has been in IT for many years and supports an organization with more than 1,400 faculty and staff. I spoke to him in September 2016. “Information security has continued to be one of the highest priorities for every IT organization I have worked for. The only difference is that it has become harder, and its relative importance compared to everything else we do has grown, which results in a much higher resource allocation to security across the entire institution.”
He recalls beginning his IT career back in 1986, writing assembler code on a VAX/VMS system to make programs execute faster. “However, we made a programming error in a utility that let one user send a file to another using TCP/IP. Because of that internal security lapse, the students found out they could send someone else’s files using our program. It didn’t take long to fix the problem, fortunately.” Coming to the modern day, he finds that vulnerability scanners are among his most important security tools. “This is because they expose network ports that shouldn’t be open. Similarly, scanners that test our web apps for a range of vulnerabilities are also essential.”
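The raw signal such a scanner starts from, which TCP ports accept a connection, can be sketched with nothing but the Python standard library; real scanners such as Nmap layer service fingerprinting, timing control, and vulnerability checks on top of the same idea:

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the ports on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                found.append(port)
    return found
```

Only ever point something like this at hosts you are authorized to scan; even this trivial probe shows up in logs and intrusion-detection alerts.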
“We realize that given our limited resources, we have to be very diligent. First and foremost, data and network security needs to be a priority for everyone in the IT organization, not just a select group of security administrators. Also, security is a joint partnership between IT and our users; it is a shared responsibility of the entire enterprise. If our users aren’t following best practices, they can expose our enterprise to data security issues. Security is a critical part of everything that we do.”
To date, he hasn’t seen much in the way of insider threats at the college. “People in higher education have a sense of loyalty to the institution, and we place a lot of trust in our employees. While insider threats are always a potential issue, we are in a space where it is minimal.”
The college has moved into the cloud and continues to increase its cloud footprint. “We try to do as much due diligence when we sign up with a new provider and make sure that they are giving us the security that we need. We thoroughly review the contracts and agreements from security and compliance perspectives before signing up with a provider.”
“We are a fairly small IT organization, and currently our user services group, which manages desktop support, and the systems and network groups are all under one director. This works really well in terms of information exchange between the groups and easy access to the systems and network engineers. We recently decided to reorganize, and we hope that this relationship will be preserved, because it is critical from an information security perspective.”