CSOonline: Top 7 security mistakes when migrating to cloud-based apps

With the pandemic, many businesses have moved to more cloud-based applications out of necessity because more of us are working remotely. In a survey by Menlo Security of 200 IT managers, 40% of respondents said they are facing increasing threats from cloud applications and internet of things (IoT) attacks because of this trend. There are good and bad ways to make this migration to the cloud and many of the pitfalls aren’t exactly new. In my analysis for CSOonline, I discuss seven different infosec mistakes when migrating to cloud apps.


Avast blog: Covid tracking apps update

After the Covid-19 outbreak, several groups got going on developing smartphone tracking apps, as I wrote about last April. Since that post appeared, we have followed up with this news update on their flaws. Given the interest in using so-called “vaccine passports” to account for vaccinations, it is time to review where we have come with the tracking apps. In my latest blog for Avast, I review the progress on these apps, some of the privacy issues that remain, and what the bad guys have been doing to launch Covid-themed cyber attacks.

Book and courseware review: Learning appsec from Tanya Janca

If you are looking to boost your career in application security, there is no better place to start than by reading a copy of Tanya Janca’s new book Alice and Bob Learn Application Security. The book forms the basis of her excellent online courseware on the same subject, which I will cover in a moment.

Janca has been doing security education and consulting for years and is the founder of We Hack Purple, an online learning academy, community, and weekly podcast that revolves around teaching everyone to create secure software. She lives in Victoria, BC, one of my favorite places on the planet, and is one of my go-to resources to explain stuff that I don’t understand. She is a natural-born educator, with a deep well of resources that comes not just from being a practitioner but from being someone who just oozes tips and tools to help you secure your stuff.

The book is both a crash course for newbies and a refresher for those who have been doing the job for a few years. I learned quite a few things, and I have been writing about appsec for more than a decade. The primary audience is application developers, but the book can also be a useful organizing tool for IT managers who are looking to improve their infosec posture, especially these days when just about every business has been penetrated by malware, suffered data leaks, or could become the target of the latest Internet-based threat. Everyone needs to review their application portfolio carefully for potential vulnerabilities, since many of us are working from home on insecure networks and laptops.

Her rough organizing framework for the book is the classic system development lifecycle that has been used for decades. Even as software development has shifted to more agile and containerized sprints, the concept is still worth using, provided security is considered as early in the cycle as possible. My one quibble is that while this framework is fine, many developers don’t want to deal with it, at their own peril, sadly. For the vast majority of folks, though, this is a great place to start.

Alice and Bob are infosec’s dynamic duo, often serving as foils for good and bad practices, and Janca uses them as teaching examples that seem drawn from events at her previous employers and consulting gigs.

For example, you’ll learn the differences between pepper and salt: not the condiments but their security implications. “No person or application should ever be able to speak directly to your database,” she writes; the only exceptions are your own apps and your database admins. What about applications that pass variables in a URL string? Not a good idea, she says, because a user could see someone else’s account, or you could leave your app open to a potential injection attack. “Never hard code anything, ever” is another suggestion: hard-coded values mean you can’t trust the application’s output, and the secrets embedded in your code could expose sensitive data.
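These last two points are easy to show in code. Here is a minimal sketch (the table, variable, and function names are my own, not from the book) contrasting an injection-prone query built from a URL parameter with a validated, parameterized one, and a hard-coded secret with one pulled from the environment:

```python
import os
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, owner TEXT, balance REAL)")
conn.execute("INSERT INTO accounts VALUES (42, 'alice', 100.0)")

# BAD: building SQL from a URL parameter invites injection, and lets a
# user view someone else's account just by editing the query string.
# account_id = request.args["id"]          # e.g. "42 OR 1=1"
# conn.execute("SELECT * FROM accounts WHERE id = " + account_id)

# BETTER: validate the input, then use a parameterized query.
def get_account(account_id: str):
    if not account_id.isdigit():
        raise ValueError("invalid account id")
    cur = conn.execute(
        "SELECT owner, balance FROM accounts WHERE id = ?",
        (int(account_id),),
    )
    return cur.fetchone()

# BAD: a hard-coded secret lives forever in your source repository.
# DB_PASSWORD = "hunter2"

# BETTER: pull secrets from the environment or a secrets manager.
db_password = os.environ.get("DB_PASSWORD")

print(get_account("42"))  # ('alice', 100.0)
```

The parameterized version never concatenates user input into SQL, so a string like `42 OR 1=1` is rejected before it ever reaches the database.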

“When data is sensitive, you need to find out how long your app is required to store it and create a plan for disposing of it at the end of its life,” she writes. Another great suggestion for testing the security of your design is to look for places where there is implied trust, then remove that trust and see what breaks in your app.

Never write your own security code if you can use the routines that are part of your app dev framework. And spend time improving your “soft skills” as a developer, meaning learning how to communicate with your less-technical colleagues. “This is especially true when you feel that the sky is falling and you aren’t getting any management buy-in for your ideas.”

One topic she returns to frequently is what she calls technical debt: the implied cost of reworking code after programmers make quick-and-dirty development decisions. Those shortcuts eventually catch up with you and can have major security implications. She talks about how to be on the lookout for this style of thinking and how to avoid it.

Let’s move on to talk about the online classes.

The classes will cost $999 (with an option to interact directly with her for 30 minutes for an additional $300) but are certainly worth it. They cover three distinct areas, all of which are needed if your code is going to stand up against hackers and other adversaries.

The first course is for beginners, and covers the numerous areas of appsec that you will need to understand if you are going to be building secure apps from scratch, or trying to fix someone else’s mess. Even though I have been testing and writing about infosec for decades, I still managed to learn something from this class.

If you are not a beginner, and if you are just aiming to learn more for yourself, then you should probably just focus on the third class. The second class goes into more detail about how to create a culture at your organization where appsec is part of everyone’s job. If you aren’t going to be managing a development team, you might want to return to this class later on.

There are certainly many sources of online education, but surprisingly few offer the range and depth that Janca has put together. Google and Microsoft have free classes to show you how to use their clouds, but they aren’t as comprehensive or as useful, especially for beginners who may not even know how to frame the right questions or articulate what they want to learn about appsec. And both OWASP and SANS, normally my go-to places to learn something technical, are also deficient on the practice of appsec, although both have developed many open-source tools, cheat sheets, and other supporting materials for developing secure apps. Thus Janca’s courseware fills an important niche.

The textbook for all three classes is her excellent Alice and Bob book mentioned above. Yes, you could probably learn some of the things by just reading the book without taking the classes, but you would have to work a lot harder, especially if you are more of an auditory learner. Watching and listening to Janca explain her way through numerous different tools that you’ll need to build your apps securely is worth the price of the courses: you are in the presence of a master teacher who knows her stuff.

One thing missing from the trio of classes is any product-specific discussion. (She covers this separately.) I understand why she made that choice, but I think eventually you will be frustrated and wish you had a little more context on how a piece of defensive or detection software actually works, because that is how I, as an experiential learner, figure these things out.

All in all, I highly recommend the sequence, with the above caveats. We all need to move in the direction of making all of our apps more secure, and Janca’s courseware should be required for anyone and everyone.

Apple’s App Store: monopoly or digital mall?

Another salvo in the legal battle between Apple and its developers was fired last month. The EU Commission is following up on a complaint from Spotify that says Apple’s practices are anti-competitive and designed to block the popular music streaming service. Apple has two policies at issue: one prevents app creators from linking to payment methods from within the app other than subscriptions, and another limits users to making payments through in-app purchases. Together, these policies force developers to pay Apple commissions on these payment streams: nearly a third in the first year and 15% in subsequent years.

This follows the US Supreme Court ruling that iPhone customers could sue Apple for allegedly operating the App Store as a monopoly that overcharges people for software. So far no action has resulted from this case, and legal experts say it will probably take several years to wind its way through the courts. Another lawsuit, filed in US District Court in San Jose by two app developers, also accuses Apple of being a monopolist.

Andy Yen of ProtonMail posted this blog entry last month, saying “We have come to believe Apple has created a dangerous new normal allowing it to abuse its monopoly power through punitive fees and censorship that stifles technological progress, creative freedom, and human rights. Even worse, it has created a precedent that encourages other tech monopolies to engage in the same abuses.” He states further that “It is hard to stay competitive if you are forced to pay your competitor 30% of all of your earnings.” 

Of course, Apple disputes all of these charges, saying that it is just a digital mall where the tenants (the developers) are just paying rent. Nevertheless, it is the only mall when it comes to providing iOS apps. Apple claims it needs some compensation to screen out malware and badly coded apps and claims that the vast majority of apps in its store are free with no payments collected from developers. “We only collect a commission from developers when a digital good or service is delivered through an app.” The company explained its practices in this post in May, and cited a number of instances where third-party app developers compete with its own apps such as iCloud storage, the Camera, Maps and Mail apps.

Tim Cook thinks nobody “reasonable is going to come to the conclusion that Apple’s a monopoly. Our share is much more modest. We don’t have a dominant position in any market.” I disagree. From where I sit, this seems very similar to what Microsoft went through back in the 1990s. You might remember that the US government ruled that Microsoft’s Windows business practices were anti-competitive and that the company operated as a monopoly.

There are some differences between Microsoft then and Apple now: Apple doesn’t have a dominant share of the mobile OS market outside the US (Google’s Android has 75% of the market), whereas Microsoft had 90% of the PC OS market. But still, the Apple App Store represents a high barrier to entry for app developers, and consumers do suffer as a result.

The privacy challenges of contact tracing by smartphone apps

A number of countries — and now individual US states — are planning or have rolled out their smartphone-based contact tracing apps, in the hopes of gaining insight into the spread of infections. As you might imagine, this brings up all sorts of privacy implications and challenges. Before I review where in the world you can find an ailing Carmen Sandiego, let’s look at the four major development projects now underway.

  • The most well-known is a joint project from Google/Alphabet and Apple that is more a framework than an actual app. Vaughan-Nichols explains the actual mechanics and The Verge answers some of the questions about this effort. The UK is poised to test its app based on this framework soon. Both vendors have stated that these protocols will be incorporated into releases of Android and iOS later this summer.
  • An open-source EU-based effort called DP-3T has developed an Apache/Python reference implementation here on GitHub. There are sample apps for Android and iOS, too.
  • A second joint EU-based closed-source effort called PEPP-PT has gotten support from 130 organizations in eight different countries. No current apps are yet available to my knowledge on either EU effort.
  • Finally, there is something called BlueTrace/OpenTrace, open-source code developed by Singapore as part of its tracing app, TraceTogether, which launched in late March. So far no one else has made use of the code.

All four proposals — I hesitate to call them implementations — are based on a few common principles:

  • When a match with a known infected user is made, all data is collected and stored locally. The idea is to preserve a user’s privacy, but still give public health officials some insight into the users’ movements. Some of the implementations combine local and centralized health data, such as the PEPP framework and Singapore’s app.
  • The contacts are found through the use of Bluetooth low energy queries from your phone to nearby phones. These can reach up to a hundred feet in open air. The ACLU is worried that this data isn’t all that accurate, and has raised other privacy issues in this paper.
  • There are various encryption protocols and layers, some better than others. The goal here is to anonymize the user data and keep hackers at bay. Some information and interfaces are documented, some things aren’t yet published or won’t be made public. And of course no system is 100% fail safe.
  • The apps all rely on the GPS network, which limits their utility given that precise locations aren’t really possible. Some efforts are more sophisticated in cross-checking with the user’s common locations and Bluetooth contacts, but this is very much an inexact science. Taiwan tries to get around this by having the user call the health department and cross-check their own location history against this repository and request a test if there was an intersection.
  • Usually, the local health agency interacts with the tracking data — that is the whole point of these things. But as in the case of Singapore, do we really want a central point where potential privacy abuse could happen? How long does the agency keep this location data, for example?

You can see where I am going with this analysis. We have a lot of things to juggle to make these apps really useful. One of the biggest issues is the need to combine tracking with testing to verify the spread of infection. This paper from Harvard goes into some of the details about how many tests will be needed for tracking to be effective. As you can guess, it is a lot more testing than we have done in the US.

Yes, many of us are now sticking close to home, obeying the recommendations or, in some cases, the varying local rules. (Israel, for example, doesn’t allow anyone to travel very far from home.) But some of us aren’t obeying, or have to travel for specific reasons. And what about folks who have gotten the virus and haven’t gotten sick? Should they be allowed to travel with some sort of document or (as Bill Gates has suggested) a digital signature?

This page on Wikipedia (while I don’t like citing it, folks seem to be keeping the page updated) lists more than a dozen countries that have apps deployed. India has multiple apps deployed by various state agencies. There are also apps available in China, Israel, Norway, Ghana, the Czech Republic, and Australia. Take a look at the various links and make your own comparisons.

What should you do? In many places, you don’t have much choice, particularly if you recently returned home from outside the country. For those of us that have a choice, if you don’t like the idea, then don’t install any of these apps, and when the phone operating systems update over the summer, remember to turn off the “contact tracing” setting. If any of you are active in the efforts cited here, please drop me a note, I would love to talk to you and learn more.

Red Hat blog: containers last mere moments, on average

You probably already knew that most of the containers created by developers are disposable, but did you realize that half of them are only around for less than five minutes, and a fifth of them last less than ten seconds? That and other fascinating details are available in the latest annual container report from Sysdig, a container security and orchestration vendor.

I mention that fun fact, along with other interesting trends in my latest blog post for Red Hat’s Developer site.

Red Hat Developer website editorial support

For the past several months, I have been working with the editorial team that manages the Red Hat Developer website. My role is to work with the product managers, open source experts, and editors to rewrite product descriptions and place the dozens of Red Hat products into a more modern, developer-friendly context. It has been fun to collaborate with a very smart and dedicated group. This work has been unbylined, but you can get an example of what I have done with this page on ODO and another page on CodeReady Containers.

Here is an example of a bylined article I wrote about container security for their blog.

How to protect your mobile apps using Zimperium’s zIAP SDK (screencast)

If you are looking for a way to protect your Android and iOS apps from malware and other mobile threats, you should look at Zimperium’s In-App Protection (zIAP) SDK. It supports both Apple Xcode for iOS apps and Android Studio for Android apps. One advantage of zIAP is that you don’t have to redeploy your code, because changes are updated dynamically at runtime and automatically pushed to your devices. zIAP ensures that mobile applications remain safe from cyber attacks by providing immediate device risk assessments and threat alerts. Organizations can minimize exposure of their sensitive data and prevent their customers’ and partners’ data from being jeopardized by malicious and fraudulent activity. I tested the product in April 2019.

Pricing starts at $12,000 per year for 10,000 monthly active devices, with steep quantity discounts available.

https://go.zimperium.com/david-strom-ziap


CSOonline: Top application security tools for 2019

The 2018 Verizon Data Breach Investigations Report says most hacks still happen through breaches of web applications. For this reason, testing and securing applications (the subject of my CSOonline article last month) has become a priority for many organizations. That job is made easier by a growing selection of application security tools. I put together a list of 13 of the best ones available, with descriptions of the situations where they can be most effective, highlighting both commercial and free products. Commercial vendors rarely publish list prices, and their products are often bundled with other tools from the vendor with volume or longer-term licensing discounts. Some of the free tools, such as Burp Suite, also have fee-based versions that offer more features. You can review my list in CSOonline here.


CSOonline: What is application security and how to secure your software

Application security is the process of making apps more secure by finding and fixing vulnerabilities and strengthening their defenses. Much of this happens during development, but it also includes tools and methods to protect apps once they are deployed. This is becoming more important as hackers increasingly target applications with their attacks.

In the first of a two-part series for CSOonline, I discuss some of the reasons why you need to secure your apps and the wide variety of specialized tools for securing mobile apps, for network-based apps, and for firewalls designed especially for web applications. Next month, I will recommend some of these products.