The first decision you need to make in your smart home journey is selecting the right ecosystem. By ecosystem, I mean the voice-activated smart hub that delivers audio content from the Internet (such as news, weather, and answers to other queries) and serves as the main interface to a variety of other smart home devices, such as lighting, thermostats, and TVs. In this review I look at two of the three main hubs, from Google (the white-topped taller unit on the right) and Amazon (the smaller black unit on the left), and how they stack up.
This is the second in a series of articles on how to successfully and securely deploy smart home technology. The first one can be found here.
As more banking customers make use of mobile devices and apps, the opportunities for fraud increase. Mobile apps are also harder to secure than desktop apps because they are often written without any built-in security measures. Plus, most users simply download an app from the major app stores without checking whether they are getting a legitimate version.
Besides security, mobile apps have a second challenge: to be as usable as possible. Part of the issue is that the usability bar is continuously being raised, as consumers expect more from their banking apps.
In this white paper for VASCO, I show a different path. Mobile banking apps can satisfy the twin goals of usability and security. Usability doesn’t have to come at the expense of a more secure app, and security doesn’t have to come at the cost of making an app more complex to use. Criminals and other attackers can be neutralized with the right choices that are both usable and secure.
As the Internet of Things (IoT) becomes more popular, state and local government IT agencies need to play more of a leadership role in understanding the transformation of their departments and their networks. Embarking on any IoT-based journey requires governments and agencies to go through four key phases, which should be developed in the context of creating strategic partnerships between business lines and IT organizations. Here is more detail on these steps, published in StateTech Magazine this month.
Today I begin a series of reviews in Network World around smarter home products. Last year we saw the weaponized smart device as the Mirai botnet compromised webcams and other Internet-connected things. Then earlier this year we had Vizio admitting to monitoring its connected TVs, and more recently there was this remote TV exploit; even dishwashers aren’t safe from hackers.
Suddenly, the smart home isn’t smart enough, or maybe it is too smart for its own good. We need to take better care of securing our homes from digital intruders. The folks at Network World asked me to spend some time trying out various products, using a typical IT manager’s eye toward making sure they are set up securely.
Those of you who have read my work know that I am very interested in home networking: I wrote a book on the topic back in 2001 called The Home Networking Survival Guide and have tried out numerous home networking products over the years. My brief for the publication is broadly defined, and I will look at all sorts of technologies that the modern home would benefit from, including security cameras, remote-controlled sensors, lighting and thermostats, and more.
Smart home technology has certainly evolved since I wrote my book. Back then, wireless was just getting started and most homeowners ran Ethernet through their walls. We didn’t have Arduino and Pi computers, and many whole house audio systems cost tens of thousands of dollars. TVs weren’t smart, and many people were still using dial-up and AOL to access the Internet.
Back in the early 2000s, I visited John Patrick’s home in Connecticut. A former IBMer, he designed his house like an IBM mainframe, with centralized control and distributed systems for water, entertainment, propane gas, Internet, and other service delivery. He was definitely ahead of his time in many areas.
When I wrote about the Patrick house, I said that for many people, defining the requirements for a smart home isn’t always easy, because people don’t really know what they want. “You get better at defining your needs when you see what the high-tech toys really do. But some of it is because the high-tech doesn’t really work out of the box.” That is still true today.
My goal with writing these reviews is to make sure that your TV or thermostat doesn’t end up being compromised and being part of some Russian botnet down the road. Each article will examine one aspect of the secure connected home so you can build out your network with some confidence, or at least know what the issues are and what choices you will need to make in supporting your family’s IT portfolio of smart Things.
Since I live in a small apartment, I asked some friends who live in the suburbs if they would be interested in being the site of my test house. They have an 1,800-square-foot, three-bedroom house on one level with a finished basement, and are already on their second smart TV. One of them is an avid gamer with numerous gaming consoles. Over the past several months (and continuing throughout the remainder of this year), we have tried out several products. In my first article, posted today, we cover some of the basic issues involved and set the scene.
So you have read The Lean Startup. You have suffered through following several agile blogs (such as this one). You think you are ready to join the cool kids and have product scrums, stand-up meetings, and all that other stuff. Now you need an implementation plan.
Maybe it is time to read this post by Paul Adams on the Intercom blog. He describes some of the lessons he and his development team have learned from building software and scaling it up as the company grows. I asked a few of my contacts at startup software firms what they thought of the post, and there was mostly general agreement with his methodology.
Here are some of Adams’ main points to ponder.
Everyone has a different process, and the process itself changes as the company matures and grows. But his description is for their current team size of four product managers, four software designers, and 25 engineers. Like he says: “it’s not how we worked when we had a much smaller team, and it may not work when we have doubled our team.”
Create a culture where you can make small and incremental steps with lots of checkpoints, goals, and evaluations. “We always optimize for shipping the fastest, smallest, simplest thing that will get us closer to our objective and help us learn what works.” They have a weekly Friday afternoon beer-fueled demo to show how far they have gotten in their development for the week. Anyone can attend and provide comments.
Face time is important. While a lot of folks can work remotely, they find productivity and collaboration increase when everyone is in the same room in a “pod.” Having run many remote teams myself, I agree that local pods can be better, but with the right managers you can pull off remote teams too. It appears IBM has been moving in this “local is better” direction lately.
Have small teams and make them strictly accountable. Adams has a series of accountability rules for when something goes wrong. Create these rules and teams and stick by them. “We never take a step without knowing the success measurement,” said one friend of mine, who agrees with much of what Adams says in his post. My friend also mentions that when using small teams, “not all resources have a one-to-one relationship in terms of productivity; we find that the resources we use for prototyping new features can generally float between teams.”
Have a roadmap but keep things flexible and keep it transparent. “Everything in our roadmap is broken down by team objective, which is broken down into multiple projects, which in turn are broken down into individual releases,” said Adams. They use the Trello collaboration tool for this purpose, something that can either be a terrific asset or a major liability, depending on the buy-in from the rest of the team and how faithful they are to keeping it updated.
However, caution is advised: “The comprehensive approach that Adams describes would be entirely too much overhead for most startups,” says my friend. This might mean that you evaluate what it will take to produce the kind of detail that you really need. And this brings up one final point:
Don’t have too many tools, though. “Using software to build software is often slower than using whiteboards and Post-it notes. We use the minimum number of software tools to get the job done. When managing a product includes all of Google Docs, Trello, Github, Basecamp, Asana, Slack, Dropbox, and Confluence, then something is very wrong.”
As you loyal readers know (I guess that should just be “readers,” since the qualifier implies some of you are disloyal), I have been using and writing about email encryption for two decades. It hasn’t been a bowl of cherries, to be sure. Back in 1998, when Marshall Rose and I wrote our landmark book “Internet Messaging,” we said that the state of secure Internet email standards and products was best described as “a sucking chest wound.” Lately I have seen some glimmers of hope in this much-maligned product category.
Last week Network World posted my review of five products. Two of them I reviewed in 2015: HPE/Voltage Secure Email and Virtru Pro. The other three are Inky (an end-to-end product), Zix Gateway, and Symantec Email Security.cloud. Zix was the overall winner. We’ll get to the results of these tests in a moment.
In the past, encryption was frankly a pain in the neck. Users hated it, either because they had to manage their own encryption key stores or because they had to go through additional steps to encrypt and decrypt their message traffic. As a consequence, few people used it for their email, and most did so under protest. One of the more notable “conscientious objectors” was none other than the inventor of PGP himself, Phil Zimmermann. In this infamous Motherboard story, the reporter tried to get him to exchange encrypted messages. Zimmermann sheepishly revealed that he was no longer using his own protocols, due to difficulties in getting a Mac client operational.
To make matters worse, if a recipient wasn’t using the same encryption provider you were, sending a message was a very painful process. If you had to use more than one system, it was even more trouble. I think I can safely say those days are coming to an end, and encryption is becoming almost completely frictionless.
By that I mean that there are situations where you don’t have to do anything, other than click on your “send” button in your emailer and off the message goes. The encryption happens under the covers. This means that encryption can be used more often, and that means that companies can be more secure in their message traffic.
This comes just in time, as the number of email-related hacks is increasing. And it is happening not only with email traffic but with texting/instant-message chats as well. Last week Check Point announced a way to intercept supposedly encrypted traffic from WhatsApp, and another popular chat service, Confide, was also shown to be subject to impersonation attacks.
So will that be enough to convince users to start using encryption for normal everyday emailing? I hope so. As the number of attacks and malware infections increase, enterprises need all the protection that they can muster and encrypting emails is a great place to start.
What I liked about Zix and some of the other products I tested this time around was that they took steps to hide the key management from the users. Zimmermann would find this acceptable, to be sure. Some other products come close by using identity-based encryption, which makes it easier to on-board a new user into their system with a few simple mouse clicks.
What I also found intriguing is how Zix and others have incorporated data loss prevention (DLP) detection into their encryption products. All of these systems detect when sensitive information is about to be transmitted via email, and they take steps to encrypt or otherwise protect the message, both in transit and in how it will ultimately be consumed on the receiving end.
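To make the idea concrete, here is a minimal sketch (in Python) of how a DLP-aware email gateway might decide whether an outgoing message needs encryption. The patterns and function names here are invented for illustration; real products use far richer detectors than a couple of regular expressions.

```python
import re

# Hypothetical patterns a DLP engine might scan for before a message
# leaves the gateway (illustrative only, not any vendor's rule set).
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def must_encrypt(body: str) -> bool:
    """Return True if the outgoing message contains sensitive data."""
    return any(pattern.search(body) for pattern in PATTERNS.values())

def send(body: str) -> str:
    # A real gateway would encrypt transparently and forward the message;
    # here we just report which path the message would take.
    return "encrypted" if must_encrypt(body) else "cleartext"
```

The point of the design is that the sender does nothing special: the gateway inspects every message and applies protection only when the content demands it.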
DLP has gone from “nice to have” to an essential part of business compliance, and high-profile data leaks have only increased its importance. Having this integration can be a big selling point when making the move to an encrypted email vendor, and we are glad to see this feature getting easier to use and to manage in these products.
Finally, the products have gotten better at what I call multi-modal email contexts. Users today are frequently switching from their Outlook desktop client to their smartphone email app to a webmailer for keeping track of their email stream. Having a product that can handle these different modalities is critical if it is going to make a claim towards being frictionless.
So why did Zix win? It was easy to install and manage, well documented, and had plenty of solid encryption features (see the screenshot here). Its only downside was the lack of a mobile client for composing encrypted messages, but it got partial credit for a responsively designed webmailer that worked well on a phone’s small screen. Zix also includes its DLP features as part of its basic pricing structure, another plus.
We have come a long way on the encrypted email road. It is nice to finally have something nice to say about these products after all these years.
With the announcement last week of the Enterprise Ethereum Alliance, it is timely to look at what is going on with blockchain technologies. The Alliance was formed to try to encourage a hybrid kind of blockchain, with both public and private aspects. Its members include cutting-edge startups along with established computer vendors such as Microsoft and major banks such as ING and Credit Suisse. As mentioned in this post by Tom Ding, a developer at String Labs, the Alliance could bring these disparate organizations together and find best-of-breed blockchain solutions that could benefit a variety of corporate development efforts.
When Bitcoin was invented, it was based on a very public blockchain database, one in which every transaction was open to anyone’s inspection. A public chain also allows anyone to create a new block, as long as they follow the protocol specs. But as blockchains have matured, enterprises have wanted something a bit more private, with better control over transactions for their own purposes and over who is trusted to make new blocks.
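As an illustration of that difference, here is a toy sketch in Python showing how a permissioned (private) chain restricts who may append blocks, while a public chain accepts anyone who follows the protocol. The validator names and the permission check are invented for this example and do not represent any real chain’s protocol:

```python
import hashlib
import json

# Hypothetical set of identities trusted to create blocks on a private chain.
TRUSTED_VALIDATORS = {"bank-a", "bank-b"}

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(chain: list, data: str, validator: str,
               permissioned: bool = True) -> dict:
    # On a public chain anyone may append; a private chain checks identity first.
    if permissioned and validator not in TRUSTED_VALIDATORS:
        raise PermissionError(f"{validator} is not allowed to create blocks")
    prev = block_hash(chain[-1]) if chain else "0" * 64
    block = {"prev": prev, "data": data, "validator": validator}
    chain.append(block)
    return block
```

Either way, each block commits to the hash of its predecessor, which is what makes the history tamper-evident; the only difference is the gate on who may extend it.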
This isn’t a mutually exclusive decision, and what is happening now is that many blockchain solutions use aspects from both public and private perspectives, as you can see from this infographic from Let’s Talk Payments.
You want the benefits of having multiple programmers hammering against an open source code base, with incentives for the blockchain community to improve the code, and the overall network effects as more people enter this ecosystem. You also gain efficiencies as the number of developers scales up, and perhaps future benefits from interoperability among the various blockchain implementations. At least, that is the theory espoused in a recent post on Medium here, where R Tyler Smith writes: “One thing that blockchains do extremely well is allow entities who do not trust one another to collaborate in a meaningful way.”
The Ethereum Alliance is just the latest milepost showing that blockchains are becoming potentially more useful for enterprise developers. Over the past year, several blockchain-as-a-service (BaaS) offerings have been introduced that make it easy to create your own blockchain with just a few clicks. Back in November 2015, Microsoft and ConsenSys built the first BaaS on top of Azure, and Microsoft now has several blockchain services available there. IBM followed in February 2016 with its own BaaS offering on Bluemix. IBM has a free starter plan that you can experiment with before you start spending serious money on its cloud implementations. Microsoft’s implementation is available through its Azure Marketplace; there is no additional charge for blockchain services beyond the cloud-based compute, network, and storage resources used.
IBM’s Bluemix isn’t the only place the vendor has been active in this area: the company has been instrumental in supporting open source blockchain code, with large commitments to the Linux Foundation’s Hyperledger project. Not to be left out, the Amazon Web Services marketplace offers two blockchain-related services. Finally, Deloitte has its own BaaS offering as part of its Toronto-based blockchain consulting practice.
If you want to get started with BaaS, here is just one of the numerous training videos available on the Microsoft Virtual Academy that covers the basics. There is also this informative white paper that goes into more detail about how to deploy the Microsoft version of BaaS. IBM also has an informative video on some of the security issues you should consider here. (reg. req.)
Many website operators have wrestled with the decision to move all their web infrastructure to support HTTPS protocols. The upside is obvious: better protection and a more secure pathway between browser and server. However, it isn’t all that easy to make the switch. In this piece that I wrote for IBM’s Security Intelligence blog, I bring up the case study of The Guardian’s website and what they did to make the transition. It took them more than a year and a lot of careful planning before they could fully support HTTPS.
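For flavor, here is a minimal sketch of the two server-side behaviors such a migration typically ends with: permanently redirecting plain-HTTP requests to their HTTPS equivalents, and sending an HSTS header so browsers stop trying HTTP at all. The host name and max-age value below are illustrative, not The Guardian’s actual configuration:

```python
def redirect_to_https(host: str, path: str) -> dict:
    """Build a permanent-redirect response for a plain-HTTP request."""
    return {
        "status": 301,
        "headers": {"Location": f"https://{host}{path}"},
    }

def hsts_headers(max_age: int = 31536000) -> dict:
    """Headers an HTTPS response should carry once the migration is done.

    max_age is in seconds; 31536000 is one year, a common final value
    that sites usually ramp up to gradually during a migration.
    """
    return {"Strict-Transport-Security": f"max-age={max_age}; includeSubDomains"}
```

The gradual ramp-up of max-age is one reason these migrations take so long: once browsers cache a long HSTS lifetime, any lingering HTTP-only content simply breaks, so every asset has to be HTTPS-ready first.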
You’ve probably seen your fill of Mirai-inspired headlines, but keep reading my article on HPE’s latest website. You’ll learn something essential to maintaining your overall IT security posture. I provide an overall timeline of events since last fall, show how Mirai was first detected, and explain what you should do to protect your enterprise infrastructure.