The intersection of art and technology with Thomas Struth

As I grow older, I tend to forget about events in my youth that shaped the person that I am now. I was reminded of this last week after seeing a Thomas Struth photography exhibit at the St. Louis Art Museum. Struth’s pictures are enlarged to mural size and depict the complex industrial environments of the modern age: repairing the Space Shuttle, a blowout preventer at an oil rig (shown here), the insides of physics and chemistry labs, the Disney Soarin’ hang glider simulation ride, and chip fabrication plants. Many of these are places that I have had the opportunity to visit over the years as a technology reporter.

The pictures reminded me of a part-time job that I had as an undergraduate student. My college had obtained a set of geometrical string models that were first constructed back in the 1830s and demonstrated conic sections, such as the intersection of a plane and a cone. Back then, we didn’t have Mathematica or color textbooks to show engineering students how to draw these things. These models were constructed out of strings threaded through moveable brass pieces that were attached to wooden bases, using lead weights to keep the strings taut.

The models were first built by the French mathematician Theodore Olivier and were used in undergraduate descriptive geometry courses up until 1900. I was one of the students who helped restore them. While the models look very nice now, back when I was a student they were in pretty bad shape: the wooden bases were cracked, the brass pieces were tarnished, and the strings were either tangled or missing. It took some effort to figure out what shapes they were trying to display and how to string them properly. Sometimes parts were missing, and I enlisted the help of the college machine shop and local auto body shops to figure out what to do. The best part of this job was that it came with its own private office, which was a nice perk when I needed to escape dorm life for a few quiet hours. After I graduated, the college put the finished models on display for everyone to see.
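Today, of course, you don't need brass, string, and lead weights to see those curves. As a purely illustrative sketch of what the models demonstrated (my own example, not anything from the college or the exhibit), a few lines of Python with numpy and matplotlib can slice a cone with a tilted plane and trace out the resulting ellipse:

```python
# Illustrative sketch: the intersection of a cone and a plane is an ellipse,
# the same figure Olivier's string models made tangible.
import numpy as np
import matplotlib.pyplot as plt

fig = plt.figure()
ax = fig.add_subplot(projection="3d")

# The cone z = sqrt(x^2 + y^2), drawn as a surface of revolution
theta = np.linspace(0, 2 * np.pi, 60)
r = np.linspace(0, 3, 30)
T, R = np.meshgrid(theta, r)
ax.plot_surface(R * np.cos(T), R * np.sin(T), R, alpha=0.3)

# A tilted cutting plane z = 0.5*x + 1
X, Y = np.meshgrid(np.linspace(-3, 3, 2), np.linspace(-3, 3, 2))
ax.plot_surface(X, Y, 0.5 * X + 1, alpha=0.3)

# Their intersection: equating the surfaces gives 0.75*x^2 - x + y^2 = 1,
# an ellipse centered at x = 2/3 with semi-axes 4/3 and sqrt(4/3).
t = np.linspace(0, 2 * np.pi, 200)
x = (2 / 3) + (4 / 3) * np.cos(t)
y = np.sqrt(4 / 3) * np.sin(t)
ax.plot(x, y, 0.5 * x + 1, linewidth=2)

plt.show()
```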

The intersection of art and technology has always been a part of me, and it was fun seeing Struth's work. It was great to see the details captured and the point of view expressed in these images, lit and composed to show their colors and construction. The photos also reminded me of the beauty of these advanced machines that we have built.

FIR B2B podcast #87: A LinkedIn Exec’s 2018 Sales and Marketing Predictions

We spoke to Justin Shriber, Vice President of Marketing for LinkedIn Sales and Marketing Solutions, to start off the new year. He put together a series of predictions for the year ahead, and in this discussion he explains them and the role that LinkedIn will play in advancing B2B sales and marketing in 2018.

Smart, quantitatively driven marketers will still be in high demand, but the pendulum will start to swing back toward marketers who have a qualitative eye for good stories.

Brands will re-evaluate the platforms on which they post their content, favoring those that have gone on record saying that user trust is a priority for them. Brands want platforms that give them control over affiliations and customer IDs, where they show up, and the audiences to which they’re exposed.

Sales will be the new awareness marketing channel. Sales used to be the direct connection to prospects, but we will see sellers start to build awareness through direct advocacy programs.

Marketing will gather more intelligence around the reach employees have on the sales side. There will be formal processes that make it possible for employees to easily decide what shareable content speaks to them so they can maintain their own individuality while also benefiting the company. 

LinkedIn Sales Navigator will play an increasing role in this unification of marketing and sales efforts. 

Sales and marketing alignment will continue to improve. Organizations need to engage both sales and marketing in a concerted way from awareness through conversion, rather than having marketing take the front end and sales the back end.

Listen to our 21-minute podcast here.

HPE Enterprise.nxt: How to protect hidden Windows services from attacks

Hijacking legitimate but obscure Windows services is a tough exploit to detect. Here are two lesser-known Windows services that could be vulnerable to malware attacks. You might think you can tell the difference between benign and malicious Windows services, but some of these services are pretty obscure. Do you know what ASLR and BITS are? Exactly.
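To give a flavor of why this is hard, here is a rough sketch of my own (not taken from the HPE article) that uses the psutil library to enumerate Windows services and flag any whose binary lives outside the usual system directories, one crude heuristic for spotting a service that doesn't belong:

```python
# A rough, illustrative heuristic (not from the HPE article): list Windows
# services whose binaries live outside the usual system directories.
# Plenty of legitimate services will trip this check too.
import psutil

EXPECTED_PREFIXES = (
    r"c:\windows\system32",
    r"c:\windows\syswow64",
)

def suspicious_services():
    for svc in psutil.win_service_iter():      # requires Windows and psutil
        try:
            info = svc.as_dict()
        except psutil.Error:
            continue                           # service vanished or access denied
        binpath = (info.get("binpath") or "").strip('"').lower()
        if binpath and not binpath.startswith(EXPECTED_PREFIXES):
            yield info["name"], info["display_name"], info["binpath"]

if __name__ == "__main__":
    for name, display, path in suspicious_services():
        print(f"{name:30} {display:40} {path}")
```

A real detection tool would go much further, of course, checking signatures, start types, and service DLLs, which is part of what makes these attacks hard to spot.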

You can read my latest article for HPE here.

Behind the scenes at a Red Cross shelter

A friend of mine, Dave Crocker, has been volunteering for Red Cross activities around the California fires and Houston floods over the past several months, and has been a volunteer with them for more than nine years. I thought it would be an interesting time to chat with him about his experiences and consider why the media is so often critical of the Red Cross.

Crocker was in Houston for two weeks, starting two weeks after the hurricane hit. He has been a shelter supervisor at both small and large operations and a dispatcher for daily, local disasters, and he helps out in other situations, both in the field and in their offices. Given his tenure as a volunteer, he has taken numerous Red Cross training classes, including learning to drive a forklift (although not that well, he ruefully notes).

The work is challenging on several levels. First are the 12-hour shifts, usually 7 to 7. Except they often don’t end exactly at 7:00, so your shift lasts 13 or 14 hours or more. If you are on a night shift, that can be even tougher. You get one day off per week, if you are lucky. You sleep wherever you can find a bunk, and sometimes that means you don’t exactly have five-star accommodations, or even one-star. “I’ve slept on a shelter’s army cots, but in Ventura I paid for my own accommodations and got a hotel room. I don’t sleep well on cots. Some of my fellow volunteers have slept in their cars or on the ground.”

He is very proud of his volunteer efforts, although he doesn’t carry any personal hubris in what he does. “First and foremost, it’s about helping our clients,” he told me in a recent phone call and over a series of emails and Facebook posts. “Self-praise almost never shows up in anyone’s behavior. The focus is the work.”

One of the things he learned from the recent series of disasters was to expand his definition of a “client”. Originally, he thought just the people displaced by the floods or fires were his clients, but other volunteers pointed out that the Red Cross ecosystem is much greater, including someone who donates items or funds to a relief effort. “The rest of the community is also our client, because they are also affected by the disaster and are compelled to be connected to it, by coming to the shelter to donate or by asking how can they help.”

One of the challenges is that these spontaneous donations can become overwhelming. In the Ventura County fires, Crocker experienced this first-hand. “We saw an enormous amount of donations of water, snacks, face masks, diapers, clothing, toys, and more. That was all brought to our shelters, and our warehouses quickly got filled. Processing all that requires a lot of staff. Historically, these donations have been turned away by the Red Cross, with a request to just send money. This has regularly produced word-of-mouth criticism of the Red Cross. This year, Red Cross policy changed and the rule is to say yes and then figure out how to make it work.” Crocker said that many tens of thousands of bottles of water were donated, as were items that had been ordered online, with enough showing up to fill a shipping container.

Running a large disaster response is sometimes compared to the logistics of running a military deployment. “Even the smallest shelter has an enormous amount of detail to it,” Crocker told me. “There is the whole setting-up of beds and linens, and then taking it all down, the ongoing cleaning of various items as clients leave and new ones register; then there is feeding three meals a day plus snacks. It is a massive logistics game and the situation is highly dynamic. Communication is challenging because you have to deal with a lot of noisy information. And equipment and geography can be difficult.”

Fires are unpredictable, especially when the wind changes, and that throws a wrench into your plans for who is affected and where to locate the shelters. The Ventura Fairgrounds shelter he worked at had roughly 250 clients, with a peak of about 500 before he arrived. The quality of available facilities is also highly variable. At Ventura, the shelter was in a building that is typically used for livestock shows. “We were in better shape in the wine country fires because we had use of a church with excellent kitchen and shower facilities that had been explicitly designed to be used as a shelter.” That church-based facility has hosted a disaster shelter 11 times in the last few years. In Houston, there were roughly 4,000 volunteers in the relief effort, divided amongst 25 different shelters.

The timing of the Ventura fires produced an unusual benefit for the shelter’s clients. Because the fires were around the holidays, a lot of corporate parties were canceled and as a result restaurants had surplus food that they repurposed as donations to feed the volunteers working the shelters.

One of the frustrations Crocker cites for himself and his colleagues is the negative press surrounding the response of the Red Cross volunteers to these disasters. “Sometimes the reporting focuses only on the negative, citing only one or another disgruntled person.” While certainly there are issues, for the most part he sees the relief efforts as run as well as they can be, given the complex and dynamic circumstances that any large effort like this will have. “Certainly, there are people who try to scam the system, something that I’ve seen in my limited volunteer efforts. But Red Cross policy is to err in the direction of helping rather than rejecting people who ask for assistance.”

“The work itself, and the privilege to do it, is what I enjoy, and being around people with a similar attitude, and getting the work done.” Crocker mentioned in one Facebook post that “everyone has had a collaborative tone” including Red Cross volunteers, employees and even clients, which could be because many clients have been displaced by multiple fires in past years. Note that more than 90% of Red Cross staffing is done by volunteers.

I highly recommend taking a moment and getting involved with your local Red Cross chapter. Give blood, give money, give your time. You will be working with a great group of people and for something very worthwhile.

My love affair with the phone central office

I have a thing about the telephone central office (CO). I love spotting them in the wild; they give me some sense of the vast connectedness they represent, the legal wrangling that took place over their real estate, and their place in the history of our telecommunications. That is a lot to pack into a series of structures, which is why I am attracted to them.

Many of you may not be familiar with the lowly CO, so I should probably back up and set the scene. Back in the day when all of us had landline phones, indeed, before we even called them such things, the phone company had to wire these phone lines to one central place in each community. A pair of low-voltage wires ran from your home to this central office, and were connected to a vast wire switching center called the main distribution frame. The central office actually supplied the dial tone that you heard when you picked up your phone, which assured you that your phone line was operating properly.

Interestingly, one of the inventors of the central office was a Hungarian named Tivadar Puskas, and I actually got to visit the fruits of his labors when I was in Budapest several years ago. The building that housed his switchboard is now offices occupied by the firm Prezi. Notice it still is quite a beautiful structure, with gothic touches and high ceilings.

I was reminded of visiting the Prezi offices when I was on a trip last week to Columbus, Indiana. I have been wanting to go there for several years, ever since first learning that the small city contains some of the best examples of modern architecture in one place. What does this have to do with phone COs? Well, they have a beautiful CO building that was designed in 1978, and it once looked like this.

Columbus’ CO was actually an expansion of an earlier and more modest building that was built in the 1930s. The phone company needed more room as the town grew and services were added, and so we have this space-frame structure with a glass curtain wall that was built around the original building. I would urge those of you interested in seeing other great modern structures to schedule your own trip to the town, because there is a lot to see.

But let’s get back to the phone CO. These buildings were at the center of activity back in the 1990s, when DSL technology (the broadband technology that the phone companies still use to deliver Internet services) was first becoming popular. At the time there were a number of independent startups, such as Covad and Rhythms, that wanted to provide DSL to private customers. The phone companies tried to block them, claiming that there wasn’t enough room in the COs to add their equipment. Back in the day, there weren’t many higher-speed data service offerings, so DSL was a very big improvement. This battle eventually was won, and while these startups have come and gone, we have AT&T’s U-verse service as a result. (I wrote about these DSL issues in an interesting historical document from that era.)

In 2001, I was teaching TCP/IP networking to a class of high school boys and wanted to take them on a series of field trips to noteworthy places and businesses near the school. Now, field trips are normal for younger kids, but when you get to high school, that means taking your kids out of other regularly scheduled classes. I thought this would be interesting to my students, and so one of the first trips we took was a short one, down the street to the local CO. This was in a very nondescript structure that many of us passed frequently without giving it any thought (see the photo here).

To give you additional context, this was about two months after the 9/11 attacks. Somehow I was able to convince the Verizon folks (which was what the local phone company was called at the time) to allow me to bring a bunch of high school kids into their CO. I recall we had old Bell veterans who gave us a tour, and when the time came to show the kids where their home phone lines were located, I volunteered to have them find my own wire pair. I remember the employee pulled up the location of my phone wires and told the class that I had something very unusual: back then I had an office phone that also was wired to my home, and he showed the students how this worked so that the phones rang in both locations.

I realize that nowadays the landline is a historical communications curiosity more than anything, and you have to look long and hard to find someone who still uses one. So it is great that there are a few phone COs that are grandly designed and stand out as great examples of architecture and design, such as the ones I have seen in Budapest and Columbus. Do send me your own favorites too, and best wishes for a great 2018.

Gregory FCA newsletter: How to get your annual year-end security reports noticed and read

It’s about as regular as hearing Auld Lang Syne on New Year’s Eve: The annual year-end security report issued by companies big and small looking to create awareness and build relationships. Our inboxes were flooded with dozens of them. In this newsletter that I co-authored with Greg Matusky and Mike Lizun, we look at some of the best and worst features of these annual reports and give our opinions. Hopefully you can use our findings to improve your own reports this time next year, and learn from the best and avoid the biggest mistakes.

The Scintillating Standouts of 2017!

Some of the more unusual reports are the ones that really caught our eyes.

Kaspersky’s “story of the year” takes the typical annual year-end report and transforms it into a cyber-security news story similar to People’s Person of the Year. Written in layman’s terms with an accompanying infographic, Kaspersky’s Story of the Year reworks the tired ransomware story into a can’t-not-read compendium on all things ransomware. And it’s understandable! The first line reads like the opening of a movie rather than a technical rehash. Consider, “In 2017, the ransomware threat suddenly and spectacularly evolved. Three unprecedented outbreaks transformed the landscape for ransomware, probably forever.”

Kaspersky then takes it one step further by producing “The Number of the Year,” based on the number of malicious files its networks have seen transit its sensors. Our co-author David Strom calls it gimmicky, and maybe it is from his journalistic perch. But from a strictly PR perspective, the ability to distill a finding down to a single number (and one drawn from data at their ready disposal) is a brilliant PR take, and they are to be congratulated.

What about your organization? Do you have available internal data that could add PR gravitas to your next report? Might be something to consider.

Another take comes from ServiceNow. They opted to deliver their security predictions in a short-and-sweet format, one that takes less than three minutes to read. Their conclusions are compelling without overselling. For instance, they suggest that 2018 will see the emergence of security haves and have-nots: those that have automated detection and response and those that don’t. Guess who sells such a solution? Still, they keep the sell to a minimum.

Watchguard uses their blog to make a series of predictions in a very attractive and still informative way. There are predictions about IoT botnets, a doubling of Linux-based attacks, what will happen to multi-factor authentication, and the state of election and voter hacking. Each prediction takes the form of a short video with high production values.

With all the news about Uber’s mistakes over the past year, here is a cogent analysis by Dark Reading of what Uber did wrong with its breach response: delayed notification, failure to implement stronger access controls, unclear approval workflows, storing access credentials in GitHub, and failing to compartmentalize data access. This analysis was a neat package that we wish others would emulate.

This report, which appeared in IBM’s Security Intelligence blog, is another rarity. It does what few of these year-end surveys actually do: it looks back a year and then scores its own predictions. The author looked at the threats posed by IoT, the rise of cybercrime-as-a-service, and the threats against brand reputations, and concluded he was a bit ahead of the curve on some trends. We wish we would see more of these “truth telling” evaluation-type pieces.

Those were our top picks. But there are plenty of other year-end reports, most choosing one of three paths: presenting the results of a survey, focusing on a particular vertical market, or summarizing what telemetry they have collected from sensors located at major internet peering points or at their customers.

All in the Numbers: The Best of the Survey-Based Reports

Let’s look at the two best survey posts.

“The State of Open Source Security” touches on both telemetry and survey methods. It presents the results of a survey of 500 open-source users, combined with internal data from Snyk and scans of various GitHub repositories. Sadly, almost half of the code maintainers never audit their code, and less than 17 percent feel they have high security knowledge. Code vulnerabilities are on the rise for open-source projects but not for Red Hat Linux, which is an interesting factoid that isn’t often mentioned.

Beyond Trust’s report has a series of 18 predictions, most of which are obvious (bigger targets will fall, mobile spam on the rise, games can double as malware). A few are interesting, and what sets this report apart is a look ahead to five years from now when GDPR becomes untenable, online elections become secure, and the end of cash arrives.

Customer Telemetry-Based Reports Work Well Also

McAfee’s annual threat predictions have some interesting insights and cover some non-obvious subjects, including describing the machine learning arms race, the opportunities for serverless attackers, and the ways that home automation vendors will misuse your personal data.

Fortinet is another one of those companies that runs a massive protection network and can cull trends from its customers. Their quarterly threat report identified 185 zero-day vulnerabilities, with the average customer experiencing more than 150 attacks over the quarter and unknowingly running an average of two botnets inside their networks. Like other security researchers, they talk about the delay in patching known exploits and how lousy most of their customers are at getting at the root causes of infections.

Then there are Bitdefender’s insights into the past year’s threats, based on data from their own global sensor network and from their customers. Ransomware is still king, with one in every six spam emails including some kind of ransomware attack vector. Also on the rise this past year are cryptocurrency miner malware, polymorphic attacks, and Android-based Trojans.

Dashlane’s report on the worst passwords of the year is entertaining, if a bit predictable. While it breaks all the rules about these year-in-review articles, it works. Yes, it is subjective, it is somewhat self-serving (Dashlane sells a password manager), and it covers familiar ground. But it is very amusing, and it shows that sometimes you can deliver old chestnuts in interesting ways.

Slicing and Dicing Vertical Markets in Reports 

Some vendors have taken a different tack and written year-end reports that examine specific verticals. This is what eSentire has done with the healthcare industry. Rather than just positing the “chicken little” scenario, it provides specific case studies of security weaknesses in various enterprises that, of course, were eSentire customers and had malware discovered on their networks. They conclude by saying that well-known exploits have been out for years and yet still aren’t patched. Yes, it is self-serving, but it is also instructive.

Another way to slice things is to just focus on bitcoin exploits, which have been increasing as its value rises. Incapsula looked at exploits across its own network and found three out of four bitcoin sites were attacked and a third of the network attacks were persistent attacks. Hong Kong was the most targeted country for bitcoin-based network layer assaults in Q3 2017, largely because of a persistent attack on a local hosting service that was hit hundreds of times throughout the quarter.

Another example is this report looking at mobile threats by RiskIQ. They used telemetry from their network of more than 120 different app stores and billions of endpoints. This is a rich source of exploits and a growing threat. It highlights the unsurprising trend toward using phony rave reviews to prop up a malicious app. It also reviews the collaborative takedown of the WireX botnet earlier this fall.

What to Avoid in Your Annual Report 

Finally, no compendium would be complete without mentioning some examples of what to avoid. As we mentioned in an earlier newsletter, having small survey sample sizes is never a good idea, and this report that Holger Schulze put together for Alienvault, based on interviews with 500 people, is one to avoid. While it has numerous graphics that can be used in blog posts, it contains mostly subjective content.

Also to be avoided: reports that don’t say anything new, such as this report from Wandera on WiFi risks, or this report on security trends from Cipher. A corollary to this is to avoid predictions that are more self-serving or self-promotional, such as these from Axiomatics.

Another issue: checking your facts. In November, an organization called the Information Technology and Innovation Foundation posted a supposedly detailed review of the security compliance of hundreds of the more popular U.S. government websites. Sadly, the facts weren’t correct, and webmasters responded with complaints and corrections.

Don’t do what NordVPN and eSentire did. Both of their PR firms sent out predictions for 2018 in email messages, and neither of them posted any of this content online. That isn’t helpful, especially in a world where you want to cite a URL for any predictions-related materials.

Then there is this encyclopedic listing from our colleagues at MSSP Alert of dozens of predictions, culled from various security management vendors. We dare you to read through the entire list, which spans multiple pages. Sometimes less is more!

Finally, here is a somewhat different twist on the predictions route. Varonis put together a post that contained quotes from a series of podcasts. It was a good try, and a terrific example of repurposing content. But it held little value for discerning audiences that would want more context in their analysis.

iBoss blog: A Review of the Notable Vulnerabilities of 2017

This past year has seen its usual collection of exploits, vulnerabilities, attacks and data leaks. But let’s take a look back and see if we can learn a few lessons from the passage of time. Certainly, this year seems to have been a watershed in terms of major ransomware attacks. With Locky, Petya, Mirai, WannaCry, and BadRabbit, we haven’t had much time between attacks to bounce back, and the attacks are getting bigger, more intrusive, and more targeted.

For this and other megatrends, I offer up some suggestions for security managers too. There is more in my iBoss blog post this week.

Lessons learned from the Minitel era

Technologists tend to forget the lessons learned from the immediate past, thinking that new tech is always better and more advanced than those dusty modems of yesteryear. That is why a new book from MIT Press on Minitel is so instructive and so current, especially as we devolve from a net-neutral world in the weeks and years to come. Much as I am tempted to discuss net neutrality, let’s just leave those issues aside and look at the history of Minitel and what we can learn from its era.

Minitel? That French videotext terminal thing from the 1980s and 1990s? Didn’t that die an ignominious death from the Internet? Yes, that is all true. But for its day, it was ahead of its time and ahead of today’s Internet in some aspects too. You’ll see what I mean when you consider its content and micropayments, network infrastructure, and its hybrid public/private ownership model. Let’s dive in.

Minitel was the first system to figure out third-party payments: its Kiosk system made it easier for content providers to get paid for their work, and it laid the foundation for the Apple App Store and others of its ilk. It presaged the rise of net porn well before various Internet newsgroups and websites gained popularity, and what was remarkable was that people paid money for this content too. It was the first time a decentralized network could hook up a variety of public clients and servers of different types. Granted, the clients were 1200 bps terminals and the network was X.25, but still this was being done before anyone had even thought of the Web.

It was also the first public/private tech partnership of any great size: millions of ordinary citizens had terminals (granted, they got them free of charge) well before AOL sent out its first CD and before the first private dot-coms were registered. The authors call this “private innovation decentralized to the edges of the network.” This is different from what the Internet basically did beginning in the mid-1990s, which was to privatize the network core. Before then, the Internet was still the province of the US government and had limited private access.

Minitel made possible a whole series of innovations well before their Internet equivalents caught on, sometimes decades earlier. The book describes a whole series of them, including e-government access, ecommerce, online dating, online grocery ordering, emojis and online slang, electronic event ticketing, and electronic banking. When you realize that at its peak Minitel had 25,000 services running, something that the Web wouldn’t reach until 1995, it is a significant accomplishment.

Minitel wasn’t all rainbows and unicorns. Like AOL, it took a “walled garden” approach, although in some ways it was more open than today’s Internet, as I will get to in a moment. It also had issues stemming from being controlled by a nationalized phone company.

Certainly, the all-IP Internet was a big improvement over Minitel. You didn’t have to provision those screwy and expensive X.25 circuits. You could send real graphics, not the cartoon ones that videotext terminals used, which were more like ASCII art. Minitel was priced by the minute, because that is what the phone company knew how to do. Certainly, the early days of the Internet had plenty of 1200 bps modem users who had to pay per call, or set up a separate phone line for their modems. Now we at least don’t have to deal with that, thanks to broadband networks that are thousands of times faster.

One side note on network speeds: Minitel actually had two speeds: 1200 and 75 bps. Most of the time, the circuits were set up 1200/75 down/up. You could send a signal to switch the speeds if you were sending more than you were receiving, but that had to happen under app control.

So what can we learn from Minitel going into the future? While most of us think of Minitel as a quaint historical curio that belongs next to the Instamatic camera and the Watt steam engine, it was far ahead of its time. Minitel was also a cash infusion that enabled France to modernize and digitize its aging phone infrastructure. It was the first nationalized online environment, available to everyone in France. It proved that a state subsidy could foster innovation, as long as that subsidy was applied surgically and with care.  As the authors state, “sometimes complete control of network infrastructure by the private sector stifles rather than supports creativity and innovation.”

When we compare Minitel to today’s online world, we can see that the concept of open systems is a multi-dimensional continuum, and it is hard to judge whether Minitel or the Internet is the more open of the two. As we begin the migration of a neutral Internet infrastructure to one that will be controlled by the content providers, we should keep that in mind. The companies that control the content have different motivations from the users who consume that content. I do think we will see a vastly different Internet in 30 years’ time, just as the Internet of 1987 is very different from the one we all use today.

HPE blog: The changing perception of open source in enterprise IT

Once upon a time, when someone in IT wanted to make use of open source software, it was usually an off-the-books project that didn’t require much in the way of management buy-in. Costs were minimal, projects often were smaller with a couple of people in a single department, and it was easy to grasp what a particular open source project provided. Back then, IT primarily used open source to save money and “do more with less,” letting the department forgo the cost of commercial software.

Times have certainly changed. Yes, software costs are still a factor, and while it is generally true that open source can save money, it isn’t the only reason nowadays to adopt it. While application deployment costs have risen, the direct software cost is a small part of the overall development budget, often dwarfed by infrastructure, scalability, and reliability measures.

As a result, today’s open source efforts aren’t anything like those in earlier days.

You can read the full story on HPE’s blog here.

Understanding how to become an effective digital change agent

As technologists, we tend to get caught up in the computer side of things when it comes to trying to get stuff done in our organizations. So often we forget that the real drivers of change are the people behind the screens. In new research that my colleague Brian Solis has just published, he documents exactly how enterprise digital transformation happens, and talks directly to some of those “change agents” that he has known for decades as an analyst covering the IT scene. His manifesto is available now for downloading and reading; I strongly suggest doing so. (Other than registering your email, it is free of charge.)

With most organizations, “these digital transformation efforts often take place in isolated pockets, sometimes with little coordination and collaboration across the enterprise,” he writes. Often it is a solitary individual who drives the change and the introduction of particular digital technologies and methods at a grassroots level, and who often fails to take them further across the enterprise. His manifesto puts together a solid ten-point plan (shown here) if you want to be more effective in bringing this about at your company. This includes embracing yourself as a catalyst, obtaining leadership support, creating a roadmap, and democratizing idea creation. Some of these are obvious, some aren’t.

He says that “digital transformation is more of a people problem than a business problem. Trust is the least measurable but most important factor to build.” Without this trust, your colleagues can sabotage or block your efforts. One of the biggest obstacles in building trust is in managing your own ego as a change agent. When you display too much ego, you make the change all about you, rather than the benefit to your company. The same is true when managing your colleagues’ egos too.

On the other side of this is managing your own doubts about what you are trying to do. “Although it may seem counterintuitive to manage detractors, change agents ought to listen closely to their feedback. It is better to let them voice their concerns than to let them detract in secret.” Indeed, listening is often overlooked when advocating change. The better listener you are, the more you’ll get done. 

Solis mentions that when a change agent has the full buy-in of the executive suite, real change becomes possible and turns from a suggestion to a corporate mandate.

“Digital Darwinism is increasingly becoming either a threat or an opportunity based on how organizations react to change,” he says in his report. Digital change agents can become the next generation of leaders and be instrumental in helping their companies compete more effectively in this digital economy.