SiliconANGLE: The changing economics of open-source software

The world of open-source software is about to go through another tectonic change. But unlike earlier shifts brought about by corporate acquisitions, this one is driven by the growing wave of tech layoffs. The layoffs will certainly change the balance of power between large and small software vendors and between free and commercial software versions, and they could also change the role that OSS plays in enterprise software applications.

In this post for SiliconANGLE, I talk about why these changes are important and what enterprise software managers should take away from the situation.


Infoworld: What app developers need to do now to fight Log4j exploits

Earlier this month, security researchers uncovered a series of major vulnerabilities in Log4j, Java logging software that is used in tens of thousands of web applications. The code is widely used across consumer and enterprise systems, in everything from Minecraft, Steam, and iCloud to Fortinet and Red Hat systems. One analyst estimates that millions of endpoints could be at risk.

There are at least four major vulnerabilities stemming from Log4j exploits. What is clear is that as an application developer, you have a lot of work to do to find, fix, and prevent Log4j issues in the near term, and a few things to worry about in the longer term.
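
To make the "find" step concrete, here is a minimal sketch of what that inventory work looks like: a script that walks a filesystem and flags copies of Log4j by their JAR filenames. It is only an illustration, since real scanners also dig into JARs nested inside WAR and EAR files, and you should lean on your dependency manager's reports as well.

```python
# Minimal sketch: walk a directory tree, find log4j-core JARs, and flag
# versions known to be exposed to CVE-2021-44228 (2.0 through 2.14.x).
# Real scanners also inspect JARs nested inside WAR/EAR files.
import os
import re
import sys

LOG4J_JAR = re.compile(r"log4j-core-(\d+)\.(\d+)(?:\.\d+)?\.jar$")

def scan(root: str) -> None:
    for dirpath, _subdirs, files in os.walk(root):
        for name in files:
            match = LOG4J_JAR.match(name)
            if not match:
                continue
            major, minor = int(match.group(1)), int(match.group(2))
            path = os.path.join(dirpath, name)
            if major == 2 and minor <= 14:
                print(f"VULNERABLE (upgrade to 2.17.x or later): {path}")
            else:
                print(f"Found Log4j, check current advisories: {path}")

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else ".")
```

The inventory-first approach also addresses the longer-term worry: you cannot patch what you do not know you are running.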

You can read my analysis and suggested strategies in Infoworld here.

Avast blog: How will advertisers respond to Apple’s latest privacy changes?

Last week, we described the privacy changes happening within Apple’s iOS 14.5. Now, in this post, we present the advertiser’s perspective on the situation. While advertisers may think the sky is falling, the full-on Chicken Little scenario might not come to pass. The changes will make it harder – but not impossible – for advertisers to track users’ habits and target ads to their devices. And as I mention in my latest blog for Avast here, digital ad vendors need to learn new ways to target their campaigns. They have done it before, and hopefully the changes in iOS will eventually be good for everyone.


Picking the right tech isn’t always about the specs

I have been working in tech for close to 40 years, yet it took me until this week to realize an important truth: we have too many choices and too much tech in our lives, both personal and professional. So much of the challenge with tech is picking the right product, then discovering afterwards the key limitations of our choice and living with the consequences. I guess I shouldn’t complain; after all, I have made a great career out of figuring this stuff out.

But it really is a duh! moment. I don’t know why it has taken me so long to come to this brilliant deduction. I am not complaining; it is nice to help others figure out how to make these choices. Almost every day I am writing about, researching, or discussing tech choices for others. But like the shoemaker’s barefoot children, my own tech choices are often fraught with plenty of indecision, or worse yet, no decision. It is almost laughable.

I was on a phone call yesterday with a friend of mine who is as technical as they come: he helped create some of the Net’s early protocols. We were commiserating about how quirky Webex is when you try to support a conference call with several hundred participants. Yes, Webex is fine for the actual video conference itself; the video and audio quality are both generally solid. But it is all the “soft” support that rests on the foibles of how we humans apply the tech: running the practice session before the conference, notifying everyone about the call, distributing the slide deck under discussion, and so forth. These things require real work: explaining to the call’s organizers what to do and creating standards to make the call go smoothly. It isn’t the tech per se – it is how we apply it.

Let me draw a line from that discussion to an earlier moment, when I worked in the bowels of the end-user IT support department of the Gigantic Insurance company in the early 1980s. We were buying PCs by the truckload, quite literally, to place on the desks of the several thousand IT staffers who until then had a telephone and, if they were lucky, a mainframe terminal. Of course we were buying IBM PCs; there was no actual discussion, because back then that was the only choice for corporate America. Then Compaq came along and built something IBM didn’t yet have: a “portable” PC. The quotes are there because this thing was enormous: it weighed about 30 pounds and was an inch too big to fit in the overhead bins of most planes.

As soon as Compaq announced this unit (which sold for more than $5000 back then), our executives were conflicted. Our IBM sales reps, who had invested many man-years in golf games with them, tried to convince them to wait the year it would take for IBM’s own portable PC to come to market. But once we got our hands on an IBM prototype, we could see that the Compaq was the superior machine: it was already available, it was lighter and smaller, it ran the same apps, and it had a compatible version of DOS. We gave Compaq our recommendation and started buying them in droves. That was the beginning of what came to be called the clone wars, unleashing a new era of technology choices on the corporate world. By the time IBM finally came out with its portable, Compaq had already put hard drives in its model, staying ahead of IBM on features.

My point in recounting this quaint history lesson is that something hasn’t changed in nearly 40 years: tech reviews tend to focus on the wrong things, which is why we get frustrated when we finally decide on a piece of tech and then have to live with the consequences.

Some of our choices seem easy: who wants to pay a thousand bucks for a stand to sit your monitor on? Of course, some things haven’t changed: the new Macs also sell for more than $5000. That is progress, I guess.

My moral for today: look beyond the specs and understand how you are eventually going to use the tech you intend to buy. You may choose differently.

Blogger in residence at Citrix Synergy conference

This is my second time at Citrix Synergy, the company’s major annual conference, and I will be posting regularly during and after the show. My first piece can be found here and covers what I heard from the new management team at Citrix as they introduced their vision for the future of the company and the future of work. “Work is no longer a place you go, it is an activity and digital natives expect their workplace to be virtual and follow them wherever they go. They are pushing the boundaries of how they work,” said Citrix CEO Kirill Tatarinov.

My second post is on Windows Continuum, which puts Windows 10 functionality on a range of non-traditional IT devices, such as the gigantic Surface Hub TV, Xbox consoles, and Windows Phones. If you only review the information provided by Microsoft, you might get the wrong idea of how useful this could be for the enterprise; in my post I discuss what Citrix is doing to embrace and extend this interface.

My next piece looks at several infosec products exhibited at the show, including solutions from Bitdefender, Kaspersky, IGEL and Veridium. Security has been a big focus this year, and I am glad to see these vendors here supporting Citrix products.

Speaking of security, one of the more important product announcements this week at Synergy was that Secure Browser Essentials will be available later this year on the Azure Marketplace. This is actually the second secure browsing product that Citrix has announced, and you can read my analysis of how the two differ and what to consider if you are looking for such a product.

And here is a story about the Okada Manila Resort, which was featured as a semi-finalist for the innovation award at the show. Built on a huge site, it is similar to the resort-style properties found in Las Vegas and Macau, and when fully built it will house 2,300 guest rooms and employ 10,000 people. I spoke with the resort’s IT manager about how he built his infrastructure and some of the hard decisions he had to make. His department has at least 100 full-time staffers (plus contractors) to support 2,000 endpoints and numerous physical and virtual servers in two separate datacenters on the property.

At his Synergy keynote, Citrix CEO Kirill Tatarinov mentioned that IT “needs a software defined perimeter (SDP) that helps us manage our mission critical assets and enable people to work the way they want to.” The concept is not a new one, having been around for several years: an SDP replaces the traditional network perimeter, usually thought of as a firewall, with access controls organized around users and the specific applications they need. I talk about what an SDP is and what Citrix is doing here.
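
To give a flavor of the idea, here is a toy sketch (mine, not Citrix’s implementation) of the deny-by-default model behind an SDP: a client authenticates to a controller first, and only then does a gateway build a tunnel to one specific application, so unauthorized callers never even see an open port.

```python
# Toy sketch of the SDP "authenticate first, connect second" model.
# All names here are hypothetical; a real SDP uses mutual TLS,
# single-packet authorization, and separate controller/gateway services.

AUTHORIZED = {
    "alice": {"payroll", "crm"},   # user -> applications they may reach
}

class Controller:
    """Stand-in for real identity checks (MFA, device posture, etc.)."""
    def authorize(self, user: str, token: str, app: str) -> bool:
        return token == "valid-token" and app in AUTHORIZED.get(user, set())

class Gateway:
    """Deny-by-default: nothing answers until the controller approves."""
    def __init__(self, controller: Controller):
        self.controller = controller

    def connect(self, user: str, token: str, app: str) -> str:
        if self.controller.authorize(user, token, app):
            return f"tunnel established: {user} -> {app}"
        return "dropped (no response; the app stays dark)"

gw = Gateway(Controller())
print(gw.connect("alice", "valid-token", "payroll"))     # tunnel established
print(gw.connect("mallory", "stolen-token", "payroll"))  # dropped
```

Note the contrast with a firewall: access is granted per user and per application, not per network segment.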

Finally, this piece is about the Red Bull Racing team and how they use various Citrix tech to power their infrastructure. Few businesses create a completely different product every couple of weeks, let alone take their product team on the road and set up a completely new IT system on the fly. Yet this is what the team at Red Bull Racing does each and every day.

Looking back at 25 years of the world wide web

This week the web celebrated yet another of its 25th birthdays, and boy does that make me feel old. Like many other great inventions, it has several key dates along the way in its origin story. For example, here is a copy of the original email that Tim Berners-Lee sent back in August 1991, along with an explanation of the context of that message. Steven Vaughan-Nichols has a nice walk down memory lane over at ZDNet here.

Back in 1995, I had some interesting thoughts about those early days of the web as well. This column draws on one that I wrote then, with some present-day updates.

I’ve often said that web technology is a lot like radio in the 1920s: station owners are not too sure who is really listening but they are signing up advertisers like crazy, programmers are still feeling around for the best broadcast mechanisms, and fast-changing standards make for lots of shaky technology that barely works on the best of days. Movies, in this metaphor, are the equivalent of Java and audio applets and other non-textual additions.

So far, I think the best metaphor for the web is that of a book: something that you’d like to have as a reference source, entertaining when you need it, portable (well, if you tote around a laptop), and so full of information that you would rather leave it on your shelf.

Back in 1995, I was reminded of the so-called “electronic books” that were then a big deal. One of my favorites was a 1993 book/disk package called The Electronic Word by Richard Lanham, which, ironically, is about how computers have changed the face of written communications. The book is my favorite counter-example to online books. Lanham is an English professor at UCLA, and the book comes with a HyperCard stack that shows both the power of print and how unsatisfactory reading on screen can be. In the HyperCard version, Prof. Lanham takes you through some of the editing process, showing before-and-after passages from the print edition.

But we don’t all want to read stuff online, especially dry academic works that contain transcripts of speeches. That is an important design point for webmasters to consider. Many websites are full reference works, and even if we had faster connections to the Internet, we still wouldn’t want to view all that stuff on screen. Send me a book, or some paper, instead.

Speaking of eliminating paper, in my column I took a look at what Byte magazine was trying to do with its Virtual Press Room. (The link will take you to a 1996 archive copy, where you can see the beginnings of what others would do later on. As with so many other things, Byte was far ahead of the curve.)

Byte back then had an intriguing idea: have vendors send their press releases electronically, so editors wouldn’t have to plow through printed ones. But why not go a step further in the interest of saving trees: send in links to the vendor’s own website along with whatever keywords are needed. Years later, I am still asking vendors for key information from their press releases. Some things don’t change, no matter what the technology.

What separates good books from bad is good indexing and a great table of contents. We use both in books to find our way around: the former more for reference, the latter more for determining interest and where to enter a book’s pages. So how many websites have you visited lately that have either, let alone have done a reasonable job on both? Not many back in 1995, and not many today, sadly. Some things don’t change.

Today we almost take it for granted that numerous enterprise software products have web front-end interfaces, not to mention all the SaaS products that speak native HTML. But back in the mid-1990s, vendors were still struggling with the web interface and trying it out. Cisco had its UniverCD (shown here), which was part CD-ROM and part website: the CD came with a copy of the Mosaic browser so you could look up the latest router firmware and download it online. When I saw this back in the day, I called it a brilliant use of the two interfaces. Novell (ah, remember them?) had its Market Messenger CD-ROM, which also combined the two. There were lots of other book/CD combo packages back then, including Frontier’s Cybersearch product, which put the entire Lycos catalog (Lycos being a precursor of Google) on CD along with a browser and on-ramp tools. Imagine putting the entire index of the Internet on a single CD. Of course it would be instantly out of date, but you can’t fault them for trying.

The reason vendors combined CDs with the web was that bandwidth was precious, and sending images down a dial-up line was painful. (Remember that the first web browser, shown at the top of this column, was text-only.) If you could offload those images onto a CD, you could have the best of both worlds. At the time, I said that if we wanted to watch movies, we would go to Blockbuster and rent one. Remember Blockbuster? Now we get annoyed if our favorite flick isn’t immediately available to stream online.

Yes, the web has come a long way since its invention, no matter which date you choose to celebrate. It has been an amazing time to be around and watch its progress, and I count myself lucky that I can use its technology to preserve many of the things that others and I have written about it.

Fast Track blog: The benefits of being in a hackathon

With the number of coding-for-cash contests, popularly called hackathons, exploding, now might be the time to consider spending part of your weekend or an evening participating, even if you aren’t a hardcore coder. Indeed, if you are one of the growing number of citizen developers, you might be more valuable to your team than someone who can spew out tons of Ruby or Perl scripts on demand. I spoke to several hackathon participants at the QuickBase EMPOWER user conference last month to get their perspective. You can read my post in QuickBase’s Fast Track blog today.

Using citizen science to hunt for new planets

When I was growing up, one of my childhood heroes was Clyde Tombaugh, the astronomer who discovered Pluto. We have since demoted Pluto from its planetary status, but it still was a pretty cool thing to be someone who discovered a planet-like object. Today, you have the opportunity to find a new planet yourself, and you don’t need a telescope or lonely cold nights at some mountaintop observatory. It is all thanks to an aging NASA spacecraft and the way the Internet has transformed the roles of public and private science research.

Let’s start at the beginning, seven years ago, when the Kepler spacecraft was launched. It was designed to take pictures of a very small patch of space that had the most likely conditions for finding planets orbiting far-away stars. (See above.) By closely scrutinizing this star field, the project managers hoped to find dips in the light emitted by stars as planets passed in front of them. Watching for the motions of small, distant bodies has a long pedigree: it is how Galileo discovered Jupiter’s moons back in 1610. When you think about the great distances involved, it is pretty amazing that we have the technology to do this.

Since its launch, key parts of the spacecraft have failed, but researchers have figured out how to keep it running, using the gentle pressure of sunlight to keep the cameras properly aligned. As a result, Kepler has been collecting massive amounts of data and faithfully downloading the images over the years, and more than 1,000 Earth-class (or M-class, from Star Trek) planets have already been identified. There are probably billions more out there.

NASA has extended Kepler’s mission as long as it can, and part of that extension was to establish an archive of the Kepler data that anyone can examine. This effort, called Planethunters.org, is where the search for planets gets interesting. NASA and various other researchers, notably from Chicago’s Adler Planetarium and Yale University, have enlisted hundreds of thousands of volunteers from around the world to look for more planets. You don’t need a physics degree, a sophisticated computer, or any Big Data algorithms. All you need is a keen mind, the eyesight to pore over the data, and the motivation to try to spot a sequence that would indicate a potential planetary object.
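
To see what those volunteers are scanning for, here is a small sketch of my own, using made-up numbers: a star’s brightness holds steady except for brief, periodic dips as a planet crosses its face. In clean synthetic data even a crude threshold flags the candidates; real Kepler light curves are far noisier, which is exactly why human pattern recognition still helps.

```python
# Synthetic transit light curve: flat brightness with periodic dips.
# Planet Hunters volunteers eyeball real Kepler data for this pattern.
import numpy as np

rng = np.random.default_rng(42)
n = 1000                                   # number of observations
flux = 1.0 + rng.normal(0, 0.001, n)       # normalized brightness + noise

period, duration, depth = 200, 8, 0.01     # hypothetical transit parameters
for start in range(50, n, period):
    flux[start:start + duration] -= depth  # planet blocks ~1% of the light

# Crude detection: flag points well below a noise-only baseline.
baseline = flux[:40]                       # the first dip starts later
threshold = np.median(flux) - 5 * np.std(baseline)
dips = np.where(flux < threshold)[0]
print("candidate transit points:", dips)
```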

What is fascinating to me is how this crowd-based effort has been complementary to what has already happened with the Kepler database. NASA admits that it needs help from humans. As they state online, “We think there will be planets which can only be found via the innate human ability for pattern recognition. At Planet Hunters we are enlisting the public’s help to inspect the Kepler [data] and find these planets missed by automated detection algorithms.”   

Think about that for a moment. We can harness the seemingly infinite computing power available in the cloud, but it isn’t enough. We still need carbon-based eyeballs to figure this stuff out.

Planet Hunters is just one of several projects hosted on Zooniverse.org, a site devoted to dozens of crowdsourced “citizen science” efforts that run the gamut of research. Think of what Amazon’s Mechanical Turk does by parceling out pieces of data for humans to classify and interpret; here, instead of helping some corporation, you are working together on a research project. And it isn’t just science research: there is a project to help transcribe notes from Shakespeare’s contemporaries, another to explore WWI soldiers’ diaries, and one to identify animals captured by webcams in Gorongosa National Park in Mozambique. Many of the most interesting discoveries from these projects have come from discussions between volunteers and researchers. That is another notable aspect: in the past, you needed at least a PhD or some kind of academic street cred to get involved with this level of research. Now anyone with a web browser can join in, and thousands have signed up.

Finally, the Zooniverse efforts are paying another unexpected dividend: participants are doing more than looking for the proverbial needle in the haystack. They are learning about science by doing actual science research, which takes something dry and academic and makes it live and exciting. And the appeal isn’t just to adults but to kids too: one blog post on the site showed how nine-year-old Czech kids got involved in one project. That to me is probably the best reason to praise the Zooniverse efforts.

So far, the Planet Hunters are actually finding planets: more than a dozen scientific papers have already been published, thanks to these volunteers around the world on the lookout. I wish I could have had this kind of access when I was a kid, but I have no doubt that Tombaugh would be among these searchers, had he lived to see all this happening.

Veracode blog: Why is SQL injection still around?

While there are many web hacking exploits, none is as simple or as potentially destructive as SQL injection. This isn’t news: the attack method has been around for more than a decade. Sadly, for something so old, it is still one of the most popular ways to penetrate networks and extract data. And it is easy to find and almost as easy to avoid. Why is SQL injection still with us? It all comes down to a lack of understanding of how SQLi vulnerabilities work.
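
By way of illustration (my own example, not one from the Veracode post), the whole problem and its fix fit in a dozen lines: building a query by gluing strings together lets attacker input rewrite the SQL, while a parameterized query treats that same input as plain data.

```python
# SQL injection in one picture: string concatenation vs. parameterization.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

user_input = "nobody' OR '1'='1"   # a classic injection payload

# VULNERABLE: the attacker's quote characters become part of the SQL.
query = "SELECT secret FROM users WHERE name = '" + user_input + "'"
print(conn.execute(query).fetchall())   # leaks every row: [('hunter2',)]

# SAFE: the driver binds the value; the payload is just an odd name.
safe = "SELECT secret FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())   # []
```

Every mainstream database driver supports parameter binding like this, which is what makes SQLi’s persistence so frustrating.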

You can read my post in Veracode’s blog here.

The cashless customer is now king

I wanted to bring my winter coat to the cleaners (maybe optimistically a week or so too soon), and in cleaning out the various pockets I came across some cash and a receipt dated last December. It made me think about how long it has been since I actually used cash.

What a difference from my dad’s world. As a comptroller, my dad dealt with millions of dollars every day and always carried a wad of cash worthy of a mafia don. I still have his money clip somewhere. I put the few bills on my desk as a reminder and thought about how the world has changed. Paying in cash is certainly becoming less common.

Most of my customers still pay me with paper or electronic checks, a few go through PayPal, and every once in a while I get asked to accept credit cards. Now there are many options for accepting Internet payments, including two good ones you might not know about. One is Simplify.com, which is part of MasterCard and has done a lot of work on its payment gateway. The other is Stripe.com. Both charge a bit less than 3% per transaction and have no other recurring fees. That is a lot less than just a few years ago, when you had to pay monthly processing and other annoying fees to maintain a merchant account. Stripe even accepts non-dollar currencies, including Bitcoin, and converts them into dollars for you.

Both Stripe and Simplify offer a variety of APIs, tools, code samples, and connectors to various payment-related apps. I like the way Simplify arranges its code samples, as you can see in this screenshot.

Stripe has more third-party plug-ins than Simplify, including more than a dozen just for WordPress. Both offer documentation on webhooks (URLs on your site that the payment service calls to notify your own code when particular events happen), although I think Stripe’s documentation is better. Both also support OAuth for consolidated sign-ons to other SaaS apps without having to store your credentials. Finally, both can operate in a testing or sandbox mode, so you can try various things out and then go live with processing real transactions.
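
To show just how low the bar has gotten, here is roughly what a test-mode charge looks like with Stripe’s Python library. This is a sketch based on the classic Charge API of the era; the key and card token below are placeholders, and Stripe’s current documentation may steer you toward newer objects.

```python
# Minimal test-mode charge with Stripe's Python library (pip install stripe).
# Sketch only: the API key is a placeholder, and "tok_visa" is Stripe's
# built-in test card token, so no real money moves.
import stripe

stripe.api_key = "sk_test_your_key_here"   # test-mode secret key

charge = stripe.Charge.create(
    amount=2500,            # amounts are in cents: this is $25.00
    currency="usd",
    source="tok_visa",      # test card; swap in a real token when live
    description="Invoice #1234",
)
print(charge.id, charge.status)   # e.g. "ch_..." and "succeeded"
```

Flipping from sandbox to live is just a matter of swapping the test key for a production one.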

We have come a long way with online payments, to be sure. Both services let you build payment processing into your website in ways that were unthinkable just a few years ago. I think my dad would be just as amazed as I am.