Automated video image analysis will be the next big breakthrough

Remember the Chance the Gardener character in Being There? “I like to watch TV” was his famous line. The problem with most business video is that we produce a lot of unwatchable footage, especially from security camera systems. And there aren’t any Chances around who want to watch it, either.

The trouble is that finding the one or two actionable events in all of that footage isn’t easy. As an example, take the story in today’s NY Times about how the military is being buried under a massive pile of video footage from its Predator drones flying over Pakistan and Afghanistan. The situation is only going to get worse: within a few years, newer drone models will be sending streams from dozens of cameras.

It has already been reported that the drones’ video feeds are available to anyone with some minor software skills – the actual control channels are encrypted, but the video transmissions aren’t. That is because many legitimate people need to see what the drones are broadcasting, and the military hasn’t yet been able to deploy encrypted viewing software for these streams.

In the Times article, a group of soldiers based in Hampton, Virginia sits in front of the screens, watches the feeds in real time, and then makes screengrabs available to the right people via computer chat rooms. I hope for the sake of everyone involved that these chat rooms are encrypted, but the article didn’t say.

So how do we implement our automated Chance the Gardener? There are a couple of technologies that can help here, but they aren’t easy or cheap to implement. One is the telestrator, the device made popular by John Madden and Monday Night Football, where a commentator draws on the screen and you see colored squiggles highlighting what is going on. The ones that Madden and the pros use are very expensive, but there are dozens of telestrator products available for the PC market, including some freeware products such as VideoMage Producer.

Telestrators are nice, but again, someone has to be watching the video and doing the electronic doodling. You need more than the fast-forward button for this – ideally, you want some kind of automated system that can identify actionable moments in the video. That is what the next class of products, called intelligent image analysis, does: computers watch the stream and flag particular activities that a human operator can come back and review later.
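To make the idea concrete, here is a minimal sketch of the sort of thing such a system does, written in Python with the OpenCV library. This is my own illustration, not any vendor’s actual algorithm: it simply compares successive frames and records the timestamps where enough pixels change, so a human only has to review those moments.

    # Flag "interesting" moments in a video by simple frame differencing.
    # Illustration only -- real intelligent image analysis products are far
    # more sophisticated about what counts as an actionable event.
    import cv2

    def flag_motion(path, pixel_threshold=25, min_changed_pixels=5000):
        cap = cv2.VideoCapture(path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
        prev, frame_no, flagged = None, 0, []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
            if prev is not None:
                diff = cv2.absdiff(prev, gray)
                _, mask = cv2.threshold(diff, pixel_threshold, 255, cv2.THRESH_BINARY)
                if cv2.countNonZero(mask) > min_changed_pixels:
                    flagged.append(frame_no / fps)  # seconds worth a human look
            prev, frame_no = gray, frame_no + 1
        cap.release()
        return flagged

    # print(flag_motion("loading_dock.mp4")[:10])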

This kind of intelligent analysis is what the company stoplift.com is doing with its retail checkout analysis systems. Typically, a retail store installs video cameras above each checkout aisle and records what the checker is doing as items pass through the point-of-sale scanning devices. There are all sorts of scams, such as “sweethearting” (passing a confederate free items that aren’t scanned) and looking like you are scanning a bar code when you are really just passing the item around the scanner and over to the bagger. What is needed is a system that ties into your point of sale and can flag when items aren’t rung up at the register. I got to see a demo last week and thought it was way cool. The company claims its software can pay for itself within six months and significantly reduce losses from stolen goods. And the good news is that no one has to watch all the security tapes to catch those few sweetheart moments.
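The clever part is the tie-in to the register. Here is a tiny sketch, with made-up data, of what reconciling the two streams might look like; it is only meant to illustrate the concept and has nothing to do with how StopLift’s product actually works.

    # Reconcile item passes seen on camera with the register's scan log.
    # Hypothetical data and logic, purely to illustrate the concept.
    def flag_unscanned(video_passes, pos_scans, window=3.0):
        """Return timestamps (seconds) of items that crossed the scanner on
        camera but have no matching register scan within `window` seconds."""
        return [t for t in video_passes
                if not any(abs(t - s) <= window for s in pos_scans)]

    # Three items crossed the scanner on video, but only two scans were rung up.
    print(flag_unscanned([12.1, 14.8, 17.5], [12.3, 17.6]))  # -> [14.8]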

Giving thanks to RSS, the most unappreciated technology of the decade

In the ten years since Really Simple Syndication (RSS) was invented, it has been one of those significant technologies that, as Rodney Dangerfield would say, “got no respect.” Providing the connective glue behind most social media, linking various Web sites so content can be posted automatically, Webifying various other protocols — RSS is the tech that most of us now take for granted.

I am not a big fan of “best of the decade” type stories (especially since the decade isn’t really over for another year). But as I was thinking about how far we have come in the past ten years, I thought I would take a moment to appreciate RSS and all that it has done. It is one of those stories of unintended consequences. And what is ironic is how many of us use it every day without realizing it, or even knowing what it does to improve our online lives.

Back at the end of 1999, a few computer scientists at Netscape (talk about underappreciated companies, at least for those of us who weren’t part of their stratospheric IPO), Apple and Microsoft put together the beginnings of the protocol. Aided and abetted by pundit and programmer Dave Winer, RSS began to show up in a variety of odd places, including early Web server software. The early days of RSS spawned a series of specialty software tools called RSS readers that let some of us keep track of new content added to our favorite Web sites without having to cycle through them in our browsers one by one. And that is where things stood for a long while, until the blogosphere and the social Web took off.
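For anyone who never tried one, the basic idea behind those readers fits in a few lines of Python using the feedparser library (the feed URL below is just a placeholder): poll a feed and list whatever has been published since you last looked.

    # Minimal RSS-reader sketch: poll a feed, show items newer than the last visit.
    # Requires the feedparser package; the feed URL is a placeholder.
    import time
    import feedparser

    def new_items(feed_url, last_checked):
        for entry in feedparser.parse(feed_url).entries:
            if entry.get("published_parsed") and time.mktime(entry.published_parsed) > last_checked:
                yield entry.title, entry.link

    a_week_ago = time.time() - 7 * 24 * 3600
    for title, link in new_items("https://example.com/feed.xml", a_week_ago):
        print(title, link)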

Well, those RSS readers were probably the biggest pile of mostly unused software. A few geeks used them, but mostly they were oddities. I recall giving a presentation in 2007 at the New Communications Forum to explain RSS to public relations people, and some of the things I mentioned then still apply to the technology: it is a way to scan information quickly, to be the first on your cubicle block to find something out, and to supplement email as a way to send information to a lot of people quickly.
I will post these slides to Slideshare.net/davidstrom so you can take a look for yourself if you are interested.

Just as a side note, Slideshare.net is itself an interesting outgrowth of RSS: you can notify people on LinkedIn and other sites when you put up new content such as this slide deck.

I still have my collection of RSS feeds somewhere on my hard drive, though I stopped looking at them a few years ago when I realized that I could Google just about anything that actually showed up in those feeds.

The early blogging tools had one big thing going for them: they automatically generated their RSS feeds without any additional software. This made it easy to integrate their content into a wide variety of places, and before you knew it, RSS feeds were an intrinsic part of online software.
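To see why that automatic generation mattered, here is roughly the document a blogging tool produces behind the scenes: a bare-bones RSS 2.0 feed, built here in Python with placeholder titles and links (real feeds add publication dates, GUIDs and more).

    # Sketch of the bare-bones RSS 2.0 feed a blogging tool generates automatically.
    # Placeholder content; real feeds include dates, GUIDs, enclosures and more.
    from xml.etree import ElementTree as ET

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "My Blog"
    ET.SubElement(channel, "link").text = "https://example.com/"
    ET.SubElement(channel, "description").text = "Essays on networking and the Web"

    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = "Giving thanks to RSS"
    ET.SubElement(item, "link").text = "https://example.com/giving-thanks-to-rss"
    ET.SubElement(item, "description").text = "Why RSS is the most unappreciated technology of the decade."

    print(ET.tostring(rss, encoding="unicode"))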

Indeed, it became easier to just review my Facebook, LinkedIn, and Twitter accounts and see what people have posted there than to mess with RSS directly. So we can thank RSS, first of all, for making the concept of a data feed popular on these social networking sites. Now most everyone knows what “post to my Wall” or “take a look at my feed” means – terms that became popular through Facebook but owe their origins to how RSS was constructed.

Thanks to RSS, I can post my content to my WordPress blog, and within a few minutes (or hours, depending on how things are going out on the Interwebs), that content will magically appear in my Twitter feed, on my Facebook profile, in my LinkedIn status, and more. I have tools such as Pixelpipe.com and TubeMogul.com that can send content out to dozens of different places. While many of these tools also rely on other programming interfaces to enable all of this fun and fascinating connectivity, it really got started with RSS and its series of very minimal standards for publishing and subscribing to data feeds.

So let’s start off 2010 with thanks to those early RSS pioneers!

Mediablather podcast with Adam Christensen of IBM

Paul Gillin and I have restarted our ever-popular series of podcasts called MediaBlather. This week we interview Adam Christensen, the head of social media communications for IBM. Just take a look at some of these stats:

  • Internal blogs: 17,000
  • Members of the Beehive social network: 60,000
  • Daily page views on IBM’s internal wiki: 1,000,000
  • IBMers on Twitter: 3,000
  • IBMers on Facebook: 52,000
  • IBMers on LinkedIn: 198,000

You can download and listen to the podcast here.

 

The evolution of Web-based enterprise video

This week Brightcove is launching a new lower-priced video service called Express that starts at $100 a month and offers some impressive features. I’m glad to see them in this space, which is still very much in the pre-Gutenberg era of publishing. I thought I would take this moment to talk about some of the issues involved in publishing Web videos for corporate use, putting aside all the tectonic shifts happening in the Web entertainment arena for another essay.

To put things in perspective, realize that it took only a few years for the Web to evolve from its first crude text-only efforts to a full graphical experience. Yet it has taken more than a decade to get videos inside the browser page. And while there are dozens of video streaming service providers, including Brightcove, Wistia, Fliqz and Kaltura, that offer ways of delivering videos, none of them are as easy to use as they could be, and almost none of them offer one-stop solutions for publishers.

In the last year I have spent a lot of time with video publishing as a result of my five-minute screencast videos, for which I write, review, narrate and produce everything about a particular product. Each video is sponsored by the product’s vendor and appears on my WebInformant.tv site along with 20 other places around the Internet.

Just take a look at the most popular Web content creation tool of the moment, WordPress, as a case in point. If you create your own blog and host it on WordPress.com, you can purchase a “space upgrade” for $20 a year and start uploading video content. But if you decide you want more control over your page design and host your blog on your own Web server, this space upgrade option isn’t available and you have to dive into the nasty world of third-party video player plug-ins, even though you are still using WordPress software. It is these sorts of gotchas that can drive you crazy, or keep me fully employed explaining them.

All of these video services operate in broadly the same way. After you prepare your video, you upload it to their server and annotate it with supporting text, keywords, and other information. You are then given a chunk of HTML code to embed the video player in your Web page. When you view the page, you see a player you can click on to control playback, just as you have come to expect from YouTube et al. The embed code contains tracking information that the service collects, and it offers reports so you can see who watched which videos.

The service that I use at the moment is Wistia.com. Their most basic plan starts at less than $40 a month, and offers some very sophisticated tracking and embedding features. Their video player is very clean and crisp, and I haven’t had too many reports about playback quality issues from my site. I recommend that you start with them and see if they meet your needs, and if not then you might want to ask the following questions:

First, do you need a branded player for your videos? By that I mean having your logo somewhere on the first or final screen, or underneath the video image. For some people, this is important. Some services, like Wistia, offer a single player, while others, such as Brightcove, give you more stylistic choices.

Second, do you need control over the ultimate size of the video image on your Web site? The various hosting services either offer this explicitly, or else (like the basic plan from Fliqz.com) leave it up to you to edit the embed code they provide for you to copy and paste into your Web page. If you have to edit the code manually, you want to maintain the aspect ratio (horizontal to vertical) so your video displays correctly. (It helps if you produce your video at the size it will ultimately appear on your Web site, too.)
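If you do end up hand-editing those width and height values, the arithmetic is simple; here is a quick sketch (the 640 by 360 source size is just an assumption) that keeps the proportions straight for whatever width your page template allows.

    # Keep the player's proportions when resizing it for your page template.
    def scaled_height(target_width, source_width=640, source_height=360):
        """Return the height that preserves the source aspect ratio (16:9 here)."""
        return round(target_width * source_height / source_width)

    print(scaled_height(480))  # a 480-pixel-wide player should be 270 pixels tall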

Third, how big of an audience do you expect for your videos? Given that these are targeted at potential customers and not people looking for the latest skateboarding cats or guys gone wild, you should set expectations accordingly: several thousand views over a period of a few months is a good audience. Some of the services, like Wistia, charge by playbacks per month. Brightcove charges on the number of individual videos and on your bitstream consumption, which is harder to estimate. Kaltura offers a free WordPress plug-in for hosting up to 10 GB of monthly video data.
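If you need to estimate that bitstream consumption, a rough back-of-the-envelope calculation will do; the views, running time and bitrate below are purely illustrative assumptions.

    # Rough monthly bandwidth estimate for a hosted video -- illustrative numbers only.
    views_per_month = 2000      # assumed audience
    video_length_sec = 5 * 60   # a five-minute screencast
    bitrate_kbps = 800          # assumed encoding bitrate, in kilobits per second

    gigabytes = views_per_month * video_length_sec * bitrate_kbps / 8 / 1e6
    print(f"About {gigabytes:.0f} GB of streaming per month")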

Fourth, what kinds of reports and features are available from your service provider? With some services like Fliqz and Brightcove, their more expensive plans give you more features and choices.

Finally, what else is or isn’t included in the service? One of the things that I like about Wistia is the ability to share the video project with a number of collaborators, such as my clients, who can view the video directly, without my having to email them a huge attachment.

As you can see, there is still a lot to deal with when it comes to Web videos. If you have another site that you would like to recommend, please let me know on my Strominator blog. And if you are a subscriber to Sam Whitmore’s Media Survey, you can listen to Sam and me talk about some of these video hosting and production issues on a Webinar that we will host this coming Thursday afternoon. For those of you who aren’t subscribers, I will post my PowerPoint slides on my slideshare.net/davidstrom account here.

Book review: Detecting Malice by Robert Hansen

In his ebook Detecting Malice, Robert Hansen sets himself a difficult task: to compile in one place a variety of attack descriptions and forensic methods for various Internet intrusions. He does a great job of covering the landscape, talking in plain language without a lot of technical jargon and with many clear examples. If you have never read packet captures, this book will be an eye-opener, and if you have some exposure to hacking tools and Web traces, you will do fine with the examples he presents.

Think your Web site is immune from these exploits? Think again. Just about everyone has some kind of exposure, and part of understanding exactly what that exposure is means getting into the bad guys’ mindset and seeing how they can penetrate your servers.

I highly recommend this book; it is well worth the time and money. It will stimulate your thinking and certainly raise your level of paranoia, and perhaps your motivation to lock things down.

Behind the scenes at the Cisco AXP Contest

Today Cisco announced the winners of its AXP contest. If you haven’t heard of the contest before, you aren’t alone. It was an interesting combination of people, places and events. The goal was to design an application for a relatively new add-on module for Cisco routers called the Application eXtension Platform (AXP), a Linux “blade” that allows third-party applications to be integrated with Cisco’s IOS router operating system and network applications. It has its own CPU and can store from 1 GB to 160 GB of data, depending on the model. Here is a more detailed Q&A about the AXP.

Earlier this year, Cisco announced the contest and a $100,000 prize purse. They received 100 submissions from teams around the world, and the three finalists were announced this week. Check out the winning entry from MAD Network here – it is a very clever use of a variety of materials to explain their innovation, and I am sure one of the reasons why they won.

Brian Proffitt, one of the judges in the contest, wrote about his experiences in a blog post here. When I spoke to him, he was very upbeat about his participation. “Initially, I was skeptical that we needed apps there on the AXP, but after seeing the apps from the contestants, I realized that it is a good thing, and they made a believer out of me. It is definitely a platform on which you can build something that is useful for businesses. Cisco could have kept this all to themselves and developed all of their apps in house. By having this contest, they opened the door for people that probably wouldn’t have gotten to it otherwise. They asked people to play with it, and certainly the prize was a big motivation, but this was a very significant move. I am hugely surprised and pleased by the number of international entrants. We had teams from all over the place – South America, Europe, elsewhere. I think this is a product of Cisco’s strength and how well they are known globally. I saw a number of women in the demo videos, which was also good and runs counter to the notion that all coders are men.”

Proffitt, who is the community manager for Linux.com, thinks that this is a very viable model for how you can really get developers into your environment. It was also his first time working with Cisco.

Some good news on Iran’s net connectivity

With all the reports about blocked connections and such, the folks at Renesys have done their usual good and clear-headed analysis of what is working in terms of Internet routing into Iran in their post here. Unlike what has been reported in the general press, things aren’t as simple as a complete blackout, and there could be other factors (such as greater interest from the rest of the world) affecting the traffic patterns observed.

Top talkers on Twitter research

Research from the Harvard Business School has found that “the top 10% of prolific Twitter users accounted for over 90% of tweets. On a typical online social network, the top 10% of users account for 30% of all production. To put Twitter in perspective, consider an unlikely analogue – Wikipedia. There, the top 15% of the most prolific editors account for 90% of Wikipedia’s edits. In other words, the pattern of contributions on Twitter is more concentrated among the few top users than is the case on Wikipedia, even though Wikipedia is clearly not a communications tool. This implies that Twitter resembles more of a one-way, one-to-many publishing service than a two-way, peer-to-peer communication network.”

I would like to see research that shows the relative utility of Twitter vs. social networks as the size of your followers/followed network increases. My thesis is that the bigger your Twittersphere, the less utility it has — the reverse, I would think, is true of social networks.

When to defriend and defollow

When I was growing up as a nerdy teen on Long Island, needless to say I wasn’t one of the Popular Kids. Back then we called it Junior High rather than the current appellation Middle School, and now nerds are the new cool kids. In my youth, we didn’t have reality shows where beauties met their geeks, Bill Gates hadn’t yet gone to, let alone dropped out of, college, and the Steves were still eating fruit rather than making Macs. We didn’t even have computers, phones still had dials on them, and we all watched one of three network TV channels and read newspapers that came in the afternoon. And all of our parents bought American-made cars.

Ok, enough nostalgia. I give this as background to explain my own behavior when I started getting involved in social networks. My first thought was to collect as many “friends” as I could, to grow my network quickly and add just about everyone I had an email address for. Now that I have accumulated a bunch of people on Facebook, LinkedIn, Twitter and Plaxo, I have a different strategy.

I want quality rather than quantity. As my networks have grown – and they still aren’t as large as my college-age daughter’s (see, it is that underdog feeling again) – I have seen the “feed” streams produced by all these people just burying me in the details and status updates of their lives. I try to dip into this vast, deep flow of information on a daily basis, but it quickly overwhelms me. I run back to the relative comfort of my email inbox, where at least I can hit the delete key and pare things down to a reasonable single screen of to-do items, action items and people whose messages I have to return.

Burger King ran a promotion not too long ago where they asked people to defriend 10 Facebook friends in order to get a coupon for a free burger. They were swamped with thousands of requests, thereby establishing the value of a friend at somewhere around a quarter. That is pretty depressing. I always thought a friend was worth at least a couple of bucks, if not more.

I also want to grow my networks more slowly, because like anything else on the Internet, I am concerned about customer retention, and my networks are my customers. You are the people who will (hopefully soon, puh-lease) pay me money to speak at a conference, write an article or white paper, produce a screencast video, or do some custom product consulting. So I don’t want to just spam you with needless updates about what I had for breakfast or insights about my pets or family vacations, although I did get some interesting feedback when I mentioned the books that I read in my last missive.

So I have gotten pickier about who I add to my various networks. And while I don’t want to be as snobby as that Jr. High clique of popular kids, I do think we all need to take a step back and consider what our friending – and more importantly, defriending – policies will be going forward.

Over at Twitter (where my network is still “just” a few hundred followers), there is a lot of activity around third-party apps that will automatically increase your network with all sorts of tricks. This is a bad thing, because those networks become less valuable as their feeds become larger. You will be adding more noise to the signal, and as a result, miss out on the important stuff.

I am still figuring out Twitter, to say the least. But I can tell you that my Twitter activities have saved me a grand total of $140, which is the overdraft fee that Bank of America initially charged me when I deposited a check to the wrong account. Through the miracle of social networks, I was able to tweet my bank, email them the information and get them to call me and correct the problem, and probably keep me as a customer.

Now, I don’t have all the answers here. Or even some of them. And I am glad that I don’t have to deal with the hyper social strata that are Middle School today. But I can take some small comfort that none of my 20-something children have Twitter accounts, at least not yet.