This is my second time at the major Citrix annual conference, and I will be posting regularly during and after the show. My first piece can be found here and covers what I heard from a new management team at Citrix. They introduced their vision for the future of Citrix, and the future of work. “Work is no longer a place you go, it is an activity and digital natives expect their workplace to be virtual and follow them wherever they go. They are pushing the boundaries of how they work,” said Citrix CEO Kirill Tatarinov.
My second post is on Windows Continuum, which brings the Windows 10 interface to a variety of non-traditional IT devices, such as the giant Surface Hub display, Xbox consoles, and Windows Phones. If you only review the information provided by Microsoft, you might get the wrong idea of how useful this could be for the enterprise, and in my post I discuss what Citrix is doing to embrace and extend this interface.
My next piece looks at several infosec products on display at the show, including solutions from Bitdefender, Kaspersky, IGEL and Veridium. Security has been a big focus this year, and I am glad to see these vendors here supporting Citrix products.
Speaking of security, one of the more important product announcements this week at Synergy was that Secure Browser Essentials will be available later this year on the Azure Marketplace. This is actually the second secure browsing product that Citrix has announced, and you can read my analysis of how the two differ and what to consider if you are looking for such a product.
And here is a story about the Okada Manila Resort that was featured as a semi-finalist for the innovation award at the show. It was built on a huge site and is similar to the resort-style properties found in Las Vegas and Macau. It will house 2,300 guest rooms when fully built and employ 10,000 people. The resort's IT department counts at least 100 of them full-time, plus contractors, to support 2,000 endpoints and numerous physical and virtual servers placed in two separate datacenters on the property. I spoke to the IT manager about how he built his infrastructure and some of the hard decisions he had to make.
At his Synergy keynote, Citrix CEO Kirill Tatarinov mentioned that IT “needs a software defined perimeter (SDP) that helps us manage our mission critical assets and enable people to work the way they want to.” The concept is not a new one, having been around for several years. An SDP replaces the traditional network perimeter — usually thought of as a firewall. I talk about what an SDP is and what Citrix is doing here.
Finally, this piece is about the Red Bull Racing team and how they are using various Citrix tech to power their infrastructure. Few businesses create a completely different product every couple of weeks, not to mention take their product team on the road and set up a completely new IT system on the fly. Yet, this is what the team at Red Bull Racing do each and every day.
This week the web has celebrated yet another of its 25th birthdays, and boy does that make me feel old. Like many other great inventions, there are several key dates along the way in its origin story. For example, here is a copy of the original email that Tim Berners-Lee sent back in August 1991, along with an explanation of the context of that message. Steven Vaughan-Nichols has a nice walk down memory lane over at ZDNet here.
Back in 1995, I had some interesting thoughts about those early days of the web as well. This column draws on one that I wrote then, with some current day updates.
I’ve often said that web technology is a lot like radio in the 1920s: station owners are not too sure who is really listening but they are signing up advertisers like crazy, programmers are still feeling around for the best broadcast mechanisms, and fast-changing standards make for lots of shaky technology that barely works on the best of days. Movies, obviously, are the metaphor for Java, audio applets and other non-textual additions.
So far, I think the best metaphor for the web is that of a book: something that you’d like to have as a reference source, entertaining when you need it, portable (well, if you tote around a laptop), and so full of information that you would rather leave it on your shelf.
Back in 1995, I was reminded of the so-called “electronic books” that were a big deal at the time. One of my favorites then was a 1993 book/disk package called The Electronic Word by Richard Lanham, which, ironically, is about how computers have changed the face of written communications. The book is my favorite counter-example of on-line books. Lanham is an English professor at UCLA, and the book comes with a Hypercard stack that shows both the power of print and how unsatisfactory reading on the screen can be. Prof. Lanham takes you through some of the editing process in the Hypercard version, showing before-and-after passages that were included in the print version.
But we don’t all want to read stuff on-line, especially dry academic works that contain transcripts of speeches. That is an important design point for webmasters to consider. Many websites are full reference works, and even if we had faster connections to the Internet, we still wouldn’t want to view all that material on-screen. Send me a book, or some paper, instead.
Speaking of eliminating paper, in my column I took a look at what Byte magazine is trying to do with their Virtual Press Room. (The link will take you to a 1996 archive copy, where you can see the beginnings of what others would do later on. As with so many other things, Byte was so far ahead of the curve.)
Byte back then had an intriguing idea: having vendors send their press releases electronically, so editors wouldn’t have to plow through the printed ones. But how about going a step further in the interest of saving trees: sending in both the links to the vendors’ own websites and whatever keywords are needed. Years later, I am still asking vendors for key information from their press releases. Some things don’t change, no matter what the technology.
What separates good books from bad is good indexing and a great table of contents. We use both in books to find our way around: the former more for reference, the latter more for determining interest and deciding where to enter the pages. So how many websites have you visited lately that have either, let alone have done a reasonable job on both? Not many back in 1995, and not many today, sadly. Some things don’t change.
Today we almost take it for granted that numerous enterprise software products have web front-end interfaces, not to mention all the SaaS products that speak native HTML. But back in the mid 1990s, vendors were still struggling with the web interface and trying it on. Cisco had its UniverCD (shown here), which was part CD-ROM and part website. The CD came with a copy of the Mosaic browser so you could look up the latest router firmware and download it online, and when I saw this back in the day I said it was a brilliant use of the two interfaces. Novell (ah, remember them?) had its Market Messenger CD ROM, which also combined the two. There were lots of other book/CD combo packages back then, including Frontier’s Cybersearch product. It had the entire Lycos (a precursor of Google) catalog on CD along with browser and on-ramp tools. Imagine putting the entire index of the Internet on a single CD. Of course, it would be instantly out of date but you can’t fault them for trying.
The reason vendors combined CDs with the web was that bandwidth was precious, and sending images down a dial-up line was painful. (Remember that the first web browser shown at the top of this column was text-only.) If you could offload those images onto a CD, you could have the best of both worlds. At the time, I said that if we wanted to watch movies, we would go to Blockbuster and rent one. Remember Blockbuster? Now we get annoyed if our favorite flick isn’t available to stream immediately.
Yes, the web has come a long way since its invention, no matter which date you choose to celebrate. It has been an amazing time to be around and watch its progress, and I count myself lucky that I can use its technology to preserve many of the things that others and I have written about it.
With the number of coding-for-cash contests, popularly called hackathons, exploding, now might be the time to consider spending part of your weekend or an evening participating, even if you aren’t a hard-core coder. Indeed, if you are one of the growing number of citizen developers, you might be more valuable to your team than someone who can spew out tons of Ruby or Perl scripts on demand. I spoke to several hackathon participants at the QuickBase EMPOWER user conference last month to get their perspective. You can read my post on QuickBase’s Fast Track blog today.
When I was growing up, one of my childhood heroes was Clyde Tombaugh, the astronomer who discovered Pluto. Since then, we have demoted Pluto from its planetary status, but it still was a pretty cool thing to be someone who discovered a planet-like object. Today, you have the opportunity to find a new planet yourself, and you don’t even need a telescope or have to spend lonely, cold nights at some mountaintop observatory. It is all thanks to an aging NASA spacecraft and the way the Internet has transformed the role of public and private science research.
Let’s start at the beginning, seven years ago, when the Kepler spacecraft was launched. Back then, it was designed to take pictures of a very small patch of space that had the most likely conditions for finding planets orbiting far-away stars. (See above.) By closely scrutinizing this star field, the project managers hoped to find variations in the light emitted by stars that had planets passing in front of them. This transit technique is a time-tested one: astronomers have been observing transits, such as Venus passing across the face of the Sun, for centuries. When you think about the great distances involved, it is pretty amazing that we have the technology to do this.
Since its launch, key parts of the spacecraft have failed, but researchers have figured out how to keep it running, using the pressure of sunlight to keep the cameras properly aligned. As a result, Kepler has been collecting massive amounts of data and downloading the images faithfully over the years, and more than 1,000 Earth-class (or M class, from Star Trek) planets have already been identified. There are probably billions more out there.
NASA has extended Kepler’s mission as long as it can, and part of that extension was to establish an archive of the Kepler data that anyone can examine. This effort, called Planethunters.org, is where the search for planets gets interesting. NASA and various other researchers, notably from Chicago’s Adler Planetarium and Yale University, have enlisted hundreds of thousands of volunteers from around the world to look for more planets. You don’t need a physics degree, a sophisticated computer, or any Big Data algorithms. All you need is a keen mind, sharp eyesight to pore over the data, and the motivation to spot a sequence that would indicate a potential planetary object.
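The kind of sequence volunteers look for can be sketched in a few lines of code: a run of measurements where the star's brightness dips slightly below normal as a planet crosses its face. This is a toy illustration, not NASA's actual pipeline; the light curve values and thresholds are invented.

```python
# Toy transit search: flag runs of low flux in a normalized light curve.
# The star's brightness hovers near 1.0; a transiting planet causes a
# small, repeating dip. All numbers here are made up for illustration.

def find_dips(flux, threshold=0.99, min_length=2):
    """Return (start, end) index pairs of runs where flux stays below threshold."""
    dips, start = [], None
    for i, f in enumerate(flux):
        if f < threshold and start is None:
            start = i
        elif f >= threshold and start is not None:
            if i - start >= min_length:
                dips.append((start, i))
            start = None
    if start is not None and len(flux) - start >= min_length:
        dips.append((start, len(flux)))
    return dips

# A fake light curve with two dips, the signature of a possible transit.
light_curve = [1.00, 1.00, 0.98, 0.97, 0.98, 1.00, 1.00,
               1.00, 1.00, 0.98, 0.97, 0.98, 1.00, 1.00]
print(find_dips(light_curve))  # → [(2, 5), (9, 12)]
```

Two dips at a regular spacing hint at an orbital period, which is exactly the pattern both the automated algorithms and the human eyeballs are hunting for.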
What is fascinating to me is how this crowd-based effort has been complementary to what has already happened with the Kepler database. NASA admits that it needs help from humans. As they state online, “We think there will be planets which can only be found via the innate human ability for pattern recognition. At Planet Hunters we are enlisting the public’s help to inspect the Kepler [data] and find these planets missed by automated detection algorithms.”
Think about that for a moment. We can harness the seemingly infinite computing power available in the cloud, but it isn’t enough. We still need carbon-based eyeballs to figure this stuff out.
Planet Hunters is just one of several projects hosted on Zooniverse.org, a site devoted to dozens of crowdsourced “citizen science” efforts that run the gamut of research. Think of what Amazon’s Mechanical Turk does by parceling out pieces of data for humans to classify and interpret. But instead of helping some corporation, you are working together on a research project. And it isn’t just science research: there is a project to help transcribe notes from Shakespeare’s contemporaries, another to explore WWI diaries from soldiers, and one to identify animals captured by webcams in Gorongosa National Park in Mozambique. Many of the most interesting discoveries from these projects have come from discussions between volunteers and researchers. That is another notable aspect: in the past, you needed at least a PhD or some kind of academic street cred to get involved with this level of research. Now anyone with a web browser can join in, and thousands have signed up.
Finally, the Zooniverse efforts are paying another unexpected benefit: participants are doing more than looking for the proverbial needle in the haystack. They are learning about science by doing actual science research. It takes something dry and academic and makes it live and exciting. And the appeal isn’t just to adults, but to kids too: one blog post on the site showed how a group of nine-year-old Czech kids got involved in one project. That, to me, is probably the best reason to praise the Zooniverse efforts.
So far, the Planet Hunters are actually finding planets: more than a dozen scientific papers have already been published, thanks to these volunteers around the world on the lookout. I wish I could have had this kind of access back when I was a kid, but I also have no doubt that Tombaugh would be among these searchers, had he lived to see this all happening.
While there are many Web hacking exploits, none are as simple or as potentially destructive as SQL injection. This isn’t news: the attack method has been around for more than a decade. Sadly, for something so old it is still one of the most popular ways to penetrate networks and extract data. And it is easy to find and almost as easy to avoid. Why is SQL injection still with us? It all comes down to a lack of understanding about how SQLi vulnerabilities work.
You can read my post in Veracode’s blog here.
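To make the mechanics concrete, here is a minimal sketch of the vulnerability and its standard fix, using Python's built-in sqlite3 module and a made-up users table. The fix is the one every mainstream database driver supports: parameterized queries.

```python
import sqlite3

# Set up a throwaway in-memory database with one user table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, password TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, password):
    # DON'T DO THIS: user input is pasted straight into the SQL string.
    query = "SELECT * FROM users WHERE name = '%s' AND password = '%s'" % (name, password)
    return db.execute(query).fetchall()

def login_safe(name, password):
    # Parameterized query: the driver treats input as data, never as SQL.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return db.execute(query, (name, password)).fetchall()

# The classic injection string rewrites the WHERE clause into a tautology.
attack = "' OR '1'='1"
print(len(login_vulnerable("alice", attack)))  # → 1: attacker is "logged in"
print(len(login_safe("alice", attack)))        # → 0: input treated as a literal
```

The whole fix is one line of discipline, which is why it is so frustrating that SQLi is still with us a decade on.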
I wanted to bring my winter coat in to the cleaners (maybe optimistically a week or so too soon), and in cleaning out the various pockets I came across some cash and a receipt dated last December. It got me thinking about how long it has been since I actually used cash.
What a difference from my dad’s world. My dad dealt with millions of dollars every day as a comptroller and always carried a wad of cash worthy of a mafia don. I still have his money clip somewhere. I put the few bills on my desk as a reminder and then thought about how the world has changed. Paying in cash is certainly becoming less common.
Most of my customers still pay me with paper or electronic checks, a few go through Paypal, and every once in a while I get asked to accept credit cards. Now there are many options for accepting Internet payments, including two good ones that you might not know about. One is Simplify.com, which is part of MasterCard and has done a lot of work developing its payment gateway. The other is Stripe.com. Both charge a bit less than 3% per transaction and have no other recurring fees. That is a lot less than just a few years ago, when you had to pay monthly processing and other annoying fees to maintain a merchant account. Stripe even accepts non-dollar currencies, including Bitcoin, and converts them into dollars for you.
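To see why the flat per-transaction pricing matters, here is a rough break-even sketch. The ~3% gateway rate comes from the numbers above; the old-style merchant account figures (a lower rate plus a monthly fee) are hypothetical, so treat the output as illustrative only.

```python
# Rough cost comparison: a flat ~2.9% gateway with no recurring fees vs.
# an old-style merchant account with a monthly fee. The merchant-account
# rate and fee below are hypothetical examples, not quotes.

def gateway_cost(monthly_volume, rate=0.029):
    return monthly_volume * rate

def merchant_account_cost(monthly_volume, rate=0.022, monthly_fee=50.0):
    return monthly_volume * rate + monthly_fee

for volume in (1_000, 5_000, 10_000):
    g = gateway_cost(volume)
    m = merchant_account_cost(volume)
    print(f"${volume:>6}/mo: gateway ${g:.2f} vs merchant account ${m:.2f}")
```

At low monthly volumes the no-fee gateway wins easily, which is exactly why these services appeal to freelancers and small shops; only at higher volumes does a discounted merchant-account rate start to pay for its monthly overhead.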
Both Stripe and Simplify offer a variety of APIs, tools, code samples, and connectors to various payment-related apps. I like the way Simplify arranges its code samples, as you can see in this screenshot.
Stripe has more third-party plug-ins than Simplify, including more than a dozen just for WordPress. Both offer documentation on webhooks, which are URLs that trigger short pieces of your code when particular events occur, although I think Stripe's documentation is better. Both also support OAuth for consolidated sign-ons to other SaaS apps without having to store your credentials. Finally, both can operate in a testing or sandbox mode, so you can try things out before going live with real transactions.
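To show what a webhook amounts to in practice, here is a minimal sketch of the receiving side: the gateway POSTs a JSON event to a URL you register, and a short piece of your code routes it by event type. The event names and fields here are hypothetical; check each service's documentation for the real ones.

```python
import json

# Minimal webhook dispatcher. A real deployment would sit behind a web
# framework route and verify the request's signature; this sketch just
# shows the parse-and-dispatch core. Event names/fields are invented.

handlers = {}

def on(event_type):
    """Register a handler function for one event type."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on("payment.succeeded")
def payment_succeeded(event):
    return "thank %s for $%.2f" % (event["customer"], event["amount"] / 100)

def handle_webhook(raw_body):
    """Parse the POSTed JSON body and dispatch to the matching handler."""
    event = json.loads(raw_body)
    handler = handlers.get(event["type"])
    if handler is None:
        return "ignored: %s" % event["type"]
    return handler(event)

body = json.dumps({"type": "payment.succeeded", "customer": "alice", "amount": 1999})
print(handle_webhook(body))  # → thank alice for $19.99
```

The sandbox modes both services offer exist precisely so you can fire test events at a handler like this before any real money moves.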
We have come a long way with online payments to be sure. Both services allow you to build in payment processing to your website in ways that were unthinkable just a few years ago. I think my dad would be just as amazed as I am.
Based on a white paper that I wrote for them earlier in the year, I am holding a webinar next week with the above focus. Joining me will be David S. Linthicum, SVP of Cloud Technology Partners, and Brandon Elliott, Chief Technologist at Rackspace. We will examine the infrastructure needs of customer-facing applications by looking at the challenges faced by businesses in the most demanding industries. The webinar will provide a framework for evaluating technology decisions from the perspective of customer experience quality and suggest metrics that can help businesses justify and benchmark the success of their investments.
If you are trying to improve global access to your applications, you have probably considered one of several solutions: stringing together your own private network, purchasing WAN optimization appliances, or using a managed cloud-based service provider. Figuring out the benefits of each solution isn’t easy and it is hard to test for variations in Internet connectivity, specific applications and other conditions.
But what if a vendor could show you exactly what the benefit would be in a particular use case, so you could understand what they are delivering? I got Aryaka to do just that. You can read my post in Network World today here.
This article was written by Jesse Jacobsen, who is a web content writer at TechnologyAdvice. He covers a variety of topics, including business intelligence, project management, and analytics software. Connect with him on Google+.
Most professionals have used bar graphs and Excel pie charts to present data. At its most basic, this is what’s known as data visualization, a growing feature of business intelligence software. However, such charts are often too simplistic to convey complex data sets. That’s where today’s advanced data visualization tools come in. With them, it’s easier than ever to manipulate data sets, visualize trends, and find competitive insights. Let’s look at some of the most useful data visualizations and show how they can provide better insight into your company’s data.
A streamgraph is a stacked area graph that displays data around a central axis. By arranging the information along a time-based axis, streamgraphs allow users to compare the ebb and flow of different data sets.
For example, in 2008 the New York Times created an interactive streamgraph that displays the ebb and flow of box office receipts for movies released between 1986 and 2008. It highlights the aesthetic nature of such diagrams, and how they can be used for quickly displaying comparisons.
In addition to being an interesting way to display cultural information, streamgraphs can be used to provide business insight. For example, suppose a clothing company sells red, blue, and yellow shirts. By visualizing the daily or weekly sales figures for each shirt, the company can observe how sales ebb and flow based on the time of day, the day of the week, or even the month. Observations about product popularity can lead to competitive adjustments in inventory ordering, marketing strategy, and even product development.
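The layout math behind a streamgraph is simple enough to sketch: stack the series, then shift the whole stack down by half its total at each time step so the bands straddle the central axis (the symmetric baseline described in the streamgraph literature). The shirt-sales figures below are invented.

```python
# Streamgraph layout sketch: for each time step, stack the layers and
# center the stack on zero, so the bands flow around a central axis.
# Sales figures are made up for illustration.

def streamgraph_layers(series):
    """Given {name: [values per time step]}, return {name: [(bottom, top)]}."""
    names = list(series)
    steps = len(next(iter(series.values())))
    totals = [sum(series[n][t] for n in names) for t in range(steps)]
    layout = {}
    for t in range(steps):
        bottom = -totals[t] / 2          # shift the stack so it is centered
        for n in names:
            top = bottom + series[n][t]
            layout.setdefault(n, []).append((bottom, top))
            bottom = top
    return layout

weekly_shirt_sales = {
    "red":    [12, 18, 25, 14],
    "blue":   [20, 16, 10, 22],
    "yellow": [ 8, 10, 15,  9],
}
layout = streamgraph_layers(weekly_shirt_sales)
print(layout["red"][0])  # → (-20.0, -8.0), red's band at week 0
```

Feeding these (bottom, top) pairs to any area-fill plotting routine produces the river-like shape; charting tools that offer streamgraphs do essentially this behind the scenes.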
Treemapping is a method for displaying hierarchical data as a set of space-constrained, nested rectangles. The visualization is typically drawn within one larger rectangle, with its surface area divided into segments whose sizes correspond to data values.
Because data in treemaps can be grouped by similarity or relevance, they are a great way to visualize categorical data. The Observatory of Economic Complexity did just that in its treemap of products exported by the United States in 2009.
By grouping exports into categories like machines, transportation, and vegetable products, this treemap compares diverse data in a way that’s easy to grasp. Companies with a diverse array of products can use treemaps to provide valuable insight into sales data or to evaluate an organization’s budget in a more accessible way.
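The layout idea is easy to sketch in code. Here is one level of the classic slice-and-dice treemap algorithm: carve a rectangle into strips whose areas are proportional to each category's value (a full implementation would recurse into each strip, flipping the cut direction at each level of the hierarchy). The export figures below are invented for illustration.

```python
# One level of slice-and-dice treemapping: divide a rectangle into strips
# whose areas are proportional to the values. Figures are invented.

def treemap(items, x, y, w, h, horizontal=True):
    """items: list of (label, value). Returns [(label, x, y, w, h)] rectangles."""
    total = sum(v for _, v in items)
    rects = []
    for label, value in items:
        share = value / total
        if horizontal:                      # slice along the width
            rects.append((label, x, y, w * share, h))
            x += w * share
        else:                               # slice along the height
            rects.append((label, x, y, w, h * share))
            y += h * share
    return rects

exports = [("machines", 40), ("transportation", 25),
           ("chemicals", 20), ("vegetable products", 15)]
for label, rx, ry, rw, rh in treemap(exports, 0, 0, 100, 100):
    print(f"{label:>20}: area {rw * rh:.0f}")
```

Because every rectangle's area maps directly to its value, the eye can compare categories at a glance, which is the whole appeal of the technique for sales or budget data.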
Geolocation-based visualization modules display data on, you guessed it, a map. While this sounds like a simple concept, different use cases continually demonstrate how this technology can be manipulated to provide business insight.
Companies commonly use mapping to display store locations or product availability. Many companies include similar mapping capabilities on their websites, which guide customers to the closest store. Many BI vendors take mapping visualization to the next level by including temporal data. This allows users to view geographic trends over time for further insight into behavioral patterns. For instance, Foursquare displayed the “pulse” of New York by animating a map of how commuters’ “check-ins” shift over the course of a day.
Temporal mapping can also be useful for businesses. If your company is looking to expand to a new city, for instance, temporal mapping (combined with analytics) can provide valuable insight into where the most receptive audience for your product is.
Network visualization displays the connections of information or systems over time. While network displays can illustrate simple two-way connections, they can also illustrate complex temporal relationships.
In June, the New York Times created an interactive network visualization that displays how club teams and national teams are connected in the 2014 World Cup. Users can scroll over any information bubble to more clearly see the relationships, including the name of the player that makes the connection.
Network visualization is an effective tool to observe and understand the relational structure of business operations, such as how acquisitions and changes in leadership affected employee retention and division management. Understanding your data through visualization modules can provide you with the information you need to get ahead of the competition.
Stealing content from websites is all too common, but there is a way to protect yourself with a new tool from ScrapeDefender.com. You can track and distinguish scraping bots from normal visitors, and monitor your site in near real time too.
We tested their service in February 2014 against several websites, including our own.
Pricing starts at $79 per month for the basic service.