What becomes editorial collaboration most?

I have written several times about the technologies and processes that enable collaboration (talking about the former here for SiliconAngle in 2023 and appropriate tools here for Biznology’s blog in 2021). My reason for writing about this now is that, having known and worked with and for Dylan Tweney for many years, I was interested in a report he released today about the state of editorial content. In my SiliconAngle post, I describe some of the more successful collaboration efforts down through history, including the codebreakers at Bletchley Park during WWII and the team behind the 2015 Ford Mustang redesign.

A large part of Dylan’s report deals with the collaborative effort that goes into producing content in this AI era, based on a survey of 169 self-selected respondents.

In my many years creating content, I have seen plenty of situations where great content is edited into uninteresting pablum by a group assigned to review it. Now, I am not a member of the “every word out of my word processor is precious” school. But I often cringe when an editor – or a gaggle of them – starts reducing the value of my content rather than adding to it.

Part of the problem here is understanding where and when the “collaboration” actually begins: is it at the first light of an assignment, where 15 stakeholders weigh in on a Zoom call about what the first draft will be? That is unworkable, as many of Dylan’s respondents say with their “too many cooks” comments. Or is it when someone in the workflow wants to back up and start over from another POV that wasn’t recognized initially?

Many people characterize collaboration as a team sport. However, the analogy breaks down when we look more closely. In sports, you have definite rules of play, which are mostly missing in content creation. You also have well defined roles — also MIA. You have leaders who delegate specific tasks (at least the better ones do), but this is often hard to define in the content creation biz. So yes, you need a team. But the idea that “everyone thinks they are an editor, and thinks they are good at it” is just wrong-headed. Eventually, the ref must blow the whistle so play can resume. A better analogy would be a “team of rivals” (apologies to Doris) who have to work together and make up the roles, the rules, and who is in charge as they go along.

I don’t think constant collaboration is necessary at every stage of creating content. At some point, an individual author needs to synthesize all the (often conflicting) points and produce something that tells the story and connects with the eventual audience. This is why AI in its current incarnation is a total fail. In writing my stories I have had interviews with sources that directly contradict each other, or at least appear to at first blush. Dive deeper, and the devil is in the details. That is what makes my experience valuable — yes, you can assign a 20-something to do the initial interviews, but they would probably miss these details. So finding sources and knowing the questions to ask are key ways I collaborate.

Certainly, we need to adopt better project management tools and use them more effectively. Dylan’s report shows that many content creators still use simple things such as GDocs, with a light seasoning of Grammarly. (A side ironic note: GDocs didn’t have real-time collaboration features when it was initially created; Google later purchased that technology and incorporated it into the service.) There is still plenty of content created through serial edits passed back and forth via email. That is still the most common PM method I see among my clients. Maybe I have the wrong clients <G>.

Part of the challenge here is that true editorial management is a lost art. Remember when our pubs had copy desks to manage their workflows? When was the last time a PR agency had something similar? Now that anyone can push a button and post something online, managing this process means saying “don’t push the publish button quite yet.” Most of the big b2b tech sites I have worked for over the past couple of decades have a “post first, copy edit and fix later” philosophy. That is no way to manage anything. When I worked for ReadWrite back around 2012 and ran a bunch of their b2b websites, I had one writer who refused to acknowledge that he worked for me. He was a free agent, and damn the torpedoes, full steam ahead, and how dare I mess with his golden prose? It was an impossible situation, but it was tolerated because he wrote (a lot of) good stuff.

Another challenge is relying on meetings, whether virtual or in person, as a collaboration path. That requires skill, something not all of us have, to bring out the best collaboration from all participants.

Sharing is more than caring. And the real challenge is that collaboration usually carries what is called a “work tax,” because the tools mentioned take the creator out of the context of creation and interrupt the heat of the creative workflow by adding an explicit sharing step. And it is ironic that the people who know tech best are often the ones paying the highest tax.

Me and the mainframe

I recently wrote a sponsored blog for VirtualZ Computing, a startup involved in innovative mainframe software. As I was writing the post, I was thinking about the various points in my professional life where I came face-to-CPU with IBM’s Big Iron, as it once was called. (For what passes for comic relief, check out this series of videos about selling mainframes.)

My last job working for a big IT shop came about in the summer of 1984, when I moved across the country to LA to work for an insurance company. The company was a huge customer of IBM mainframes and was just getting into buying PCs for its employees, including mainframe developers and ordinary employees (no one called them end users back then) who wanted to run their own spreadsheets and create their own documents. There were hundreds of people working on and around the mainframe, which was housed in its own inner sanctum, a raised-floor series of rooms. I wrote about this job here, and it was interesting because it was the last time I worked in IT before switching careers to tech journalism.

Back in 1984, if I wanted to write a program, I had to first create it by typing out a deck of punch cards. This was done at a special station the size of a piece of office furniture. Each card could contain the instructions for a single line of code. If you made a mistake you had to toss the card and start anew. When you had your deck, you would feed it into a specialized card reader that would transfer the program to the mainframe and create a “batch job” – meaning my program would then run sometime during the middle of the night. I would get my output the next morning, if I was lucky. If I made any typing errors on my cards, the printout would be a cryptic set of error messages, and I would have to fix the errors and try again the next night. Finding that meager output was akin to getting a college rejection letter in the mail – the acceptances came in thick envelopes. Am I dating myself enough here?

Today’s developers are probably laughing at this situation. They have coding environments that immediately flag syntax errors, tools that dynamically stop embedded malware from being run, and all sorts of other fancy tricks. If they have to wait more than 10 milliseconds for this information, they complain about how slow their platform is. Code is put into production in a matter of moments, rather than the months we had to endure back in the day.

Even though I roamed around the three downtown office towers that housed our company’s workers, I don’t remember ever setting foot in our Palais d’mainframe. However, over the years I have been to my share of data centers around the world. One visit involved turning off a mainframe for the Edison Electric Institute in Washington DC in 1993, where I wrote about the experience and how Novell Netware-based apps replaced many of its functions. Another involved moving a data center from a basement (which would periodically flood) into a purpose-built building next door, in 2007. That data center housed souped-up microprocessor-based servers which, BTW, would form the beginnings of the massive CPU collections used in today’s z Series mainframes.

Mainframes had all sorts of IBM gear that required care and feeding, and lots of knowledge that I used to have at my fingertips: I knew my way around the proprietary Systems Network Architecture protocols and the proprietary Token Ring networking scheme, for example. And let’s not forget that the mainframe ran programs written in COBOL and used all sorts of other hardware to connect things together with proprietary bus-and-tag cables. When I was making the transition to PC Week in the 1980s, IBM was making the (eventually failed) transition to peer-to-peer mainframe networking with a bunch of proprietary products. Are you seeing a trend here?

Speaking of the IBM PC, it was the first product from IBM that was built with parts made by others, rather than its own stuff. That was a good decision, and it was successful because you could add a graphics card (the first PCs just did text, and monochrome at that), extra memory, or a modem. Or an adapter card that connected to another cabling scheme (coax) and turned the PC into a mainframe terminal. Yes, this was before wireless networks became useful, and you can see why.

Now IBM mainframes — there are some 10,000 of them still in the wild — come with the ability to run Linux and operate across TCP/IP networks, and about a third of them run Linux as their main OS. This is akin to having one foot in the world of distributed cloud computing and one foot back in the dinosaur era. So let’s talk about my client VirtualZ and where they come into this picture.

They created software – mainframe software – that enables distributed applications to access mainframe data sets, using OpenAPI protocols and database connectors. The data stays put on the mainframe but is available to applications that we know and love, such as Salesforce and Tableau. It is a terrific idea, just like the original IBM PC, in that it supports open systems. This makes the mainframe just another cloud-connected computer, and shows that the mainframe is still an exciting and powerful way to go.

Until VirtualZ came along, developers who wanted access to mainframe data had to go through all sorts of contortions to get it — much like what we had to do in the 1980s and 1990s, for that matter. Companies like Snowflake and Fivetran made very successful businesses out of doing these “extract, transform and load” operations into what are now called data warehouses. VirtualZ eliminates these steps, and your data is available in real time, because it never leaves the cozy comfort of the mainframe, with all of its minions and backups and redundant hardware. You don’t have to build a separate warehouse in the cloud, because your mainframe is now cloud-accessible all the time.
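To make the contrast concrete, here is a minimal sketch of what in-place access might look like from the distributed side, assuming a hypothetical OpenAPI-style REST endpoint sitting in front of a mainframe data set. The host, path, and field names are mine for illustration, not VirtualZ’s actual API:

```python
import requests

# Hypothetical gateway that exposes a mainframe data set over a REST/OpenAPI
# interface. Everything below (URL, dataset name, field names) is illustrative.
BASE_URL = "https://mainframe-gateway.example.com/api/v1"

def fetch_policy_records(policy_type: str, limit: int = 100):
    """Query mainframe-resident records in place: no nightly ETL job,
    no copy of the data landing in a separate cloud warehouse."""
    resp = requests.get(
        f"{BASE_URL}/datasets/POLICY.MASTER/records",
        params={"filter": f"policy_type eq '{policy_type}'", "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]

if __name__ == "__main__":
    # A BI tool or CRM connector would do something equivalent under the hood.
    for record in fetch_policy_records("AUTO", limit=5):
        print(record)
```

The design point is the same one made above: the query goes to where the data already lives, instead of shipping the data to where the query tools live.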

I think VirtualZ’s software will usher in a new mainframe era, one that puts us even further from the days of punch cards. But it shows the power and persistence of the mainframe, and how IBM had the right computer for today’s enterprise data, just not the right context when it was invented. For Big Iron to succeed in today’s digital world, it needs a lot of help from little iron.

The end of meetings could be upon us

Last week Shopify’s COO Kaz Nejatian sent a memo to employees saying the company would cancel previously scheduled meetings of more than two people, according to CNN. “No one joined Shopify to sit in meetings,” he wrote. True, that. Larger meetings of 50 or more people would only be allowed on Thursdays. This couldn’t have come at a better time: as companies have shifted to more remote workers, they also have to do a better job with meetings, and often that means less is more.

We have grown to become meeting-dependent. Part of the reason is the ubiquity of group communications tools such as Microsoft Teams and Salesforce’s Slack. Of course, we have always had tools like these (remember Notes and GroupWise?), but the tools have gotten better. Ironically, this means meetings can proliferate and the potential for abuse increases. We’ll see where Shopify ends up in a few months and whether they are successful at taming the meeting monster. I recall that back in the 1990s, Computer Associates used to turn off its corporate email system for several hours during the workday so employees could focus on their real work. That strategy didn’t age well, to be sure.

A survey of Microsoft Teams usage data found that since February 2020, users have seen a 252% increase in their weekly meeting time, and the number of weekly meetings has increased 153%. You can see the trends over the past couple of years in the chart below.

Microsoft found that people are becoming more intentional about taking breaks, avoiding double booking, and establishing meeting-free work blocks, along with having shorter and more ad hoc meetings, according to the Outlook calendar data studied as part of this report. All of these things are great, and perhaps a Shopify-style shock to the overall culture has some chance of success.

As I said, it is about time. I have written about this subject for more than a decade, including this blog post from 2012 about how to be more effective at scheduling meetings and which meeting calendaring software products to use. (Not email!) Adam Enfroy has this comparative review of these tools.

But for the software to be effective, you have to change the culture. Entrepreneur.com has these important takeaways here, including promoting small talk for cementing personal connections, having someone be in charge of the agenda and then keeping things on track, and setting expectations up front. Some other recommendations come from an HR consulting firm and include:

  • Figure out in advance the meeting type (stand-up daily huddle, weekly tactical session, longer strategy session) and make sure everyone’s expectations line up accordingly.
  • Keep in mind one goal is to have a passionate meeting with some healthy conflict to air differences. The meeting leader should be deliberate about drawing out different speakers.
  • Dig deep for any buried conflicts and try to resolve them during the meeting.

Time for some privilege management

Working in infosec, we use the term “privileged access management” to refer to security tools that determine which users have what kinds of rights to access particular applications, devices and networks. But when I read this recent Protocol story (that is the name of the online pub, btw) about a tech writer who turned down a potential job with a software firm because it was using Teams (that is the name of the Microsoft software, btw), I had to stop and think about this.

This is what the Great Resignation has come to? Granted, I am not a big fan of Teams, but heck, that would not be a dealbreaker when considering whether to join a company. At least they aren’t using AOL IM, which was the messaging standard — even for corporations — back in 2006, when I wrote this story for the NY Times.

But still. I guess in these days where it is a job seeker’s market, you don’t have to check your privilege at the Teams web portal, to inelegantly coin a new phrase.

Back in the olden times — say the early 90s — people who wanted to use Macs had trouble getting them approved as their corporate desktop or laptop of choice. Thankfully we have all moved on from that era. So I guess it was only a matter of time before someone as misguided as the dude in the Protocol story would vote with his feet or keyboard or whatever and seek employment elsewhere. “The vibes are off.” What, is he also a music critic?

Now, as a member of the tech writing community, I am embarrassed by this. And unlike the Mac/Windows dichotomy of yore, we are talking about the software this potentially privileged person would use to connect with his peers. And it is a collaborative piece of software: something everyone has to use to derive value from it.

Remember how tech companies used to lure candidates with free food prepared by on-site chefs, well tricked-out workout rooms, and snack closets that could compete with Trader Joe’s? Now I guess companies will have to offer Slack safe spaces (or whatever piece of software offends the next potential new hire). It is a sad day indeed for all of us.

Do you really need to learn calculus?

This week I was talking to a friend of mine who teaches middle school math. It brought back all sorts of memories about my career in math, how I picked my classes over my primary and secondary schooling, and what I would tell my teenage self with the benefit of hindsight, if such a self would actually listen to an adult back then.

I come from a family of math geeks: my sibs and my parents were all good at math, and all of the kids were on the “math team” that met after school to solve problems and compete for prizes. Looking through my old report cards, rarely did I get a grade less than an A in any of my classes. When it came time for college though, I started out as a physics major, quickly changing to math when I got frustrated with all the prerequisites, and eventually graduating with a roll-my-own major that combined independent study classes in science, art and math.

What many parents don’t realize until their kids have been through middle school is that there is in most districts a separation of kids into two math tracks. One is the basic math curriculum which involves teaching algebra, trig, geometry and some statistics by the time you finish high school. The other is a more advanced series of classes that ends with students taking calculus in their senior year in high school. If you are good at math, you end up with the latter program of study.

Why does anyone need to study calculus? There really isn’t any good reason. It is more custom than necessity. In my case, getting calculus “out of the way early” (as I look at it now) allowed me to get AP credit and graduate early from college. It also enabled me to take more advanced math classes. I asked Arnold Seiken, one of my former college math professors, why anyone should take the class. He was mostly bemused by the question: “Calculus was always part of the requirements for graduation – students assume that it is part of the burden of life and just grin and bear it. I assume you took my courses because you liked the jokes. I can’t think of any other reason.” He was right: he was always a crack-up, in class and now in retirement. Interestingly, he told me that he got into math by accident in high school because he couldn’t do the science labs, much as I did. “Math was a fallback for me, I was always breaking stuff in the labs.” He was an excellent teacher, BTW.

When you are a college math major, there are basically two career paths you hear about: teaching or becoming an actuary. I wasn’t all that excited about teaching (although I did dabble in it later in life, teaching both a high school computer class and a graduate business class), and when I took the first of many exams to become an actuary, I got the lowest possible passing score. That first exam is calculus, and my justification for the miserable score was that I hadn’t had any calculus for several years by the time I took the exam. But it didn’t bode well for a career in that field.

But having plenty of math classes – including one on linear algebra taught by Seiken – also gave me a solid foundation for graduate school study of the applied math topics that were part of my degree in Operations Research. That took me to DC to do math modeling in support of government policy analysis, and eventually on to general business technology.

My master’s degree was issued in the late 1970s. Back then we didn’t have personal computers, we didn’t have spreadsheets, and we didn’t have data modeling contests like Kaggle. What we did have was a mainframe on campus that you had to program yourself if you wanted to build mathematical models. Today you can use Excel to solve linear programs and other optimization problems, set up chi-square analyses, run simulations, and do other things that were unthinkable back when I was in school – not that I would know how to do these things now if you forced me to watch a bunch of videos to relearn them.
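To show how routine this has become, here is a minimal sketch (in Python rather than Excel, since it is easier to show in text) of solving a tiny linear program with an off-the-shelf solver, the sort of model we once had to hand-code for the campus mainframe. The numbers are made up for illustration:

```python
from scipy.optimize import linprog

# Toy production-planning problem with made-up numbers:
# maximize profit 3x + 5y (linprog minimizes, so we negate the objective)
# subject to:   x + 2y <= 14   (machine hours)
#              -3x +  y <=  0   (product mix requirement)
#               x -  y <=  2
#               x, y   >=  0
result = linprog(
    c=[-3, -5],                       # negated objective coefficients
    A_ub=[[1, 2], [-3, 1], [1, -1]],  # left-hand sides of the <= constraints
    b_ub=[14, 0, 2],                  # right-hand sides
    bounds=[(0, None), (0, None)],    # x and y are non-negative
)

print("optimal x, y:", result.x)       # roughly [6, 4]
print("maximum profit:", -result.fun)  # roughly 38
```

A few lines of setup and a library call replace what used to be an overnight batch job.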

Seiken reminded me of a part-time job that I had in college, repairing the ancient geometric string models that were used in the 1800s to teach engineering students how to draw conic sections. I didn’t need calculus to restring the models, although it helped to understand the underlying geometry when figuring out the string placement. It did get me started on my path towards problem-solving, though.

And I think that is what I would tell my teenage self. Whether or not I took this or that math class, what I was good at was solving technical problems. Having a math background made it easier for me to pick a more technical career path. Had I not moved into the calculus track, I probably would still have been interested in math of some kind, but I probably wouldn’t have been challenged to take the advanced classes I did as a junior and senior in college. So yes, calculus per se isn’t really relevant to debugging a network protocol stack or figuring out the root cause of a piece of malware, but it does train you to pick apart these problems and find your way to an answer. Now, your kids may have a different path towards developing their own problem-solving skills, but math was my ticket and I am glad I took the path that I did.

FIR B2B podcast #131: How to Run Webcasts and Video Calls

Both Paul Gillin and I have run and participated in various webinars and online meetings over the years. For this podcast episode, we share some of our best practices. There are several things you can do to have great meetings. First is preparing your speakers and planning the presentation. Do you have the right kind of slide deck? For our in-person speaking gigs, we try to minimize the text on our slides to provide more of an experience and set the mood. For a webinar where you don’t necessarily see your audience, your slides serve more as your speaking notes, so your audience can take away your thoughts and remember your major points.

I produce a monthly webinar for the Red Cross that has a dozen speakers and an audience of several hundred. To pull this off with minimal technical issues, my team has put together a lengthy document that recommends how speakers should connect (watch for poor Wi-Fi and don’t use speakerphones) and describes the various roles that different people play during the conference call (master of ceremonies, moderator, timekeeper, slide wrangler, presenter recruiter, chat and notes helpers). Paul and I both suggest using a common slide deck for all speakers, which means getting the slides in order prior to the meeting. With more than a couple of presenters, you should also test your speakers’ audio connections; both of us have had more problems with wonky audio than with video. And settle on a protocol for whether or not to show your face when the meeting starts (and check to see that you are appropriately dressed).

Both of us feel you should always start your meetings promptly: you don’t want to waste time waiting for stragglers. Neither of us particularly likes Skype for Business, although “regular” Skype is fine (most of the time), and we have also used GoToMeeting and Zoom.

Here is an example of a recent speech I gave to an audience of local government IT managers. I also have lots of other tips on how to do more than meetings and improve team collaboration here.

If you would like to listen to our 16-minute podcast, click below:

Good luck with running your own online meetings, and please share your own tips and best practices in the comments. And enjoy the video below.

Picking the right tech isn’t always about the specs

I have been working in tech for close to 40 years, yet it took me until this week to realize an important truth: we have too many choices and too much tech in our lives, both personal and work. So much of the challenge of tech is picking the right product, and then realizing afterwards the key limitations of our choice and its consequences. I guess I shouldn’t complain; after all, I have had a great career figuring this stuff out.

But it really is a duh! moment. I don’t know why it has taken me so long to come to this brilliant deduction. I am not complaining; it is nice to help others figure out how to make these choices. Almost every day I am either writing, researching or discussing tech choices for others. But like the shoemaker’s barefoot children, my own tech choices are often fraught with plenty of indecision, or worse yet, no decision. It is almost laughable.

I was on a phone call yesterday with a friend of mine who is as technical as they come: he helped create some of the Net’s early protocols. We were commiserating about how quirky Webex is when trying to support a conference call with several hundred participants. Yes, Webex is fine for doing the actual video conference itself. The video and audio quality are both generally solid. But it is all the “soft” support that rests on the foibles of how we humans apply the tech: doing the run-up practice session for the conference, notifying everyone about the call, distributing the slide deck under discussion, and so forth. These things require real work to explain to the call’s organizers what to do and how to create standards to make the call go smoothly. It isn’t the tech per se – it is how we apply it.

Let me draw a line from that discussion to an earlier moment, when I worked in the bowels of the end-user IT support department of the Gigantic Insurance company in the early 1980s. We were buying PCs by the truckload, quite literally, to place on the desks of the several thousand IT staffers who until then had a telephone and, if they were lucky, a mainframe terminal. Of course, we were buying IBM PCs – there was no actual discussion because back then that was the only choice for corporate America. Then Compaq came along and built something that IBM didn’t yet have: a “portable” PC. The reason for the quotes is that this thing was enormous. It weighed about 30 pounds and was an inch too big to fit in the overhead bins of most planes.

As soon as Compaq announced this unit (which sold for more than $5000 back then), our executives were conflicted. Our IBM sales reps, who had invested many man-years in golf games with them, were trying to convince them to wait a year for IBM’s own portable PC to come to market. But once we got our hands on an IBM prototype, we could see that the Compaq was the superior machine: first, it was already available. It also was lighter and smaller, ran the same apps, and had a compatible version of DOS. We gave Compaq our recommendation and started buying them in droves. That was the beginning of what were called the clone wars, unleashing a new era of technology choices on the corporate world. By the time IBM finally came out with its portable, Compaq had already put hard drives in its model, so it stayed ahead of IBM on features.

My point in recounting this quaint history lesson is that something hasn’t changed in nearly 40 years: tech reviews tend to focus on the wrong things, which is why we get frustrated when we finally decide on a piece of tech and then have to live with the consequences.

Some of our choices seem easy: who wants to pay a thousand bucks for a stand to sit your monitor on? Of course, some things haven’t changed: the new Macs also sell for more than $5000. That is progress, I guess.

My moral for today: look beyond the specs and understand how you are eventually going to use the intended tech. You may choose differently.

Fast Track blog: Lessons Learned From IT Asset Management

As a citizen developer, trying to manage your IT assets can be tough. Keeping track of such things as programs, servers, policies and procedures requires discipline, organization, and best practices that those of us who were raised in the IT school of hard knocks had to learn along the way. Here are a few tips from the IT pros to help you out.

You can read more on the QuickBase Fast Track blog here.

Quickbase blog: How to Make Scheduling Meetings Easier and More Productive

One thing that hasn’t changed about today’s office environment is that meetings are still very much in force. Certainly there are ways to make their end product – such as the linked spreadsheets poked fun at by this xkcd comic – more productive. But there are other productivity gains to be had with meeting scheduling, tracking, and online calendar technologies as well. Before you dive into any of these, realize that you will probably need more than one tool, depending on your needs. In my post today for the Quickbase blog I talk about various tools that you can use.

Quickbase blog: How Much Code Do You Need to Collaborate These Days?

Today we have the seeming ubiquity of the coding generation: rapid application development can be found everywhere, and it has infected every corporate department. But what is lost in this rush to code everywhere is that you really don’t need to be a programmer anymore. Not because everyone seems to want to become one, but because the best kinds of collaboration happen when you don’t have to write any code whatsoever.

You can read my post about this topic in the Quickbase The Fast Track blog here.