Utopia or Dystopia?

I’ve spent the past few posts thinking about how we need to Reimagine Information Governance for the future. More specifically:

Post #1 - Reimagining Information Privacy.

Post #2 - Reimagining Information Security.

Post #3 - Reimagining Information Authenticity.

Post #4 - Reimagining Information Retention and Disposition.

Post #5 - Reimagining the Information Governance Profession.

I thought I would spend THIS post thinking about the technology context for all of this Reimagining: what is going on in the broader universe of technology?

Two years ago I wrote a piece about future visions of technology innovation. In the wake of the utter zaniness of the past two years, I thought I would take another look at it and update it for the post-COVID world.

The first industrial revolution was driven by the advent of steam power in the 18th and 19th centuries and laid the fertile ground for the original Luddites.

General Ludd

“Luddite” is now a blanket term used to describe people who dislike new technology, but its origins date back to a 19th century labor movement that railed against the economic fallout of the Industrial Revolution. The original Luddites were British weavers and textile workers who objected to the increased use of automated looms and knitting frames. Most were trained artisans who had spent years learning their craft, and they feared that unskilled machine operators were robbing them of their livelihood. When their appeals for government aid and assistance were ignored, a few desperate weavers began breaking into factories and smashing textile machines. (History.com -- Who Were the Luddites?)

The second industrial revolution was driven by innovations in communications (radio and TV), transportation (planes and cars) and the mass availability of the electricity needed to drive it all. Lest we be too hard on the original Luddites, there are Luddites in the middle of every technology revolution. The president of the Michigan Savings Bank advised Henry Ford’s lawyer, Horace Rackham, not to invest in the Ford Motor Company: “The horse is here to stay but the automobile is only a novelty – a fad.”

The bulk of my career has been spent in the middle of the third industrial revolution, driven by computing. This era began in the 1960s and 70s but shifted into high gear in the 80s and 90s, with massive increases in computing capability and connectivity, Moore’s Law-driven declines in technology prices, and mobile- and cloud-driven business model disruption.

It is hard to overstate the dramatic changes that have occurred during this third industrial revolution. In the span of a single career, I’ve moved from dreaming about a $149 TI calculator to replace my slide rule to summoning 4 terabytes of storage to my door for $95 with a voice command to a device in my kitchen. This is despite such well-known predictions as “I predict the Internet will soon go spectacularly supernova and in 1996 catastrophically collapse” (Robert Metcalfe, founder of 3Com) and “There’s no chance that the iPhone is going to get any significant market share” (Steve Ballmer).

And all of this will pale relative to the changes that will be brought about by the fourth industrial revolution. The fourth industrial revolution will be driven by artificial intelligence, machine learning and cloud computing and will offer all sorts of opportunities for radically redefined and streamlined processes and ways of working.

I have always been a technology enthusiast, eager to try the latest technology. And on the cusp of the fourth industrial revolution, I still am. But the future's not without a few clouds, and I don’t mean just cloud computing.

At the risk of generating my own set of Luddite predictions that will look foolish 20 years from now, I worry about a rising sense of disorientation about what this fourth industrial revolution will bring. I don’t think we have to go as far as Ray Kurzweil in predicting that the technological singularity -- the crucial moment when machines become smarter than humans -- will occur in our lifetime. But I do think these rising forces of disorientation threaten to undermine the speed with which the next generation of technologies is adopted -- unless we all start, collectively, to address these concerns.

Disorientation 1 -- Rising concerns about technology itself.

All of the revelations about the violations of trust that have occurred in the social media space are generating a broad sense of technological unease, in which technology innovation has shifted from an unalloyed good to something more mixed. Facebook’s swing from Technology Belle of the Ball to Technology Unwanted Guest has occurred seemingly in the blink of an eye. For example, see Can Facebook Be Fixed?, and ponder that this post was written BEFORE whistleblower Frances Haugen appeared on 60 Minutes. And all of the Meta repositioning in the world -- and all of the claims that Facebook has really wanted regulation all along -- are not going to put that genie back in the bottle.

We have not even begun to acknowledge the implications of the asymmetrical value in the vast quantities of information now being collected. The accumulated information from consumer-facing data gathering exercises has FAR more value to the party that accumulates it than to the individual who provides it. And when that asymmetrical value is at the very heart of business models -- think Facebook, Google, Ancestry, Amazon and countless others -- “fixing” the privacy problem will not be easy.

I often think back to my invitation to Sir Tim Berners-Lee to be an AIIM keynoter -- in 1998! -- and his charge to the AIIM attendees -- many of whom had no idea who he was or why he was there. He urged the attendees to think about four key emerging challenges at the intersection of markets, culture, and the web:

  1. [Automated] Agents: Will they stabilize or destabilize markets?

  2. Lack of geography [in the web]: Will it polarize or homogenize culture?

  3. Web access: Will it be the great divider or great equalizer?

  4. And finally: Will the web generate jealousy and hatred, or peace, love and understanding?

These challenges seem simple and naive in retrospect. But they were exactly the things we SHOULD have been discussing over the past twenty years as we adopted a model of innovation financed largely by trading access and privacy for convenience.

It is fascinating to watch the current generation of Millennial digital natives struggle with the broader implications of technology for their children. On the one hand, my eight-year-old granddaughter Lucy can do remarkable things with technology. On the other hand, my son and daughter-in-law constantly wrestle with how much screen time is advisable and how to teach someone to live in a world that is always on and always connected. But how do you reconcile conflicting messages about “screens” when children spent an entire year staring at their screens in order to go to school? According to the Pew Research Center, 32% of Americans feel that over the next decade, overall well-being will be more harmed than helped by digital life.

Disorientation 2 -- Radically redefined work and workplaces

The exponential rate of change that is the hallmark of the fourth industrial revolution will inevitably change the nature of work and where and how work is done. According to McKinsey & Company (What the future of work will mean for jobs, skills and wages):

We estimate that between 400 million and 800 million individuals could be displaced by automation and need to find new jobs by 2030 around the world….For advanced economies, the share of the workforce that may need to learn new skills and find work in new occupations is much higher: up to one-third of the 2030 workforce in the United States and Germany, and nearly half in Japan.

That’s a lot of disruption! Even though there is a strong argument that the fourth industrial revolution will create more jobs than it destroys, AI and machine learning will undoubtedly change how and where those jobs are distributed -- and to whom. Christopher Hernaes notes (Is technology contributing to increased inequality?), “The next wave of intelligent automation will strike hard at...the middle class: Classic middle-income white-collar jobs, such as bank tellers, insurance underwriters, loan officers and case-file workers. Basically, every job that includes following the rules and making few decisions.” The Washington Post echoes this: “With advances in artificial intelligence, any job that requires the analysis of information can be done better by computers. This includes the jobs of physicians, lawyers, accountants, and stock brokers.”

And all of that was written BEFORE COVID. I don’t think we’ve even begun to understand the long-term implications of the past 18 months on how and where we work. Organizations deployed more collaborative technology to more people in six months than they typically would have over five years. Some organizations think, “Well, we’ll just have everyone come back now.” Nice theory, Darwin, but not so fast.

I know one company that convened a meeting, before the Delta variant emerged, to discuss having all the sales staff return to work. Fine idea, except everyone had moved out of the area during COVID. The coming months will be a time of reckoning for many organizations. Will policies be agile enough to embrace employee flexibility that stretches from the “stay-at-home” no-commute desires of a young parent with kids to a recent college graduate who likes the idea of going to an office (but also doesn’t want to be told that they MUST go all the time)? These tricky times will require new, flexible approaches -- never an easy thing for organizations trying to manage HR consistently at scale.

Disorientation 3 -- Uneven distribution of the benefits of technology

Perhaps one of the most challenging aspects of the fourth industrial revolution is this -- while it is difficult to deny the productivity benefits that will be generated by the next wave of technology innovation, it does not necessarily follow that these benefits will be equitably distributed. Andrew McAfee and Erik Brynjolfsson (In The Great Decoupling of the US Economy and in many other places) have done great work highlighting the distributional impact of technology in a disruptive era: “…while digital progress grows the overall economic pie, it can do so while leaving some people, or even a lot of them, worse off.”

The issues this generates are complicated because they cut to the heart of core assumptions about the economy and about who we are, or imagine ourselves to be, especially in America. Neither political party in the U.S. seems to fully understand them.

Back in its Horatio Alger days, America was more fluid than Europe. Now it is not. Using one-generation measures of social mobility—how much a father’s relative income influences that of his adult son—America does half as well as Nordic countries, and about the same as Britain and Italy, Europe’s least-mobile places. (The Economist, Repairing the Rungs on the Ladder)

The pandemic highlighted and heightened these tensions. Not everyone is a high-salaried knowledge worker who can work just as easily in a Starbucks as in an office. We also discovered that some of our most highly compensated employees are not necessarily the most critical employees during a pandemic.

To return to the original Luddites: that story did not end well for them. But importantly, neither did it halt the march of technology. Per Wikipedia:

The British government sought to suppress the Luddite movement with a mass trial at York in January 1813, following the attack on Cartwright’s mill at Rawfolds near Cleckheaton. The government charged over 60 men, including Mellor and his companions, with various crimes in connection with Luddite activities. While some of those charged were actual Luddites, many had no connection to the movement. Although the proceedings were legitimate jury trials, many were abandoned due to lack of evidence and 30 men were acquitted. These trials were certainly intended to act as show trials to deter other Luddites from continuing their activities. The harsh sentences of those found guilty, which included execution and penal transportation, quickly ended the movement.

As we move into the fourth industrial revolution, it will not be enough for solution providers to simply develop great products. Collectively -- and this is a place where communities like MER can play a meaningful role -- solution providers must be part of a broader conversation about technology, one conducted not just in technical terms -- the terms most comfortable for most technology people -- but in societal ones.

What do YOU think?

——-

Early registration is now open for the 2022 MER Conference – in person! The MER Conference 2022 agenda is taking shape and will feature keynote presentations and interactive sessions on the most important Information Governance topics facing business today and in the future.

Early registration information is HERE - Agenda details will be fleshed out in the next few weeks.

https://www.merconference.com/page/1901703/registration
