One of the great lessons of the digital revolution is that the way we live our daily lives can be radically changed within the timeframe of a decade or less. Often, we barely notice what is taking place – we buy a device, download an app, discover a game, and soon the way we spend our time and the things we think about are transformed. Who would have believed in 2006, a year before the release of the first iPhone, that a decade later we would all be tightly tethered to smartphones, addicted to the constant fix of the latest text, tweet, Snapchat or animated GIF? How did our instant access to limitless information become so easy via desktop computers, then laptops, then tablets, then handheld screens? Would we have understood just ten years ago that our world would revolve around access to WiFi or a strong LTE signal? That the expectations of holding a job would include 24/7 attention and constant contact with your supervisor? That we could avoid owning a car because of a thing called Uber? That our monthly budget would absolutely require spending $100 for a phone, $100 for Internet service and another $100 for cable/satellite/Netflix/Hulu/Spotify/etc.?
So with an understanding of what can happen in stealth mode over the course of just a few years, we should all pause to consider the fundamental changes underway in the workplace. For the current generation of students preparing for their careers, the retooling of the vocational landscape should be considered the most serious threat to our happy and productive lives. Put simply, the jobs many of us are training for are likely to be gone a decade from now. Maybe sooner. How will we make a living when we are 35? It’s not a theoretical question – it’s a sobering reality. And yes, we have technology to blame.
What is taking place is best understood with a historical perspective and an understanding of a concept called “decoupling.” Following World War II, as the industrial revolution reached full maturity, the productivity of American workers rose in lockstep with the creation of new jobs. The nation’s economy expanded, the Gross Domestic Product grew, and the number of working adults grew at a similar rate. This allowed millions of people without a college education to maintain a decent standard of living, a benefit of their skills in manufacturing-related occupations. But as the digital revolution took hold, businesses found that they could continue to grow productivity without adding workers. Since 2000, growth in GDP has become decoupled from employment growth. This trend is only going to accelerate as robotics and artificial intelligence begin to make major inroads into the economy.
Just as it took some time before the steam engine exerted its full influence over the industrial revolution, we are at the early stages of the impact of the digital revolution on our modern society. But the timeframe for change is now highly accelerated compared with the past. Here is a simple fact that illustrates what is taking place: in 1964, the monolithic telecommunications company, AT&T, was worth $267 billion in today’s dollars and employed 758,611 people. Today’s comparable monopoly, Google, was worth $468 billion in 2015, yet it employed fewer than 60,000 people.
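The gap behind that comparison is easy to make concrete. A back-of-the-envelope calculation, using only the figures quoted above, shows market value per employee roughly twenty times higher at Google than at the 1964 AT&T:

```python
# Back-of-the-envelope comparison of market value per employee,
# using only the figures quoted in the text above.

att_value, att_employees = 267e9, 758_611       # AT&T, 1964 (in today's dollars)
google_value, google_employees = 468e9, 60_000  # Google, 2015

att_per_employee = att_value / att_employees
google_per_employee = google_value / google_employees

print(f"AT&T:   ${att_per_employee:,.0f} of market value per employee")
print(f"Google: ${google_per_employee:,.0f} of market value per employee")
print(f"Ratio:  roughly {google_per_employee / att_per_employee:.0f}x")
```

In other words, the digital-era monopoly generates comparable value with less than a tenth of the workforce – the decoupling of output from employment in a single data point.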
Erik Brynjolfsson and Andrew McAfee, faculty members at the MIT Sloan School of Management, have studied these trends, documenting the “hollowing out of the middle class” as median income has fallen over the past 30 years. While demand for highly skilled workers remained strong in the U.S., automation, along with the shift of manufacturing jobs overseas, has devastated job prospects for blue-collar employees. In 1980, manufacturing made up 22 percent of American jobs, but by 2011 that share was down to 10 percent. Those workers who have managed to keep their jobs have had to deal with stagnant pay. The percentage of U.S. households earning a middle-class income fell from 56.5% in 1979 to 45.1% in 2012. Much has been made of growing income inequality in America, with the blame put on corporate and executive greed. But perhaps the greatest factors are the simple economic forces exerted by the digital revolution.
With the manufacturing and blue-collar sectors now making up a fraction of their former shares of the employment picture, the next phase of job depletion is likely to be equally painful for millions of unsuspecting workers in other occupations who don’t fully understand what is taking place. The full impact of the advances in robotics and artificial intelligence will devastate employment prospects for millions of skilled, college-educated individuals.
Consider the prospects of a young person considering law school and a job in the legal profession. At first glance, it would seem unlikely that this vocation would be affected by the digital revolution. Lawyers ponder complicated questions and deal with human foibles. They gather information from obscure sources and sometimes-reluctant witnesses. They argue persuasively before judges and juries. Robots need not apply, right? Actually, large parts of the daily work in a law firm are well suited to artificial intelligence. A company called Judicata is making the search for relevant legal cases vastly more efficient. Lawyers and clerks can quickly find the cases they are looking for without hunting for the proverbial needle in a haystack. Here is the company’s sales pitch: “Lawyers exercise skilled judgment. Their time and energy is best spent analyzing information, not gathering it. Our technology organizes and helps make sense of massive amounts of information, enabling better legal decision-making.”
Another company called fairdoc says it is “reinventing the way lawyers work,” proclaiming its benefits to promote “efficiency and value.” Software-created estate planning documents through fairdoc cost under $1,000, one-third to one-fifth what a lawyer would typically charge.
The impact of this tidal wave of the digital revolution on the legal profession has been tremendous. For the law school graduating classes of 2013 and 2014, only 57 percent and 60 percent of the graduates respectively had found jobs that required them to pass a bar exam. So it’s not surprising that law school enrollment is plummeting. According to the American Bar Association, total enrollment in J.D. programs fell 17.5 percent from 2010 to 2014, and the total first-year enrollment dropped 27.7 percent. In 2010, about 52,500 students began law school, and by 2014 that number had fallen to just under 38,000, a 27-year low. New figures recently released show the number continued to fall in 2015, with a total of 37,058 students starting law school.
The legal profession is just one example. In a 2013 study titled “The Future of Employment: How Susceptible are Jobs to Computerisation?”, Oxford University faculty members Carl Benedikt Frey and Michael A. Osborne found great employment risk for workers in a wide range of office and administrative support positions as well as the service industry. Many might be surprised to learn that the workers most likely to lose their jobs to the digital revolution included such occupations as insurance underwriters, tax preparers, brokerage clerks, loan officers, legal secretaries, accounting clerks, real estate brokers, cashiers, retail salespersons, employee benefits managers, paralegals and legal assistants. Perhaps it is time to re-think that business major.
Note that the Oxford study was completed in 2013. Remember the concept of ever-accelerating change? Then consider the news just released this month by Facebook about the advances in chatbot technologies in its Messenger platform. We are entering an age in which we will begin interacting via chat with companies and services through “conversational commerce,” artificial intelligence systems that mimic human employees. Taco Bell just introduced TacoBot, allowing customers to order and pay through a chatbot system. Who needs a salesperson? Why wait on hold for a customer service representative? The chatbots are more efficient and don’t demand vacations, health benefits or a $15/hour minimum wage. “We think you should be able to text message a business like you would a friend, and get a quick response,” said Facebook’s CEO Mark Zuckerberg. Say goodbye to millions of jobs.
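The economics here are visible even in a toy version. The sketch below is a deliberately naive, rule-based ordering bot – nothing like the machine-learning systems behind Messenger bots or TacoBot, whose internals are not public – and its menu and prices are invented for illustration. What it shows is the point that matters for employment: once written, the bot handles any number of customers at near-zero marginal cost.

```python
# A deliberately naive, rule-based ordering bot (illustration only).
# Real "conversational commerce" systems use natural-language AI;
# this keyword matcher just shows why a bot, once built, serves
# every additional customer at near-zero marginal cost.

MENU = {"taco": 1.49, "burrito": 2.99, "drink": 1.00}  # invented prices

def bot_reply(message: str) -> str:
    """Match menu keywords in the customer's message and quote a total."""
    items = [word for word in message.lower().split() if word in MENU]
    if not items:
        return "Sorry, I didn't catch that. We have: " + ", ".join(MENU)
    total = sum(MENU[item] for item in items)
    return f"Got it: {', '.join(items)}. Your total is ${total:.2f}."

print(bot_reply("I'd like a taco and a drink please"))
# → Got it: taco, drink. Your total is $2.49.
```

A human order-taker costs roughly the same per customer served; a bot's cost per customer falls toward zero as volume grows, which is exactly why the jobs it displaces don't come back.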
A young person’s strategy in this kind of environment might focus on pursuing a career in technology. If the bot revolution is upon us, why not learn how to code? Surely those occupations will be immune to these changes. Not so fast. Bart Selman, a faculty member at Cornell University, says it’s not hard to imagine software that does a better job than humans do at writing code. “A person complemented with an intelligent system can write maybe ten times as much code, maybe a hundred times as much code,” Selman says. “The problem then becomes you need a hundred times fewer human programmers.”
An InfoWorld article titled, “Researchers warn that a glut of code is coming that will depress wages and turn coders into Uber drivers,” warns of a developer bust that will follow the current developer boom. Citing a study released by the National Bureau of Economic Research, InfoWorld writes that “over time, more and more code is produced … Some of it enables smart machines to actually learn to perform new tasks or become so adept at their tasks that there’s no need to spend the money to make them even smarter.”
The comprehensive impact of the digital revolution on employment will reach into nearly every occupation. In a fascinating study, management consulting firm McKinsey & Company analyzed 2,000 distinct work activities. For example, a retail salesperson greets customers, answers their questions, demonstrates products and completes a sales transaction. The study found that 45 percent of work activities could be automated using technology that already exists. And when artificial intelligence that fully understands human language is perfected, an additional 13 percent of work activities could be performed by a machine. With further efficiencies afforded by machine assistance, fewer workers will be needed. Mortgage officers in banks can spend less time reviewing routine paperwork. Emergency room doctors can more quickly diagnose a medical problem. Lawyers can more efficiently prepare standard documents like wills and contracts. Surprisingly, McKinsey found that even executive jobs were prime targets for artificial intelligence systems. An estimated 20 percent of a CEO’s time could be automated by these technologies. Ironically, the jobs least likely to be affected by AI advances were low-wage positions such as landscapers, home health aides and maintenance workers.
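McKinsey’s method – scoring individual work activities rather than whole jobs – can be sketched with a toy data structure. The activities and the automatable/not-automatable flags below are invented for illustration; they are not McKinsey’s actual data:

```python
# Toy sketch of an activity-level automation analysis.
# A job is modeled as a list of (activity, automatable-with-current-tech)
# pairs; the flags here are illustrative guesses, not McKinsey's data.

retail_salesperson = [
    ("greet customers", False),               # interpersonal; hard to automate
    ("answer product questions", True),       # FAQ/chatbot territory
    ("demonstrate products", False),
    ("complete the sales transaction", True), # self-checkout
]

automatable = sum(1 for _, flag in retail_salesperson if flag)
share = automatable / len(retail_salesperson)
print(f"Automatable share of this job's activities: {share:.0%}")
# → Automatable share of this job's activities: 50%
```

The insight of this framing is that automation rarely eliminates a job outright; it eliminates a share of its activities, and employers then need fewer people to cover what remains.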
The McKinsey report authors Michael Chui, James Manyika and Mehdi Miremadi observed that “organizations and governments will need new ways of mitigating the human costs, including job losses and economic inequality, associated with the dislocation that takes place as companies separate activities that can be automated from the individuals who currently perform them.”
Therein lies the most important issue we must face. What are the social implications of a future in which fewer human workers are necessary and companies choose instead to invest in artificial intelligence and robotics in order to maximize profitability? How will an individual’s identity change when a primary occupation is no longer a defining feature? The changes are already well underway. A study by the software company Intuit predicts that by 2020, 40 percent of the U.S. workforce will be “contingent” workers – contract labor, temporary employees or the self-employed. These workers will struggle to piece together a steady living wage, healthcare benefits and retirement nest eggs. They will suffer the stresses of uncertainty and the constant threat of lost income. There will be none of the employment protections traditionally afforded to full-time workers – benefits won over a century of effort by the labor movement and by a business philosophy that rewarded the contributions of long-term, loyal employees who dedicated their lives to building strong companies and organizations. Fundamental changes like this surely have implications for the fabric of society.
Follow the line of consequences even further. Stagnant or falling wages will mean less discretionary spending, affecting the entire economy. Tax revenues will decline, straining government services and the social safety net, including the already threatened Social Security system. The ingredients being mixed in this brew have the potential to create financial crises around the world. There are plenty of examples of anarchy in countries with astronomical unemployment, no government resources to fund educational systems, and huge wealth disparity between the masses and the small number of people who hold power. The prospects of a dystopian future are not hard to imagine when the cement of productive vocations crumbles.
This is why humanity must begin to plan for the era that follows the digital revolution – a time we might call the “post-work revolution.” The transition to this model will be extremely challenging, forcing the capitalist systems that have raised the modern standard of living to make radical adjustments that seem to be at odds with the free enterprise philosophy of many nations.
The first change to consider is the granting by governments of a universal basic income, perhaps as compensation for providing service to society or attaining a specified education or skill level. Hand-in-hand with such a government income floor might be the creation of a modern Works Progress Administration or Civilian Conservation Corps, the programs that put people back to work during the Great Depression. A government-provided “solidarity system” currently works well in Denmark, which has one of the happiest populations in the world. The government provides free healthcare, education, childcare and a basic income guarantee for the unemployed. There is no denying that citizens in Denmark pay very high taxes, and the system certainly sounds like a socialist’s dream that could sap any entrepreneurial incentive from people who want to get ahead. But there are benefits to this approach that must be considered in the American economy, which has never fully recovered from the Great Recession of 2007–2009. Wealth disparity continues to grow and there is a serious threat to the standard of living for middle-class citizens.
The definition of work-life balance should also be reconsidered in an America where many people seemingly work around the clock, often not even using their annual paid vacation days. With digital efficiencies reducing the work burden, there should be serious consideration given to standardizing a 30-hour work week and mandating time off from work. The social benefits to families, especially young children; to our personal health and well-being; and to communities that are crying out for volunteers and civic engagement, would be enormous. Rather than use the advances of the digital age to expand corporate profitability, could our society agree that we should apply those advances to raising the standards of happiness and fulfillment? Should we not take this opportunity to increase our appetites for the arts, culture, personal enrichment and recreation? This is the perfect time to nourish the building blocks of strong communities by investing our efforts in civic projects, spending more time in churches, expanding parks and working to support our neighbors who need our support – the elderly, disabled or less fortunate.
As with every great threat, there is also a great opportunity. We can leverage the power of the digital revolution to advance the common good, raise the global standard of living, increase access to information and education for millions of people, and make the world a better place. The first step toward making that a reality is to avoid sleepwalking from one technological advance to the next and to understand the consequences of the changes that are underway. Governments must put aside the partisan fighting that paralyzes action and prevents the creative policies necessary to meet the challenges of the modern age. Major employers must recognize and confront what is taking place in their workplaces – increasing profits must not be their only motivation. Sclerotic universities weighed down by old-school faculty and administrators must push through academic inertia and create programs that prepare students for a future that will be very different from the past. And all of us must take personal responsibility for gaining the skills that will be most powerful in the decades ahead – the ability to adapt quickly, continuously learn new things, draw on knowledge and creativity gained from a broad-based liberal arts education, and create rewarding lives based on much more than our occupations.