
We are in the midst of a revolution occasioned by the disappearance of massive numbers of jobs. We are experiencing the end of work as we know it.

What do we mean by “the end of work”? It means technology, in the form of Artificial Intelligence (AI) and “robotization,” exerting a slow but continuous degradation on the value and availability of work, measured in wages and in the number of adults with full-time jobs. The widespread disappearance of jobs would usher in a social transformation unlike anything we’ve ever experienced or imagined. The issue won’t be saving jobs; it will be saving or recasting the concept of work, which has become a religion in its own right.

Some aspects of the future world of work are already present; the jobless future is already here. Futurist Jeremy Rifkin contends we are entering an entirely new phase in history, one characterized by a steady and inevitable decline of jobs. He says the world of work is being polarized into two forces: an information elite that controls the global economy, and a growing number of displaced workers.

Organizational structural changes have altered the nature of careers and jobs. Organizations have become “flatter,” with fewer management levels, as more work has become knowledge work. Project work and teamwork have also changed the nature of jobs.

Careers that once were viewed as progressions “up” a ladder are now often multidirectional and lateral. Robert DeFillippi and Michael Arthur define these changes as the creation of the “boundaryless career,” where the career path is defined by the individual’s soft and hard skills, not by their formal education or experience.

PwC’s report, “The Future of Work: A Journey to 2022,” a study of 10,000 people in China, India, Germany, the UK and the US who gave their views on the future of work, concluded that increasingly large corporations are turning into mini-countries that will take a more prominent role in social issues, the most important being environmental ones; that more sophisticated measures of human productivity will be developed, incorporating psychological and social components; and that the boundary between work and personal life will disappear.

Economic growth is increasingly driven not by human labor but by Artificial Intelligence (AI) and robots

In a 2013 paper titled “The Future of Employment: How Susceptible Are Jobs to Computerisation?”, Oxford University researchers C.B. Frey and M.A. Osborne created a model that calculates the probability of substituting a worker in a given sector. Frey and Osborne conclude machines may replace 47% of active workers in the future. Of 1,896 prominent scientists, analysts, and engineers questioned in a recent Pew survey on the future of jobs, 48% said the AI revolution will be a permanent job killer on a vast scale. The Bank of England has warned that within the coming decades as many as 80 million jobs in the U.S. could be replaced by robots.
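At heart, what Frey and Osborne built was a classifier: occupations are scored on a few “bottleneck” features (perception and manipulation, creative intelligence, social intelligence), a subset is hand-labelled as automatable or not, and the model outputs a probability of computerization for the rest. Here is a minimal sketch of that style of model; the feature scores and labels below are invented for illustration and are not the paper’s actual data:

```python
# Minimal sketch of a Frey-and-Osborne-style automation-probability model.
# All feature scores and labels are invented for illustration; the actual
# study trained a Gaussian process classifier on ~70 hand-labelled
# occupations described by O*NET variables.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

# Columns: routine-task intensity, creative-intelligence demand,
# social-intelligence demand (hypothetical scores in [0, 1]).
X_train = np.array([
    [0.9, 0.2, 0.1],   # telemarketer-like occupation
    [0.8, 0.3, 0.2],   # data-entry-like occupation
    [0.3, 0.9, 0.8],   # therapist-like occupation
    [0.2, 0.8, 0.9],   # teacher-like occupation
])
y_train = np.array([1, 1, 0, 0])  # 1 = hand-labelled automatable, 0 = not

model = GaussianProcessClassifier(random_state=0).fit(X_train, y_train)

# Estimated probability that a new, hypothetical occupation is automatable.
new_occupation = np.array([[0.7, 0.4, 0.3]])
print(model.predict_proba(new_occupation)[0, 1])
```

The point of the sketch is only the shape of the method: a small set of hand-labelled anchor occupations plus measurable job features yields an automation probability for every other occupation.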

We are already witnessing chronic unemployment or significant underemployment among adult men and youth. The share of men aged 25-54 who hold jobs, and their wages, have declined continuously through good times and bad since the 1970s. Real wages and employment opportunities for college and university graduates have also declined substantially since 2000. Only 68% of men between 30 and 45 with a high school diploma were working full time in 2013, according to a recent report by the Hamilton Project at the Brookings Institution, a Washington-based public-policy group.

Even the professions are not spared by the impact of this economic restructuring. The number of hours logged by first-year and mid-level legal associates fell 12 percent from 2007 at some of New York’s largest law firms, says Jeff Grossman, national managing director of Wells Fargo Private Bank’s Legal Specialty. Architecture graduates ages 25 to 29 had the highest unemployment of the 57 degree programs surveyed by the Education Department in 2009. What about the medical profession? CABG rates are continuing to fall, says cardiologist Jack Tu, co-author of the ICES report and team leader of the Canadian Cardiovascular Outcomes Research Team (CCORT). “Anecdotally, a lot of surgeons are concerned they don’t have the [procedure] volume to meet their targets [for] government funding [as a cardiac centre],” says Tu, a senior scientist at ICES and Canada Research Chair in Health Services Research. Volumes will definitely continue to fall, resulting, eventually, in a surplus of cardiac surgeons, says Tu. “We need to stop training so many. They’re not going to have a lot of work.”


Erik Brynjolfsson and Andrew McAfee, authors of The Second Machine Age, argue computer technology is evolving so rapidly that predicting its capabilities and applications a decade out is almost impossible. Remember, it was only 2007 when the first iPhone was released. Look at the capabilities of smartphones now.

Job creation is very different today than it has been in the past. The newest industries, mostly related to computer software, telecommunications and similar applications, are the most labor-efficient and don’t require many people. Economic historian Robert Skidelsky, author of Keynes: The Return of the Master, argues that “sooner or later, we will run out of jobs.” If Skidelsky is right, it raises the question of what our society will look like without universal work, or anything close to it.

Examples of the breakneck speed at which AI and digitization are advancing include:
  • Robots that can be surrogates for sexual partners.
  • A U.K. government robot named Amelia that will handle bureaucratic permit applications.
  • Robots that are capable of sensing and responding to human emotions.
  • Robotic pets for children.
  • Robots that replace middle managers in routinized environments.
  • An AI system that can read people’s thoughts by analyzing brain scans.
  • An AI system that can design and create music for people struggling with anxiety.
  • An AI system that can control the process of gene editing.
  • Robotic soldiers and human/robot cyborgs.

The application of AI, robotics and computer software is eliminating a wide variety of skilled jobs.

The replacement of human labor with AI and robotics has expanded from jobs that produce material products to a wide range of services, including professions such as law, accounting, health care and even psychological therapy. The recession of 2007–2009 may have sped up the destruction of many relatively well-paid jobs built on repetitive tasks that can be automated. These so-called routine jobs “fell off a cliff in the recession,” says Henry Siu, an economist at the University of British Columbia, “and there’s been no large rebound.” This type of work, which includes white-collar jobs in sales and administration as well as blue-collar jobs in assembly work and machine operation, makes up about 50 percent of employment in the United States.

Siu’s research also shows that the disappearance of these jobs has most harshly affected people in their 20s, many of whom seem to have simply stopped looking for work. Middle-income jobs are disappearing across a wide range of occupations. For example, the number of financial counselors and loan officers ages 25 to 34 has dropped 40 percent since 2007, outpacing the 30 percent drop in total jobs for the profession, according to the Bureau of Labor Statistics. In the investment business we are seeing the replacement of financial analysts with quantitative analytic systems, and of floor traders with trading algorithms. Mutual funds and traditional portfolio managers now compete against ETFs (exchange-traded funds), many of which offer completely automated strategies.
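To make “completely automated strategies” concrete, here is a deliberately simple sketch of one common rule, threshold rebalancing; the asset names, target weights, and drift tolerance are invented for illustration and bear no relation to any actual fund:

```python
# Hypothetical sketch of a threshold-rebalancing rule: when any holding
# drifts more than 5 percentage points from its target weight, compute
# the trades that restore the target allocation. Purely illustrative.
TARGET_WEIGHTS = {"STOCKS": 0.60, "BONDS": 0.40}  # assumed targets
DRIFT_THRESHOLD = 0.05                            # assumed tolerance

def rebalance_orders(holdings: dict) -> dict:
    """Return the dollar amount to buy (+) or sell (-) for each asset."""
    total = sum(holdings.values())
    orders = {}
    for asset, target in TARGET_WEIGHTS.items():
        current = holdings.get(asset, 0.0)
        if abs(current / total - target) > DRIFT_THRESHOLD:
            orders[asset] = target * total - current
    return orders

# Example: a stock rally has pushed the portfolio to 70/30.
print(rebalance_orders({"STOCKS": 70_000.0, "BONDS": 30_000.0}))
# {'STOCKS': -10000.0, 'BONDS': 10000.0}
```

A rule this mechanical requires no analyst and no floor trader, which is precisely why such strategies undercut those jobs.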

The expansion of contingent work for large numbers of people, and the continuing development of the “gig” economy

One in three U.S. workers—53 million people—are now “contingent,” already contending with the changed structure of work, perhaps juggling multiple jobs and serving as temporary, “gig,” or self-employed workers. An increasing number of corporations, government institutions and even colleges and universities have replaced full-time workers with part-time, contract or piecemeal workers, many without any security or benefits. During the recent recession, large numbers of Americans who lost their jobs scrambled to make a living. Simultaneously, Internet commerce was expanding, allowing even the most specialized consumer wants to be met with great efficiency and speed. This gave some enterprising individuals an unprecedented ability to capitalize on their own hands, minds, things, and hours.

Thus, says Jacob Morgan, author of The Future of Work: Attract New Talent, Build Better Leaders and Create a Competitive Organization, the gig economy was born: Americans were able to turn a craft skill into an Etsy side business, or a car into a job with Sidecar, Lyft, or Uber for a little extra money. The benefit-less contractor jobs that fill the gig economy offer low barriers to entry and flexible schedules. In the past decade cloud computing has radically altered the way we work. But it’s the growth of the “human cloud”—a vast global pool of freelancers who are available to work on demand from remote locations—that will really shake up the world of work. More and more employers (“requesters”) are inviting freelancers (“taskers”) to bid for each task. Two of the biggest Internet sites are Amazon’s Mechanical Turk, which lays claim to 500,000 “turkers” from 190 countries, and Upwork, which estimates that it has 10 million freelancers from 180 countries. They compete for more than 3 million tasks or projects each year, which can range from tagging photos to writing code.
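The platforms do not publish their matching internals, so the following is only a toy sketch of the basic mechanism described above: a requester posts a task with a budget, taskers bid, and one bid is selected. The data structures and the selection rule are invented for illustration, not either platform’s actual system:

```python
# Toy sketch of crowd-work bid selection. This does not reflect the real
# Mechanical Turk or Upwork systems; it only illustrates the mechanism of
# a requester choosing among competing freelancer bids.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Bid:
    tasker: str
    price: float    # quoted price for the task, in dollars
    rating: float   # tasker's reputation score, 0.0 to 5.0

def select_bid(bids: list, max_price: float) -> Optional[Bid]:
    """Pick the highest-rated affordable bid, breaking ties on lower price."""
    affordable = [b for b in bids if b.price <= max_price]
    if not affordable:
        return None
    return max(affordable, key=lambda b: (b.rating, -b.price))

bids = [Bid("alice", 12.0, 4.8), Bid("bob", 9.0, 4.8), Bid("carol", 20.0, 5.0)]
print(select_bid(bids, max_price=15.0))
# Bid(tasker='bob', price=9.0, rating=4.8); carol's bid exceeds the budget
```

Even in this toy form, the downward pressure on pay is visible: among equally rated bidders, the cheaper one wins.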

Management consultants McKinsey & Co. estimate that by 2025 some 540 million workers will have used one of these platforms to find work. The benefits to companies are obvious: instant access to a pool of cheap, willing talent without a lengthy recruitment process or costly benefits. For the taskers, the benefits are not so clear. Champions of crowdsourcing argue that it provides a powerful force for the redistribution of wealth by sending a fresh stream of income into the economy. On balance, though, it’s more likely to increase income inequality and depress wages. The big challenge for governments will be how to codify this kind of work and set ethical and legal standards for it, to prevent abuses by employers.

What Happens to Education?

The disappearance of work for many people will have a dramatic impact on the nature of post-secondary education. For some time now, the purpose of, and offerings in, colleges and universities in North America have become increasingly skewed toward preparation for jobs. If that purpose becomes questionable or even redundant, post-secondary institutions will either shrink or repurpose themselves around classical views of education, emphasizing the enlightenment of democratic citizens. According to Bethany Moreton of Dartmouth College, the 10 fastest-growing job categories require less than a college degree. Over 40% of college graduates are now working in low-wage jobs.

In policy debates, technology is presented as an uncontrollable force for which societies and workers must prepare. While education can allow individuals to improve their well-being by moving to a more lucrative profession, the overwhelming majority of jobs—in developed and developing countries alike—will not improve through more education. Of the current top 10 occupations in the U.S., only one, registered nurse, is highly skilled. Retail salespersons and cashiers, fast-food workers, general office clerks, customer service representatives, waiters and waitresses, laborers, and janitors are the other top occupations, accounting for more than one of every five jobs in the U.S. in 2014, and they are not predicted to disappear anytime soon. Average annual earnings in most of these jobs are just under $20,000. More education may help a fast-food worker leave the sector, but it will do little for the person remaining in that job. In Denmark and France, where retail and fast-food workers are protected by collective agreements, these jobs provide living wages and other social benefits, including paid annual and sick leave.

John Seely Brown, former chief scientist at Xerox, argues that in this era of accelerating change, when the half-life of many skills can be as short as five years, corporate training centers no longer work, and returning to community college every five years is not viable. He argues we must re-invent the workplace as a “learningscape.” He says we can create “Cities of Learning”—a new movement in which employers, libraries, and museums are wired together to help kids find their interests outside school and pick up new skills—or networks of partners in the corporate world.

A powerful example of this kind of learning is the use of GitHub. Another is being developed by a conservative company, SAP, which has created an extended open-source network with a couple of million participants learning with and from each other. Yet another is Hacker Dojo, a community in Silicon Valley where people share digital technical skills, or the guild networks around massively multiplayer online games, where thousands of new ideas are created, shared, and tested each night. And the rapid development of MOOCs and other sources of free education and training through the Internet may make brick-and-mortar educational institutions obsolete.

The traditional answer has been to invest in developing skills that machines can’t replicate—creativity, problem solving, ingenuity, and other higher-order functions. Interestingly, embracing these skills means taking a step back from the idea of the human being that emerged during the Industrial Revolution—a cog in a machine, interchangeable and reproducible—toward the older Renaissance humanism, more prone to seeing people as possessed of unique gifts to create and innovate.

The problem is that public education in the U.S. and much of the world is, in many ways, a by-product of the Industrial Revolution. Education came to be standardized just like production, with students lined up in neat rows of desks and taught a uniform curriculum. An emphasis on memorization and rote learning helped produce a uniform citizenry— literate, compliant, interchangeable—to fill standardized roles in industry, offices, and government.

None of that cuts it in an age when intelligent machines can do anything rote or repetitive far better than we can. Cultivating some of our last uniquely human abilities—namely creativity and social intelligence—requires reimagining education as a means not of reproducing uniformity but of nurturing exceptionalism: in other words, the ability to do things that can’t be codified or systematized.

The Disappearance of Jobs, Income Inequality and the Consumer Economy

Martin Ford, in his book Rise of the Robots: Technology and the Threat of a Jobless Future, asks: “What happens to the consumer economy when you take away the consumers who are not working?” And what will happen to much of the infrastructure that supports the world of work as we know it, from suburban communities built around a commuting workforce to the endless rows of office buildings? It also means, says Richard Freeman, a leading labor economist at Harvard University, that far more people need to “own the robots,” including other kinds of automation and digital technologies generally. Some mechanisms already exist in profit-sharing programs and employee stock-ownership plans. Other practical investment programs can be envisioned, he says.

Whoever owns the capital will benefit as robots and AI inevitably replace many jobs. If the rewards of new technologies go largely to the very richest, as has been the trend in recent decades, then dystopian visions could become reality. But the machines are tools, and if their ownership is more widely shared, the majority of people could use them to boost their productivity and increase both their earnings and their leisure. If that happens, an increasingly wealthy society could restore the middle-class dream that has long driven technological ambition and economic growth.

The concept of a “living income” also allows us to keep the wheels of the economy and innovation turning. “A fundamental insight of economics is that an entrepreneur will only supply goods or services if there is a demand, and those who demand the good can pay,” writes Andrew Rens, an expert at the Center for Internet and Society.

Progress depends, in no small way, on people buying stuff, and that depends on them having an income. A living income isn’t completely without precedent. In the 1970s, a five-year basic income program in the Canadian province of Manitoba called Mincome showed promising results. Parents spent more time raising children. Students showed higher test scores and lower dropout rates. Hospital visits, mental illness, car accidents, and domestic abuse cases all declined. And in the end, total working hours only slipped by a few percentage points. In other words, having a basic income didn’t lead to sloth or indolence. It let people spend time on the things that mattered: family, education, health, personal fulfillment.

Whatever the reasons for the disconnect between productivity and wage growth, it’s a problem for everyone, not just workers. Rich people like their money, but who wants to live in a world where the haves hide in cloistered communities defended by private armies, while starving have-nots work for peanuts, if at all? To date, we have chosen to distribute society’s resources largely based on our ability and willingness to work. We appear to be rapidly evolving toward a world where assets, not labor, are the primary driver of prosperity. So the question is: How can we equitably distribute benefits in an asset-based economy?

Jerry Kaplan, author of Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence, agrees that the disconnect between increased productivity and stagnant wage growth is a serious problem for the economy, a trend that feeds income inequality. Do we want a society like the one portrayed in the movie Soylent Green, divided between a cloistered rich and a desperate, starving poor?

Will Government’s Role Become More Substantial?

New technologies created by AI and robotics will be fed with capital, so it can be assumed that under the current free-market capitalist system the profits from these industries will accrue to the same wealthy individuals and corporations, and not find their way into the hands of the rest of the population. What then becomes the role of government, one purpose of which is to ensure the well-being of all its citizens? One proposal is for government to provide a guaranteed income to all adults (who are also consumers).

The concept of a universal income without universal work will be terrifying to many political conservatives. A modified solution could be for government to pay people to do something, including getting an education, rather than nothing, which again raises the fear of socialism in America. In both the U.S. and Canada, recent debate has raged about the minimum wage. But while that debate focused on whether workers should be paid a minimum wage, Finland experimented with giving every citizen, regardless of income or employment status, around $850 a month. The idea behind the plan, called “basic income,” was that it would replace all other welfare and government benefits. Unfortunately, Finland abandoned the plan.

Governments will also have to deal with the reality of the end of retirement. Forget quitting at age 65. As life expectancy lengthens, people will be expected to work longer. Governments are already struggling to afford pensions for longer-living populations, and people find it harder to make their retirement savings stretch. It’s likely that retirement will first become gradual, and then that the retirement age will be extended to 67 or 70 within 20 years. This also presents challenges for employers in ensuring older workers remain productive. And the myth that older workers are less capable or productive is not borne out by research.

How Then Do We Define Work and Its Value?

Prior to the 20th century, the English term “job” connoted fragmented, low-quality piece-work. But over time we elevated some of this work to the status of “real jobs” and rewarded the minority who performed it as job-holders.

Peter Frase, author of Four Futures: Life After Capitalism, describes how automation will change North America, based on his argument that human labor will end, along with our belief in and commitment to “work for work’s sake.” Many experts would argue that for some time now, jobs have not been motivating or rewarding for most people, as evidenced by studies showing that as many as 70% of workers are not engaged in their jobs. The modern quest for meaningful work underpins a paradox: we are both disengaged from our jobs and terrified of losing them.

For decades, psychologists and other experts have shown that the intrinsic factors that motivate people—purpose, meaning, creativity, fulfillment and autonomy—are largely absent from the typical job today. Several studies have shown that North Americans place a higher value on work and work more hours than Europeans, and feel guilty when they are not productive. This emphasis will exacerbate the problem of the disappearance of jobs from the lives of many. Will the vacuum be filled, as has often been forecast, by leisure time? One possibility would be the development of creative communities such as “maker spaces” or artisans’ industrial shops in small communities.

One theory of work proposes that people see themselves in jobs, careers or callings. People who say their work is “just a job” emphasize that they are working for money rather than aligning themselves with any higher purpose. People who pursue a calling don’t do it for status or pay but for the intrinsic fulfillment of the work itself.

There was a time when work and home were distinct realms. The old industrial clock regulated our lives into discrete blocks of time, with a separation between public and private life. No longer. The constant connectivity of mobile, digital technologies erases the boundaries between week and weekend, and their characteristic social relations. How will we maintain the line between “my time” and “employer’s time”?

In his Harvard Business Review article, “Create a Meaningful Life Through Meaningful Work,” author Umair Haque writes, “Maybe the real depression we’ve got to contend with isn’t merely one of how much economic output we’re generating – but what we’re putting out there and why. Call it a depression of human potential, a tale of human insignificance being willfully squandered.”

Recent research at McKinsey concludes that providing meaningful work to employees is the most important contributing factor to a high level of engagement. In her book The Progress Principle, author Teresa Amabile reports that of all the events that can deeply engage people in their work, the single most important factor was meaningful work. According to Amabile, “Beyond affecting the well-being of employees, research shows that the ‘inner work life’ affects the bottom line.”

A Dystopian or Utopian Future?

Concepts of utopia and dystopia represent imaginary societies in which people live either in a perfect environment, governed by laws that provide happiness to everyone, or in an oppressive society ruled by a repressive, controlling state. The origin of these concepts can be traced to around 380 BC, when the Greek philosopher Plato wrote his influential political dialogue, the Republic. In it, he first postulated the main themes of a utopian society and his vision of a perfect Greek city-state that provided a stable life for all of its citizens.

The modern word “utopia” came to life in the early 16th century, in the work of the famous English philosopher Thomas More. His description of a utopian society gave birth to an enormous wave of utopian thought that influenced the lives and works of many later philosophers and novelists, and helped in the creation of several important political movements (most notably socialism).

The utopias envisioned by these authors can most easily be divided into several distinct categories based on the means of their creation: ecological, economic, political, religious, feminist, and scientific or technological utopias. The 19th century gave birth to the largest wave of utopian thought the world has ever seen. Numerous novelists and philosophers devoted their careers to exploring those themes, and the results of their work influenced audiences across the entire world. The most notable utopian novel of that period was, without a doubt, Edward Bellamy’s Looking Backward.

Not all examples of utopian life remained theoretical. Some people tried to realize the dreams postulated in the works of these philosophers, and so the age of utopian societies came to life. During the 19th century, over a dozen utopian communities were established in the United States, and a few of them have managed to survive to this day.

The end of the 19th century brought the rise of dystopian thought. Numerous philosophers and authors imagined dark visions of a future in which totalitarian rulers governed the lives of ordinary citizens. Their works explored many themes of dystopian societies: repressive social control systems, government coercion of citizens, the influence of technology on the human mind, coping mechanisms, individuality, freedom of life and speech, censorship, sexual repression, class distinctions, artificial life, and human interaction with nature (often including the consequences of its destruction).

Some of the earliest and most influential works of dystopian fiction can be attributed to H.G. Wells (The Time Machine), Aldous Huxley (Brave New World) and George Orwell (Nineteen Eighty-Four). Their works paved the way for numerous other authors, who even today manage to envision new aspects of life in dystopian societies. In addition to literature, dystopian themes have found life in many other media, such as comic books (most notably V for Vendetta, Transmetropolitan, Y: The Last Man and Akira), music, video games (Fallout, Deus Ex and BioShock), television series (The Prisoner, Dark Angel, Doctor Who and The Twilight Zone) and movies (Metropolis, Blade Runner, A Clockwork Orange and The Matrix).

It is difficult to predict with accuracy how technology will shape our future, to what extent it will be used in favor of the citizen and the public good. What has become clear is that it has fallen upon society to assume responsibility for the way technology is used—including to protect individual identity and privacy from governments and corporations.

Technology is not the solution to hunger, war and poverty, but merely a tool. Society can no longer meekly adopt it without thinking of the repercussions of particular advances. Rather, we must actively ensure that it enhances our quality of life the way we had hoped it would. If not, technology will keep advancing but society will lag behind.

Read my latest book: Eye of the Storm: How Mindful Leaders Can Transform Chaotic Workplaces, available in paperback and Kindle on Amazon and Barnes & Noble in the U.S., Canada, Europe and Australia.