All industrial revolutions are characterised by changes in the nature of work. This is not just a change in the types of jobs people do to earn a living – from farm labourer to factory worker to computer software engineer – but also a change in the role of work in society, or what Adam Smith called the division of labour. Over time, waged work has become an increasingly important organising principle for individuals, families and the wider society, not only as the dominant source of individual economic welfare but also as a source of self-identity, self-fulfilment and social advancement. Within society it has also served to create an opportunity bargain, aimed at bringing together fairness and productive efficiency through educational opportunity and social mobility. If we are at the dawn of another industrial revolution, it is likely to have wide-ranging implications for the way we live our lives, make a living and create the social foundations for a fair and productive economy.
Today, there are vastly different scenarios of the impact of digital technologies on jobs: some claim that close to half of American jobs will disappear in the coming decade or so, while others believe that the impact of new technologies on existing jobs will be far more significant than jobs lost through automation. These different futures of work underline the need for a better understanding of what is happening to jobs, especially if we want to improve the quality of working life and at the same time create a more inclusive society.
Talking about the futures of work in the plural also serves as a reminder that technology is not destiny. Digital technologies, including artificial intelligence, robotics and cloud computing, determine what is technologically possible, but they do not come with a blueprint for how they should be used in job redesign, HRM or business innovation. Yet the decisions being taken today will shape tomorrow’s future of work in ways perhaps as profound as any in human history. Here’s why.
Evidence from previous industrial revolutions, especially over the last hundred years of economic history, has led many commentators to assume that the more complex the technology, the greater the level of skills required to handle such complexity. This has been described in various ways, including the idea of a race between education and technology, in which the task of our schools, colleges and universities is to keep pace with the rising demand for skilled knowledge workers. It is equally assumed that today’s industrial revolution will follow past experience and create more and better jobs than it destroys, even if there is a painful period of adjustment for those confronting technological extinction.
But what if we are at a technological tipping point where humans are being ‘cognitively challenged’ by advances in AI? This includes a challenge to the mental actions and processes involved in doing professional, managerial and technical jobs, rather than simply substituting for routine tasks such as processing payments or booking hotel rooms. In reality, it’s less a question of whether AI can replace a doctor, lawyer or college lecturer doing exactly the same things, and more one of how AI is being used to redesign the activities of the doctor, lawyer or college lecturer, such as the scope left for professional judgement or discretion in deciding the best way of getting the job done. Equally, while some of those defined as doctors, lawyers or college lecturers may continue to enjoy high levels of autonomy and be encouraged to explore the frontiers of professional knowledge, many others find themselves performing more routine (even if highly specialised) tasks in more insecure professional employment, where job titles hide the changing realities of professional work.
Technological revolutions associated with a long-term shift to more high-skilled, high-waged work may be coming to an end. How we address the cognitive challenge of AI, given its potential to de-skill and degrade the very jobs that define social mobility linked to a college or university education, is key to understanding the future of work. The cognitive challenge, therefore, is not only one of how far we can push technologies towards artificial general intelligence, but also one of how these technologies can be exploited to increase productivity as part of achieving a better future of work for all rather than a few, even if that future is going to look significantly different from the model we’ve inherited from the last century.