When I joined the audience at Gresham College for Professor Daniel Susskind’s inaugural lecture on automation anxiety, I expected the familiar line: AI is coming for our jobs. That’s not what I heard. Instead, Susskind, Professor of Business at Gresham, made a surprisingly nuanced argument, explaining that the real story was less about the number of job losses and more about how the very nature of work is shifting. This piece reports on that event (available to watch below) and reflects on what his ideas mean for workplace leaders – and, from my vantage point as a communications professional working with organisations through workplace and digital transformation, how we talk to and engage our colleagues through the change.
We hear the phrase “unprecedented times” a lot these days. But we’ve been here before, haven’t we? Every great leap in technology produces “automation anxiety” – the fear that machines will take all the jobs. This anxiety is understandable and cyclical. Sometimes justified, sometimes not. The Industrial Revolution didn’t end work, and neither will AI, at least not on its own. Unemployment didn’t explode then, and today’s labour markets remain tight, despite what the headlines suggest. Real action, Susskind said, lies in recognising that work will always change and, more importantly, doing something about it.
People may or may not worry about their jobs disappearing – that depends on their disposition – but they should probably spend time considering what their jobs may become if AI development continues at this pace, whether that’s the productivity expected and the surveillance demanded, or the expertise required (or removed) and the dignity possibly lost.
History offers a sober warning. While productivity jumped during the Industrial Revolution, wages lagged for decades – what economic historians call Engels’ Pause, a period when the economy grew but pay stalled. It’s a reminder that productivity gains don’t automatically flow to workers.
Writers and thinkers of the time chronicled the human cost of innovation and revolution. William Blake’s references to the “dark satanic mills” in his poetry; Karl Marx’s peppered mentions of “misery”, “agony”, “slavery” and “mental degradation” in Das Kapital; John Ruskin’s laments about “poisoned air” and “diminished life” – all point to degraded quality of work and life amid technological progress.
Power and practice
Today’s equivalents include direct and indirect forms of labour exploitation. At one end are the “ghost workers” in countries such as Kenya, the Philippines and India, paid a pittance to review and censor traumatising material on behalf of tech giants like Meta. At the other end are the millions of everyday consumers pressed into service through so-called CAPTCHA tests – picking out motorbikes, traffic lights or zebra crossings – an exercise dressed up as security but in reality free labour that helps to train machine-learning models.
Then there’s the rise of algorithmic scheduling, where computer systems automatically assign shifts and set working hours. For retail staff, warehouse operatives or gig workers, that can mean being told when to work with very little notice, making it almost impossible to plan family life or other commitments. It can mean loss of income and an increase in inequality.
A gig worker named Armin Samii, for example, built a browser extension called UberCheats after suspecting he’d been underpaid by Uber Eats. He once did a multi-mile delivery only to be paid for a single mile. Frustrated by repeated mileage miscalculations, he created the tool to automatically cross-check the distances recorded by Uber against Google Maps. If a significant discrepancy cropped up, the extension alerted the driver and offered links for filing a claim. Early data showed about 21% of trips were underpaid. UberCheats remains an act of self-help for this particular corner of the gig economy, and it goes some way to demonstrating how algorithmic management affects workers when left uncontested.
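To make the mechanics concrete, the core of such a tool is a simple comparison: take the platform’s recorded distance, compare it with an independent estimate from a mapping service, and flag the trip if the shortfall exceeds a tolerance. The sketch below is a hypothetical illustration of that check, not UberCheats’ actual code; the function name, tolerance and figures are my own assumptions.

```python
def flag_underpayment(recorded_miles, expected_miles, tolerance=0.10):
    """Flag a trip when the platform-recorded distance falls short of an
    independent estimate (e.g. from a mapping service) by more than
    `tolerance` (a fraction of the expected distance)."""
    if expected_miles <= 0:
        raise ValueError("expected_miles must be positive")
    shortfall = (expected_miles - recorded_miles) / expected_miles
    return shortfall > tolerance

# Illustrative figures: the platform records 1.0 mile for a trip an
# independent map puts at 4.2 miles -- a clear discrepancy.
print(flag_underpayment(1.0, 4.2))   # flagged -> True
print(flag_underpayment(4.1, 4.2))   # within tolerance -> False
```

In a real extension, the “expected” figure would come from a routing API and a flagged trip would trigger the alert and claim links described above.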
And it’s not just affecting gig workers. There’s already a growing quantification of workers, where every keystroke or task is tracked and recorded, where individual and collective output is scored. So, while new jobs will emerge after the initial upheaval, many of these may be insecure, invisible or potentially inhumane unless we consciously design them better.
For most of the late 20th century, more education meant higher returns. But as Susskind noted, earlier waves of mechanisation also deskilled some of the more ‘prestigious’ roles, making tasks once reserved for craftsmen doable by less-skilled workers using machines. We may be entering a similar phase for white-collar work. Generative AI can draft, summarise, code and synthesise, acting as both an aid and a leveller. But it also has the potential to erode the status (and bargaining power) of those whose edge was doing all of that the old-fashioned (and slower) way. If we outsource too much thinking, too much doing, it could result in thinner progression pathways and a slow hollowing-out of human expertise, Susskind warned.
What good leadership looks like now
I’ve cautioned before that over-automation can dull critical thinking and professional judgement. When teams lose touch with the underlying process, they become less able to spot errors, probe assumptions or challenge outputs. Recent research by MIT found that 83% of ChatGPT users couldn’t even recall what they wrote minutes earlier. My suggestion is to prioritise active over passive engagement in and out of work. Communications plays a key role here.
As a workplace communications specialist, my counsel to clients is: share the journey as well as the destination. Employees can handle the truth if we treat them as collaborators rather than cogs in a machine. That starts with explaining the “why” – the problems AI is meant to solve for customers and colleagues, plus the areas where it should not and will not be used. This means co-creating guardrails, bringing employee representatives and domain experts into conversations about privacy, fairness and escalation, so the rules and policies feel natural rather than imposed.
Upskilling should follow the same principle. It’s not just about teaching prompt tricks but building a shared understanding of AI’s strengths and limitations.
Alongside that, organisations must design roles to focus as much on quality as quantity, as much on input as output. If you’re going to measure productivity, or rather perceptions of productivity (that’s the closest you can get in knowledge work), you should also measure workload, wellbeing, learning opportunities, employee sentiment and progression. And through it all, human expertise must remain front and centre. Keep people in the loop for judgement calls and reward those who challenge when something doesn’t feel right.
Managers, in particular, need tailored support before, during and after any transformation. They are the ones on the front line of change, so equipping them with AI literacy, ethical risk awareness, and basic change communication skills is essential.
Gresham’s message is timely. The headline question – “Will AI take our jobs?” – plays directly to our anxieties. Beyond upskilling our teams, there’s only so much influence we have over that outcome. What we can shape is the kind of work we are creating, who truly benefits, and how we ensure it remains fair, purposeful and human.
September 16, 2025
AI, automation anxiety and the future of work: lessons from Daniel Susskind
by Jo Sutherland • AI, Comment, Workplace
Jo Sutherland is Managing Director of Magenta Associates.