Emotion tracking tech in the workplace puts people’s wellbeing at risk

A new report from the Institute for the Future of Work (IFOW) explores the increasing use of affective computing in the workplace. Affective computing is a branch of artificial intelligence that focuses on recognising and responding to human emotions through technologies such as biometric sensors, emotion-tracking software, and wearable devices. Once primarily used in consumer products, these systems are now finding applications in the workplace, often marketed as tools to enhance safety, productivity, and employee wellbeing. The use of AI-powered technologies that monitor and interpret employees’ emotions and behaviours is known as Algorithmic Affect Management (AAM). According to the report, AAM is rapidly transforming the landscape of employment, raising significant questions about privacy, ethics, and the future of work.

The authors of the report, Professor Phoebe Moore and Dr Gwendolin Barnard, draw on research, interviews, and surveys to warn of the potential risks of deploying these systems, while highlighting opportunities for positive outcomes if they are used responsibly. As affective computing becomes more prevalent, the report calls for robust regulation to safeguard workers’ rights and wellbeing.

AAM technology is increasingly widespread: it monitors people’s physiological and emotional states and feeds the resulting data into algorithmic management systems, which then inform decisions about task allocation, performance evaluation, and even hiring or firing.

The IFOW report highlights a range of AAM workplace technologies, including EEG devices that measure cognitive load, video systems equipped with emotion-detection AI, and wearable gadgets that track stress, fatigue, and attention levels. While the adoption of these tools promises to optimise workplace efficiency, it also ushers in an era of unprecedented surveillance and control over workers.
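As a purely illustrative sketch (not taken from the report), the pattern described above can be reduced to its essentials: a noisy biometric signal is converted into a single score, and that score gates a management action. All names, formulas, and thresholds below are invented for this example; the point is only how much weight an error-prone inference can carry.

```python
# Hypothetical illustration only: a toy "stress score" derived from
# heart-rate variability, feeding a task-allocation decision of the
# kind the report describes. Thresholds and names are invented.
from statistics import pstdev

def stress_score(rr_intervals_ms: list[float]) -> float:
    """Crude proxy: lower heart-rate variability -> higher 'stress'.

    Real affective-computing systems are far more complex, and the
    report stresses that such inferences are error-prone and biased.
    """
    variability = pstdev(rr_intervals_ms)
    return max(0.0, 1.0 - variability / 100.0)  # arbitrary scaling

def allocate_task(score: float, threshold: float = 0.7) -> str:
    # The decision step: one noisy number gates a management action.
    return "reassign to low-intensity task" if score >= threshold else "no change"

calm = [820, 910, 780, 950, 860]      # high variability -> low score
strained = [800, 805, 798, 802, 801]  # low variability -> high score

print(allocate_task(stress_score(calm)))      # -> no change
print(allocate_task(stress_score(strained)))  # -> reassign to low-intensity task
```

Even in this toy version, the concerns the report raises are visible: the score is an indirect proxy, the threshold is arbitrary, and the worker has no visibility into either.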

The report incorporates findings from two surveys conducted with 380 employees who have experienced AAM technologies in their workplaces. Key insights include:

  1. Limited Perceived Benefits:
    Fewer than 10% of respondents believed AAM systems positively impacted their health, safety, or wellbeing. Around 45% actively disagreed, reporting increased stress and a lack of supportive work environments.
  2. Technostress and Increased Workload:
    Many workers reported that AAM systems led to greater pressure to work faster, meet tighter deadlines, and adapt their behaviours to suit the demands of the technology.
  3. Privacy and Autonomy Concerns:
    Workers expressed significant discomfort with the invasive nature of these systems, which often operate without sufficient transparency or consultation.
  4. Bias and Inequality:
    AAM technologies risk reinforcing existing biases. For example, facial recognition systems have been shown to misinterpret emotions based on racial, cultural, or gendered stereotypes.
  5. Lack of Worker Consultation:
    The introduction of AAM tools often bypasses meaningful engagement with employees, leaving them ill-informed about how the systems work or how their data is used.

The IFOW report acknowledges that AAM technologies, when responsibly deployed, can offer tangible benefits. For example, fatigue monitoring tools can prevent accidents in high-risk industries, and emotional analytics can help employers design better work environments.

However, these potential benefits are counterbalanced by significant risks:

  1. Mission Creep:
    Data collected for one purpose may be repurposed without workers’ consent, raising concerns about surveillance overreach.
  2. Bias and Misinterpretation:
    Affective computing systems are prone to errors, such as misidentifying emotions or applying cultural biases. These inaccuracies can have severe consequences when used for critical decisions like hiring or performance evaluation.
  3. Loss of Autonomy:
    The use of AAM tools can reduce employees’ sense of control over their work, particularly when the technology is used to enforce stricter management practices.
  4. Ethical Concerns:
    The commodification of workers’ emotions and behaviours poses profound ethical questions about the boundaries between professional and private life.

The IFOW emphasises the urgent need for regulatory frameworks to govern the use of AAM technologies in the workplace. Recommendations include:

  1. Stronger Legal Protections:
    Existing laws around employment, privacy, and equality should be extended to cover AAM. This includes introducing neuro-rights to protect against excessive surveillance of cognitive and emotional functions.
  2. Transparency and Accountability:
    Employers must provide clear information about what data is collected, how it is used, and what decisions it influences. Workers should have access to this information and the ability to challenge decisions made by AAM systems.
  3. Worker Consultation:
    The introduction of AAM tools should involve meaningful engagement with employees and their representatives, ensuring that systems are designed and implemented with their input.
  4. Impact Assessments:
    Companies should conduct rigorous assessments to evaluate the risks and benefits of AAM technologies before deployment, with ongoing monitoring to address unforeseen impacts.
  5. AAM Literacy Programmes:
    To foster trust and understanding, workers, unions, and managers should receive training on how AAM technologies work and their implications.

The IFOW report highlights the dual potential of AAM to either enhance worker wellbeing or exacerbate existing inequalities and stress. The report argues that policymakers have a critical role to play in shaping this future. By establishing robust legal frameworks, promoting transparency, and encouraging ethical practices, governments can ensure that technology serves workers rather than exploiting them.

The report concludes with a call for a more integrated and proactive approach to governance, aligning with international efforts such as the UNESCO Recommendation on the Ethics of Neurotechnology.