June 19, 2024
Secret AI cameras have been tracking the emotions and demographics of rail passengers
Many thousands of unsuspecting train passengers in the UK have had their emotions and demographics recorded by hidden AI camera systems at major stations, a new report reveals. The revelation, which raises serious privacy concerns, follows a freedom of information request by Big Brother Watch. For more than two years, Network Rail, the public body that manages Britain’s railway infrastructure, ran a covert trial programme at key stations including London’s Waterloo and Euston, Manchester Piccadilly, and others across the country.
Cameras strategically placed at ticket barriers captured passenger faces and fed the images into Amazon’s Rekognition service, a cloud-based AI tool that analyses images to infer emotions such as happiness, sadness, and anger. The system also estimated each passenger’s age and gender.
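For illustration, the sketch below shows how a single camera frame could be sent to Rekognition’s DetectFaces API using Amazon’s official boto3 SDK. The file name, AWS region, and capture pipeline are hypothetical assumptions for the example; Network Rail has not published its actual integration. Only the Rekognition call and response fields shown are part of the real API.

```python
# Minimal sketch, assuming a captured frame saved to disk; the file name
# and region are hypothetical, not from Network Rail's deployment.
import boto3

rekognition = boto3.client("rekognition", region_name="eu-west-2")

with open("ticket_barrier_frame.jpg", "rb") as f:  # hypothetical capture
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition to return the full set of face
# attributes, including estimated emotions, age range, and gender.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    # Emotions are returned as a list of types with confidence scores;
    # the highest-confidence entry is the model's best guess.
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    age = face["AgeRange"]          # e.g. {"Low": 25, "High": 35}
    gender = face["Gender"]["Value"]
    print(f"emotion={top_emotion['Type']} "
          f"age={age['Low']}-{age['High']} gender={gender}")
```

As the example suggests, these attributes are statistical estimates with confidence scores, not ground truth, which is central to the accuracy criticisms raised below.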
Network Rail defended the programme, claiming it aimed to improve customer satisfaction and potentially generate additional advertising revenue by understanding passenger demographics and emotional states. However, privacy advocates have strongly condemned the lack of transparency surrounding the trial and cast doubt on the accuracy and usefulness of emotion detection technology, particularly in such a dynamic environment.
Big Brother Watch has expressed its deep concern. “It is alarming that a public body like Network Rail would roll out a large-scale trial of Amazon-made AI surveillance without informing the public,” stated Jake Hurfurt, the group’s head of research and investigations. “Mixing safety tech with tools of questionable scientific merit and suggesting the data could be used for advertising is a worrying development.” Big Brother Watch has since filed a complaint with the Information Commissioner’s Office (ICO), the UK’s data privacy watchdog.
Experts in technology and law have also voiced concerns. Professor Lilian Edwards of Newcastle University called the emotion recognition aspect of the programme “unethical and probably illegal,” highlighting the technology’s unproven nature and lack of transparency. Professor Sandra Wachter of Oxford University added to the criticism, emphasising the technology’s potential for bias based on gender and ethnicity, as well as its inherent unreliability. “AI cannot accurately read emotions,” she asserted. “Surveillance of this nature not only violates privacy but also normalizes the assumption that everyone is a potential criminal, which is simply not true.”
Network Rail has defended its actions by emphasising its commitment to security, stating that it uses a range of advanced technologies to safeguard passengers, staff, and railway infrastructure from crime and other threats. The company also claims it adheres to data protection laws and collaborates with law enforcement agencies to ensure its methods are proportionate and legal.