London's King's Cross uses facial recognition to track visitors

King's Cross in London is using facial recognition technology to surveil the tens of thousands of visitors who frequent the site each day, and Canary Wharf is "considering" following suit.

According to a spokesperson for the privately owned area around King's Cross station, the 67-acre site in central London is using the technology "in the interest of public safety".

Canary Wharf Group is also looking into installing the technology, in a move likely to raise questions about the surveillance of privatised public space.

"Sophisticated systems" to protect privacy

Argent, the property developer for the King's Cross estate, claimed in a statement that the technology is used "to ensure everyone who visits King's Cross has the best possible experience".

"We use cameras around the site, as do many other developments and shopping centres, as well as transport nodes, sports clubs and other areas where large numbers of people gather," read the statement.

"These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public."

Argent has declined to answer questions about the nature of these "sophisticated systems", including how many are in place, how the data is being used, and the name of the company that supplies them.

While the large city quarter is privately owned by Argent, it is widely used by the public, accommodating a series of homes, shops, restaurants and bars, as well as Central Saint Martins, London's renowned university of the arts.

King's Cross is also home to the Coal Drops Yard shopping complex designed by Heatherwick Studio, which was completed in October 2018.

Cameras "considered" for Canary Wharf

Canary Wharf, a private development of high-rise buildings in east London's former docklands, is also looking into using facial recognition cameras to monitor activity in the area.

Canary Wharf Group confirmed to Dezeen that facial recognition technology is not currently being used, but that the estate is considering deploying the technology to further enhance security.

The mayor of London, Sadiq Khan, has written to the owner of the King's Cross development to share his concerns about the recent news, and to find out more about the legality of facial-recognition CCTV systems.

"London's public spaces should be open for all Londoners to access and enjoy without fear of separation or segregation," wrote the mayor in a post on Twitter. "I've written to the CEO of the King's Cross development to raise my concerns about the use of facial recognition across the site."

In his letter, Khan requests "more information about exactly how this technology is being used" from the chief executive of the King's Cross development, Robert Evans.

According to the Guardian, the mayor also seeks "reassurance that you have been liaising with government ministers and the Information Commissioner's Office to ensure its use is fully compliant with the law as it stands".

Facial recognition "should concern us all"

Facial recognition systems use biometrics to map facial features from a photograph or video, then run this information through a database of known faces to find a match.
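As a rough illustration of that match-against-a-database step, the sketch below uses the open-source face_recognition Python library. It is purely illustrative: the supplier and workings of the King's Cross system have not been disclosed, and the file names and watchlist here are hypothetical.

import face_recognition

# Hypothetical "watchlist" of known faces, each mapped to a 128-dimension encoding.
known_names = ["person_a", "person_b"]
known_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in ["person_a.jpg", "person_b.jpg"]
]

# Detect and encode every face found in a single camera frame.
frame = face_recognition.load_image_file("camera_frame.jpg")
for probe in face_recognition.face_encodings(frame):
    # Compare the detected face against the watchlist; a smaller distance means a closer match.
    distances = face_recognition.face_distance(known_encodings, probe)
    matches = face_recognition.compare_faces(known_encodings, probe, tolerance=0.6)
    for name, distance, is_match in zip(known_names, distances, matches):
        if is_match:
            print(f"Possible match: {name} (distance {distance:.2f})")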

As UK information commissioner Elizabeth Denham explains, this widespread processing of the biometric data of thousands of people is a potential threat to privacy under data protection law, as the collection of sensitive personal data, including faces, requires consent.

"Any organisation using software that can recognise a face amongst a crowd then scan large databases of people to check for a match in a matter of seconds, is processing personal data," writes Denham.

"For the past year, South Wales Police and the Met Police have been trialling live facial recognition technology that uses this software, in public spaces, to identify individuals at risk or those linked to a range of criminal activity – from violent crime to less serious offences," the article reads.

"We understand the purpose is to catch criminals," it continues. "But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives. And that is a potential threat to privacy that should concern us all."

Ewa Nowak developed a mask-like accessory, made from delicate brass, that makes the wearer's face impossible for facial recognition algorithms to read.

Photography is by Luke Hayes.