

IBM pledges to shelve facial-recognition tech over bias concerns

10 Jun 2020 / technology


IBM CEO Arvind Krishna has said in a letter to the US Congress that the company will discard its general-purpose facial-recognition and analysis technology.

The announcement came in the context of addressing responsible use of technology by law enforcement, following calls for police reform after the killing of George Floyd by a Minneapolis police officer.

The tech giant says it will no longer offer products for "mass surveillance or racial profiling".

Bias

In the letter to Congress, IBM said artificial intelligence (AI) systems used in law enforcement needed testing "for bias", and that the firm wanted to work with Congress on police reform, responsible use of technology, and broadening skills and educational opportunities.

"IBM firmly opposes and will not condone the uses of any technology, including facial-recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms," Mr Krishna wrote.

"We believe now is the time to begin a national dialogue on whether and how facial-recognition technology should be employed by domestic law-enforcement agencies".

Data analytics

IBM said Congress should consider technology that would bring "greater transparency", such as body cameras on police officers and data analytics.

However, Privacy International's Eva Blum-Dumontet said IBM had coined the term 'smart city' and had pushed urbanisation models that relied on CCTV cameras and sensors whose data was processed by police.

"This is why it is very cynical for IBM to now turn around and claim they want a national dialogue about the use of technology in policing," she said.

She commented that the announcement of an end to 'general-purpose' facial recognition was ambiguous.

Data sets

Activists have complained about racial biases in facial-recognition data sets.

A 2019 Massachusetts Institute of Technology study found that tools from Microsoft, Amazon and IBM were less accurate at recognising men and women with darker skin.

A US National Institute of Standards and Technology study found facial-recognition algorithms were far less accurate at identifying African-American and Asian faces compared with Caucasian ones.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland