Ed Bridges, 36, has crowdfunded an action accusing South Wales police of violating his privacy and data protection rights, in a landmark judicial review at Cardiff Civil Justice and Family Centre.
The case could upend plans to widen use of the technology.
Facial recognition technology detects faces in a crowd by measuring facial features and comparing the results with photographs.
Officers use the technology to compare faces captured by cameras against ‘watch list’ images of suspects, missing people and persons of interest, the court heard.
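The watch-list matching the court described can be sketched as a nearest-neighbour comparison of face embeddings. This is an illustrative assumption about how such systems typically work, not a detail from the case: the identities, vectors and threshold below are made up.

```python
import math

# Hypothetical watch-list embeddings: in a real AFR system these vectors
# would come from a neural network applied to stored photographs; the
# values here are illustrative only.
WATCH_LIST = {
    "suspect_a": [0.12, 0.80, 0.33],
    "missing_person_b": [0.90, 0.10, 0.45],
}

def euclidean(a, b):
    """Distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(face_embedding, watch_list=WATCH_LIST, threshold=0.3):
    """Return the closest watch-list identity, or None if nobody on the
    list is within the (assumed) distance threshold."""
    best_id, best_dist = None, float("inf")
    for identity, ref in watch_list.items():
        d = euclidean(face_embedding, ref)
        if d < best_dist:
            best_id, best_dist = identity, d
    return best_id if best_dist <= threshold else None
```

Under this sketch, a face scanned in a crowd that falls within the threshold of a stored image raises an alert for officers to review; anything further away is treated as no match.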
Bridges said his face was scanned while he was shopping in 2017 and at a peaceful anti-arms protest in 2018.
His barrister, Dan Squires, told the court: “What AFR [automated facial recognition] enables the police to do is to monitor people’s activity in public in a way they have never done before.”
Mr Squires said that in the past the police had to use “consent or force” to obtain DNA or fingerprints but added: “You can’t use that sort of data to track people’s movements.”
Human rights charity Liberty is assisting in the case.
South Wales police has led the use of automated facial recognition since 2017, with at least 40 deployments so far.
The lawyer told the court that AFR had “profound consequences for privacy and data protection rights”.
He said his client had a reasonable expectation that his face would not be scanned in a public space, and that processing this data violated the right to respect for private life protected by Article 8 of the Human Rights Act, as well as the Data Protection Act.
South Wales police argues that it did not infringe privacy or data protection rights because the technology was used in the same way as photographing a person in public.
Meanwhile, San Francisco has banned the use of facial recognition technology by police and government agencies but the ban does not include airports or other federally regulated facilities.
In an eight-to-one decision, the city’s Board of Supervisors said the technology could endanger civil rights and civil liberties.
The board also believes facial recognition could exacerbate racial injustice and threaten the right to "live free of continuous government monitoring."
The technology has been credited with helping police capture dangerous criminals, but has also been criticised for cases of mistaken identity.
China is believed to be using facial recognition technology to track its Uighur Muslim minority by integrating the technology into its network of surveillance cameras.
This is believed to be the first case of a government intentionally using artificial intelligence for racial profiling.