Complex Made Simple

Facial recognition’s growing use during the pandemic and its pitfalls

Emergencies often lead to escalated surveillance and security measures, and facial recognition is the latest tool with the potential for abuse.

Countries like South Korea, Taiwan and Singapore used facial recognition extensively to fight the pandemic. In fact, the global facial recognition market is set to grow from $3.8 billion in 2020 to $4.5 billion by 2021, at a CAGR of 17.1%, thanks in part to demand generated by the outbreak. However, like any other tool, there is much room for abuse with facial recognition software.

The onset of the COVID-19 pandemic has had governments and companies scrambling to regain even a fraction of control over the situation. Technology has often provided the best tools for the job, and among them is facial recognition, whose use has raised privacy questions in and of itself.

In April, we discussed how crises often give governments free rein to enact more authoritarian measures than they could get away with in times of peace or normality. Today, we will focus on facial recognition's growing use during the pandemic and its potential pitfalls.

Use cases today

Facial recognition has proven an instrumental tool in the fight against the coronavirus pandemic.

South Korea, one of the countries that best handled the outbreak, enforced swift but somewhat invasive protocols to keep cases low, including rigorous contact tracing and heavy use of facial recognition. Other Asian states that followed this example include Taiwan and Singapore.

“In South Korea, government agencies are harnessing surveillance-camera footage, smartphone location data and credit card purchase records to help trace the recent movements of coronavirus patients and establish virus transmission chains,” The New York Times reported.

Image: RTA

Dubai recently announced it has been trialing AI and facial recognition tools in RTA taxis to ensure passengers are wearing face masks and following social distancing guidance. 

“AI technologies have been employed to monitor and verify the compliance with the preventive measures undertaken to limit the spread of the Coronavirus. The technology can also report offences such as the failure to observe physical distancing, and the improper wearing of face masks, thanks to video analysis feature,” said Ahmed Mahboub, Executive Director of Smart Services, Corporate Technology Support Services Sector, RTA.

“The experiment highlighted the capability of AI technology in processing video files spanning 200 thousand hours a day. Thus, it reduces the need for human intervention and saves much time and effort that would have otherwise been necessary to analyse these videos,” he noted. 
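To give a rough sense of what such video analysis involves, below is a minimal, hypothetical sketch of a mask-compliance check built with open-source computer vision tools. The model file ("mask_classifier.onnx"), the video source and the probability threshold are placeholder assumptions for illustration only; the RTA's actual system has not been publicly documented.

# Minimal illustrative sketch only: the RTA system is not public, and the
# model file, video source and threshold below are hypothetical placeholders.
import cv2

# Standard Haar cascade face detector that ships with OpenCV
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
# Hypothetical pretrained mask-vs-no-mask classifier exported to ONNX
mask_net = cv2.dnn.readNetFromONNX("mask_classifier.onnx")

cap = cv2.VideoCapture("taxi_camera_feed.mp4")  # placeholder video source
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        face = frame[y:y + h, x:x + w]
        # Normalise and resize the face crop for the classifier
        blob = cv2.dnn.blobFromImage(face, 1 / 255.0, (224, 224))
        mask_net.setInput(blob)
        mask_prob = float(mask_net.forward()[0][0])  # assumed output: P(mask)
        if mask_prob < 0.5:
            print("Possible mask violation detected in this frame")
cap.release()

A production system would presumably run models of this kind over recorded footage in bulk, which is how figures like 200,000 hours of video per day become feasible without human reviewers.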

In 2019, thousands of CCTV cameras in Dubai helped in the arrest of 319 suspects under the Oyoon (Eyes) project, which employs artificial intelligence (AI) and facial recognition software, a Dubai Police official said back in March, as reported by Gulf News.

There has also been discussion around immunity passports that patients believed to have recovered could use to make it through security more easily. These too would utilize facial recognition.

According to recent research, the global facial recognition market size is set to grow from $3.8 billion in 2020 to $4.5 billion by 2021, at a Compound Annual Growth Rate (CAGR) of 17.1%. This is, of course, partially due to coronavirus-related demand.

Privacy at risk
Facial recognition has always been controversial. While it certainly looked cool in science fiction films at first, it also brought up a major debate about surveillance states and unprecedented privacy invasion. With cameras in everything from our phones and TVs to our cars, the potential for abuse is serious.

Now, with governments having near free rein to employ whichever privacy-breaching tools they need to fight the virus, we could see temporary ‘extreme’ surveillance measures linger past the virus’ expiry date. Just look at 9/11 and how it shaped US security and surveillance forever.

“Nearly two decades [after 9/11], law enforcement agencies have access to higher-powered surveillance systems, like fine-grained location tracking and facial recognition — technologies that may be repurposed to further political agendas like anti-immigration policies. Civil liberties experts warn that the public has little recourse to challenge these digital exercises of state power.”

Ongoing civil unrest in the US has even led Amazon to ban police use of its facial recognition software for one year, as pressure builds on tech companies to respond to the killing of George Floyd by a police officer in Minneapolis, an incident that laid bare the abuse of power by authorities acting on personal bias, CNBC reported. Facial recognition is just the latest tool that could be abused if not properly moderated.