Opinion: The dangers of facial recognition
Recent developments in technology, coupled with the pandemic, have led to increased use of facial recognition software in South Korea. The expansion of government surveillance has meant less privacy for citizens, a tradeoff many South Koreans were willing to make in light of COVID-19.
The South Korean government has collected and publicized highly detailed information about people who tested positive for the virus, including the clinic where they were tested, the times they left work, and whether they wore masks on the subway. The level of detail released by authorities allowed the public to identify individuals who had tested positive.
But could this type of tech lead to unintended consequences?
Facial recognition technology is a powerful tool under the biometrics umbrella. It has many potential benefits, such as identifying missing persons and wanted criminals. However, the technology can also be misused and abused, violating the rights of the very people it’s supposed to benefit.
After 9/11, the PATRIOT Act was passed in the United States to protect national security but ended up grossly violating the privacy rights of Americans. A similar phenomenon could occur in South Korea if its government abuses this technology.
If accurate facial recognition software becomes commercially available to the public, it could put people in danger. Stalkers could identify strangers from a single photo and easily obtain personal information about their victims. Additionally, doxxing, the practice of revealing private information such as a home address or phone number, could become widespread.
Some may argue that this could be useful in identifying and arresting criminals, but this type of software is notorious for misidentifying people of color.
For example, Robert Williams, a Black man, was wrongfully arrested in Detroit for allegedly stealing watches, a case that came to light in June 2020. The Michigan State Police had used facial recognition software to identify Williams as the suspect. Williams only learned that the software had been used to identify him after an officer interrogating him made an offhand comment about how “the computer must have gotten it wrong.”
Additionally, police install surveillance cameras in minority and poor neighborhoods at a disproportionately high rate, making people of color even more likely to be misidentified by facial recognition software.
In Colorado, there are no laws regulating the use of facial recognition software, and most agencies using the technology have not publicly announced that they do so. Police departments and DMVs in Colorado often work together to try to identify people suspected of a wide range of crimes.
While there is nothing inherently wrong with this, laws need to be passed to increase transparency and regulation of facial recognition use in Colorado and the rest of the United States. The laws governing this software currently vary greatly at both the local and state levels.
A consistent policy on the use of facial recognition software in both the public and private sectors is needed, along with a provision ensuring that people know when their data is being collected. Some states, such as Illinois and Texas, have passed laws that prohibit private companies from profiting from data collected via biometrics like facial recognition, which is a good start.
Aside from the dangers of misuse and abuse of this software, there are data security issues that come along with it. As previously mentioned, authorities may release the information gathered to citizens, which could endanger an individual’s safety. There is also the risk of hacking that accompanies any data stored digitally. In 2019 alone, there were nearly 1,500 data breaches and over 160 million records exposed in the United States.
Facial recognition software has evolved greatly over the past decade and will continue to do so. There are benefits to the technology, but there are also significant risks to privacy rights. To ensure the rights of citizens, the software must be applied uniformly and transparently whenever and however it is used.
Ella Olson is an opinion writer at the Statesman. She’s originally from Minneapolis, Minnesota and enjoys reading, debate, and hammocking.
ella.olson@usu.edu