Business Technical Editing

Examining the Ethics of Behavioral Video Analytics

Copyright [Goodpics] / Adobe Stock

For my final project in EH603 Editing for Publication, I am examining the ethical implications of behavioral video analytics. This technology is relatively new, but it has been gaining traction in recent years, especially for crowd surveillance. Behavioral analytics incorporates artificial intelligence (AI) into networked CCTV cameras (Crescent Guardian, Inc., 2016). The technology appeals to corporations and government agencies because it could significantly reduce, or even eliminate, the need for human monitoring.

With machine learning and deep learning technology, a camera can "learn" to recognize normal crowd movement and behavioral patterns (Pao, 2018). In an airport, for example, many people are stressed and rushing to their gates; the AI-enabled camera would learn to recognize these behaviors as normal (Pao, 2018). Abnormal or suspicious behaviors, like deliberately leaving a bag, would immediately be flagged and investigated, and the other cameras in the system could then track the suspect throughout the airport. The intent of the technology is to recognize criminal behavior before the crime is even committed.
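The core idea, learning a baseline of "normal" and flagging deviations from it, can be illustrated with a toy sketch. This is not any vendor's actual algorithm; real systems use learned deep-learning features rather than a single hand-picked measurement, and the feature (seconds a bag sits unattended) and threshold here are invented purely for demonstration:

```python
import statistics

# Baseline the system has "learned" as normal behavior:
# seconds a traveler's bag typically sits unattended.
baseline_dwell_times = [3, 5, 4, 6, 2, 5, 4, 3, 5, 4]

mean = statistics.mean(baseline_dwell_times)
stdev = statistics.stdev(baseline_dwell_times)

def is_suspicious(dwell_seconds, threshold=3.0):
    """Flag an observation whose z-score against the baseline exceeds the threshold."""
    z = abs(dwell_seconds - mean) / stdev
    return z > threshold

print(is_suspicious(4))    # a typical dwell time: not flagged
print(is_suspicious(120))  # a bag left for two minutes: flagged
```

Even this toy version shows where the ethical questions enter: someone has to choose the features and the threshold, and those choices determine who gets flagged.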

If this all sounds a bit too Orwellian, that's because it is. We live in an increasingly data-driven society, and post-9/11 fear contributed greatly to the idea that public areas should be constantly monitored for threats. The USA PATRIOT Act of 2001 allows for watchlists and surveillance of anyone the government suspects of belonging to a criminal or terrorist organization (United States Department of Justice, 2001).

The real question is: where is the line drawn? Is it ethical to track or detain ordinary citizens just because they made an expression or movement that the AI camera system deemed suspicious? Is it an invasion of privacy for businesses or agencies to film you and store your video image?

Additionally, the technology doesn't program itself (not yet, anyway). The algorithms that make behavioral video analytics work are still written by human beings (Williams, Brooks, & Shmargad, 2018). This leaves room for discrimination based on skin tone and cultural dress: the algorithms could be written in a way that automatically discriminates. Numerous studies have also shown failures in biometric facial recognition technology. For instance, Amazon's Rekognition tool falsely matched 28 members of the U.S. Congress with mugshots (Snow, 2018). With so much room for error, can we depend on biometrics? At this moment, I don't think we can, and neither does the American Civil Liberties Union (ACLU) (Snow, 2018).

This is not to say that new CCTV technology can't provide benefits. It certainly can. CCTV footage helped identify the Boston Marathon bombers, ISIS members in the Brussels airport, and lost children, and it has helped detect theft and smuggling. CCTV has also transformed into technology that makes us feel safer in our homes: Ring doorbells and similar home security products are hugely popular, and we even use this technology in baby monitors, keeping sleeping infants safer from SIDS and giving nervous parents peace of mind. But is there a point at which the tech goes too far? Is behavioral video analytics the harbinger of a safer future, or just another cog in the Big Brother surveillance machine? That's what I hope to explore in this project.

There's one more interesting tidbit for us residents of Huntsville. The first CCTV system was invented in 1942 to observe Nazi V-2 rocket launches (Herring Technology, 2015). Wernher von Braun was one of the V-2 project's engineers and later the designer of Huntsville's very own Saturn V rocket (Hollingham, 2014). It was an unexpected coincidence in my research.

Sources:

Crescent Guardian, Inc. (2016). Frequently asked questions about behavioral analytics software for surveillance cameras.  Retrieved from http://www.cgiprotects.com/faqs-behavioral-analytics#%23Different

Herring Technology. (2015, November 4). The history of CCTV. Retrieved from https://resources.herringtechnology.com/blog/news/the-history-of-cctv

Hollingham, R. (2014, September 8). V-2: The Nazi rocket that launched the space age. Retrieved from http://www.bbc.com/future/story/20140905-the-nazis-space-age-rocket

Pao, W. (2018, February 21). Applying deep learning to facial recognition cameras. Messe Frankfurt New Era Business Media, Ltd. Retrieved from https://www.asmag.com/print_article.aspx?id=24600

Snow, J. (2018, July 26). Amazon's face recognition falsely matched 28 members of Congress with mugshots. Retrieved from https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28

United States Department of Justice. (2001, October 26). Uniting and strengthening America by providing appropriate tools required to intercept and obstruct terrorism (USA PATRIOT Act) act of 2001 (Public Law 107-56). Retrieved from https://www.govinfo.gov/content/pkg/PLAW-107publ56/pdf/PLAW-107publ56.pdf

Williams, B., Brooks, C., & Shmargad, Y. (2018). How algorithms discriminate based on data they lack: Challenges, solutions, and policy implications. Journal of Information Policy, 8, 78-115. doi:10.5325/jinfopoli.8.2018.0078