Behind the Lens

Balancing Ethics and Innovation in Smart City Surveillance

December 12, 2024
"Behind the Lens"

Amid the hustle and bustle of cities across the country, a largely invisible network of surveillance cameras silently observes the ebb and flow of urban life. These cameras, embedded into traffic lights, streetlamps and even police body armor, promise to enhance public safety, streamline urban planning and optimize resource management. Yet, as their lenses capture the minutiae of daily life, a critical question arises: Who is watching the watchers?

At The University of Texas at Austin, “Being Watched: Embedding Ethics in Public Cameras,” one of Good Systems’ six core projects, aims to address this question. Led by an interdisciplinary team of researchers, the project focuses on how ethics, transparency and accountability can be embedded into the deployment of camera-based surveillance technologies. By developing a robust framework for governance, the project seeks to balance technological innovation with civil liberties.

“The rapid adoption of surveillance technologies has outpaced the development of guidelines to ensure they’re used ethically,” said project lead Sharon Strover, a professor in the School of Journalism and Media at the Moody College of Communication and former Good Systems chair. “We see a cottage industry promoting data collection and analytics, but far less attention to evaluating outcomes or addressing moral questions.”

Promises and Perils

The allure of smart city surveillance is undeniable. Cameras equipped with advanced analytics have the potential to improve pedestrian safety, traffic management and real-time crisis response, among other applications. Yet, as Strover said, these technological advancements come with a hidden cost. “People often don’t realize the extent of surveillance in their daily lives,” she said.

Focus groups conducted by Strover’s team revealed a sizeable gap between the public’s awareness and the reality of pervasive data collection — from drones operated by fire departments to cameras monitoring city intersections. Compounding the issue, many government agencies lack comprehensive data policies. “Few units have explicit guidelines for how data is handled or shared, and this creates a black hole of accountability,” said Strover, who also co-directs the Technology and Information Policy Institute.

Concerns about surveillance extend beyond privacy to issues of control and trust. While younger generations may view surveillance as an acceptable trade-off for convenient and affordable (or even free) access to the newest technologies, older groups express greater wariness. These generational and cultural differences highlight the challenge of crafting one-size-fits-all policies.

Customizing Privacy in Public Spaces

To address these complexities, project co-lead Atlas Wang and his team have developed a “differential access model,” a framework that restricts who can access surveillance data and for what purposes. “Not everyone defines privacy the same way,” said Wang, an associate professor in the Cockrell School of Engineering’s Chandra Family Department of Electrical and Computer Engineering. “It varies across cultures, age groups and even personal preferences. Our goal is to make these systems adaptable, so individuals or communities can choose their level of comfort.”

This model aims to ensure flexibility while respecting individual privacy. Wang described the innovation as a sort of digital switchboard of knobs and sliding scales: the user could place values on various aspects of privacy or utility, and the algorithms would respond accordingly. “Our algorithms are designed to customize privacy protections,” Wang said. “For example, we can blur faces, obscure sensitive activities or apply encryption based on the context.”
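
To make the idea concrete, here is a minimal sketch of such a switchboard in Python. The preference names, thresholds and transformation steps are illustrative assumptions, not the project’s actual parameters.

    # Hypothetical privacy "switchboard": each slider runs from 0.0 (maximum
    # privacy) to 1.0 (maximum utility), and the settings select which
    # redaction steps run on each video frame.
    PREFERENCES = {
        "face_visibility": 0.0,     # 0.0 -> always blur faces
        "activity_detail": 0.5,     # partially mask sensitive activities
        "location_precision": 1.0,  # keep exact camera location in metadata
    }

    def transformations(prefs):
        """Translate slider values into an ordered list of redaction steps."""
        steps = []
        if prefs["face_visibility"] < 0.5:
            steps.append("blur_faces")
        if prefs["activity_detail"] < 1.0:
            steps.append("mask_sensitive_regions")
        if prefs["location_precision"] < 1.0:
            steps.append("coarsen_location_metadata")
        return steps

    print(transformations(PREFERENCES))
    # ['blur_faces', 'mask_sensitive_regions']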

Take public safety, for instance. A city’s police department could use the differential access model to detect suspicious behavior without identifying individuals unless an emergency warrants it. However, deciding who gets access to that sensitive data — whether raw or processed, and whether the recipient is law enforcement, city planners or researchers — is a governance issue, according to Wang.
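
A simplified sketch of that governance layer might look like the following; the role names, release tiers and emergency rule are assumptions for illustration, not the project’s actual policy.

    from dataclasses import dataclass
    from enum import Enum, auto

    class Role(Enum):
        LAW_ENFORCEMENT = auto()
        CITY_PLANNER = auto()
        RESEARCHER = auto()

    @dataclass
    class AccessRequest:
        role: Role
        emergency: bool = False  # e.g., an active missing-person alert

    def release_level(req: AccessRequest) -> str:
        """Return the most detailed view of the footage a requester may see."""
        if req.role is Role.LAW_ENFORCEMENT and req.emergency:
            return "raw"               # unredacted, but logged and time-limited
        if req.role is Role.LAW_ENFORCEMENT:
            return "faces_blurred"     # behavior visible, identities withheld
        if req.role is Role.CITY_PLANNER:
            return "aggregate_counts"  # pedestrian/vehicle counts only
        return "anonymized_clips"      # researchers get de-identified samples

    print(release_level(AccessRequest(Role.CITY_PLANNER)))  # aggregate_counts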

Public Safety vs. Public Trust

Anita Varma, another researcher on the project, studies media ethics and solidarity. Her analysis of public discourse around traffic cameras in California spotlighted how these systems disproportionately impact marginalized communities, illustrating the broader societal tensions around surveillance.

“Cameras are often installed in areas with high traffic fatalities, which tend to overlap with Black and Brown neighborhoods,” said Varma, an assistant professor in the School of Journalism and Media. While the intention might be to improve safety, she explained, communities question whether these measures address root issues like poor infrastructure or merely reinforce existing inequities.

Varma found consensus around the need for public safety but deep disagreement about the mechanisms to achieve it. “The debate isn’t just privacy versus security,” she said. “It’s about trust — whether these systems genuinely serve the public or perpetuate harm.”

Transparency could help build trust. Strover suggested simple measures like signage or QR codes near surveillance devices, allowing residents to learn who operates the cameras and how data is used. “These small steps could go a long way in rebuilding trust,” she said.
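
As a sketch of how lightweight that measure could be, the widely used Python qrcode package can generate such a label in a few lines; the camera ID and URL below are placeholders.

    import qrcode  # third-party package: pip install "qrcode[pil]"

    CAMERA_ID = "cam-0421"                                 # placeholder ID
    INFO_URL = f"https://example.gov/cameras/{CAMERA_ID}"  # placeholder URL

    # The page behind the URL would name the operator, the retention period
    # and how the footage is shared.
    img = qrcode.make(INFO_URL)          # returns a PIL image of the QR code
    img.save(f"{CAMERA_ID}-label.png")   # print and post beside the device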

Educating the public about the trade-offs of surveillance is another important step. Varma emphasized the importance of solidarity in policymaking: “Accountability shouldn’t just be to the general public but specifically to those most negatively impacted by these systems,” she said. “Policymakers need to bring these voices to the table.”

Ethics First 

As surveillance technologies become more sophisticated, the “Being Watched” team is already thinking ahead. Wang’s latest research explores how generative AI models might pose new privacy threats, such as unauthorized use of personal images in training datasets. “Digital surveillance is evolving,” he said. “We’re developing algorithms to detect and mitigate these risks, ensuring that individuals retain control over their digital presence.”
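
One simplified illustration of that kind of detection, not the team’s actual method, is checking whether a perceptual fingerprint of a personal photo appears among a dataset’s images; the file names below are assumed for the example.

    from PIL import Image  # Pillow

    def average_hash(path, size=8):
        """Perceptual hash: downscale to grayscale, threshold on the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    def hamming(a, b):
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    # Flag near-duplicates of a personal photo in a dataset manifest
    # (one image path per line; both files are assumed to exist).
    target = average_hash("my_photo.jpg")
    matches = [p for p in open("dataset_manifest.txt").read().split()
               if hamming(average_hash(p), target) <= 5]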

For Strover, the project’s ultimate goal is fostering ethical innovation. “Efficiency may be the watchword for cities, but ethics can’t be an afterthought,” she said. “We need nationally accepted guidelines that balance innovation with privacy, grounded in the lived realities of diverse communities.”

Varma added that these guidelines must go beyond surface-level principles like fairness, accountability and transparency (often abbreviated as FAccT). “What we found in journalism is that these principles are not sufficient,” she said. “If we could say, ‘Go be fair, accountable and transparent journalists,’ and everything's fine, then that's great. But that's not what has happened in journalism, and it’s not at all what’s happening in AI spaces, either.”

Building on lessons from journalism’s struggles with ethics, Varma advocates for a solidarity-driven approach to AI ethics, one that prioritizes the needs of those most adversely affected by technological systems. “Just because the technology has changed doesn’t mean the social dynamics have,” she said. “To move forward, we have to hold systems accountable not just to the general public, but to the people who are most negatively impacted under the status quo.”
