Unyielding Bias

How researchers are using video and VR to uncover how appearance affects pedestrian safety — and the risks of coding those patterns into AI

August 25, 2025
At the Crosswalk: Who Does AI See? A UT Austin research team is investigating how appearance affects driver behavior—and how those patterns could be replicated in autonomous vehicles.
Illustration created with ChatGPT

Picture a woman at an intersection. She’s white, nicely dressed, carrying a laptop bag. The signal says, “Don’t Walk,” but she is distracted and steps out into the crosswalk. A car approaches.

What does it do?

Now alter a few details in the picture. Instead of a woman, it’s a man. He is Black, in casual clothes, pushing a shopping cart with assorted belongings. The light still says, “Don’t Walk,” but the man begins crossing the street.

What does the car do now?

These aren’t just thought experiments. They are scenarios that play out every day, and they raise troubling questions about how bias shapes even split-second decisions behind the wheel.

Researchers from Good Systems wanted to know: Are drivers really responding differently to different people based on appearance? And if so, what happens when those patterns get passed on to artificial intelligence?

To find out, a team from the Designing AI to Advance Racial Equity core research project recorded more than 1,000 hours of video at two Austin intersections and documented over 20,000 pedestrian crossings, more than 3,000 of which resulted in direct interactions between pedestrians and drivers. They were looking for patterns: who crosses and who waits — on red and green lights alike — and whom drivers yield to.

The Rising Toll 

In 2022, the National Highway Traffic Safety Administration (NHTSA) estimated that more than 7,500 pedestrians were killed in U.S. road crashes, the highest number since 1981. Those deaths accounted for 18% of all roadway fatalities, up from 12% in 2009. “In fact, pedestrian fatalities, as a percentage of total roadway crash fatalities, are at the highest point today than at any other time since the NHTSA began tracking those numbers in 1975,” said civil engineering professor Chandra Bhat, the project’s principal investigator. “We needed to understand why.”

Bhat’s team aimed to move beyond infrastructure and traffic counts to explore something harder to measure: how individual characteristics like gender, race and apparent housing status interact with human behavior at street level.

"Implicitly, some lives seem to be valued less."

— Chandra Bhat, Cockrell School of Engineering

Their approach was meticulous. Rather than relying on crash reports or simulations, the team set up high-resolution cameras at two intersections, one on campus and one at a busy frontage road, and manually coded thousands of pedestrian-vehicle interactions. Undergraduate students were trained to identify and log variables like signal compliance, pedestrian demographics and driver responses. For skin tone, the team used the Monk Skin Tone scale, a 10-point scale developed by Harvard sociologist Ellis Monk and adopted by Google to better represent the diversity of skin tones in digital systems.
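To make that coding workflow concrete, here is a minimal sketch of what one logged observation might look like, written in Python. Every field name and value below is hypothetical; the team's actual coding sheet is not reproduced in this article.

```python
# Hypothetical record structure for one manually coded crossing event.
# Field names are illustrative, not the study's actual coding protocol.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CrossingObservation:
    intersection: str               # e.g. "campus" or "frontage_road"
    signal_compliant: bool          # crossed on WALK, inside the crosswalk
    perceived_gender: str           # coder's judgment: "male", "female", "unknown"
    monk_skin_tone: int             # Monk Skin Tone scale, 1 (lightest) to 10 (darkest)
    appears_unhoused: bool          # visible markers, e.g. shopping cart, worn clothing
    driver_yielded: Optional[bool]  # None when no vehicle interacted with the pedestrian
```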

Their findings were unsettling. Drivers were measurably less likely to yield to pedestrians who appeared to be male, unhoused, or Black or brown. And these same groups were more likely to cross outside the crosswalk or against the signal. The disparities weren’t just behavioral; they were tied to social identity, shaped by bias, perception and appearance.

Among their most significant findings: men were both more likely to engage in “non-compliant crossing behavior” (NCB) and more likely to be ignored by drivers. Pedestrians with darker skin tones or visible markers of housing insecurity — for example, those pushing shopping carts or dressed in worn clothing — were yielded to less often, even after controlling for NCB. “Implicitly, some lives seem to be valued less,” Bhat said. 
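The phrase “even after controlling for NCB” suggests a model in which driver yielding is regressed on pedestrian attributes with non-compliant crossing as a covariate. The sketch below illustrates that idea with a logistic regression on synthetic data; it is not the team’s actual model, and all variable names and effect sizes here are invented for illustration.

```python
# Illustrative only: logistic regression of driver yielding on pedestrian
# attributes, with non-compliant crossing behavior (NCB) as a covariate,
# so demographic effects are estimated net of NCB. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000  # roughly the number of direct pedestrian-driver interactions observed

df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "darker_skin": rng.integers(0, 2, n),       # e.g. a binarized Monk Skin Tone rating
    "appears_unhoused": rng.integers(0, 2, n),
    "ncb": rng.integers(0, 2, n),               # crossed against signal or outside crosswalk
})

# Synthetic ground truth: drivers yield less to each group, and less during NCB.
logit_true = (1.0 - 0.5 * df["male"] - 0.6 * df["darker_skin"]
              - 0.8 * df["appears_unhoused"] - 0.7 * df["ncb"])
df["driver_yielded"] = (rng.random(n) < 1 / (1 + np.exp(-logit_true))).astype(int)

# Demographic coefficients are interpreted "after controlling for NCB"
# because NCB enters the model as its own term.
model = smf.logit("driver_yielded ~ male + darker_skin + appears_unhoused + ncb",
                  data=df).fit()
print(model.summary())
```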

The Behavior Behind the Bias 

For civil engineering graduate student Angela Haddad, who helped lead the research, the decision to include unhoused individuals as a study variable came after a passing comment from a transportation official at a conference. “We’ve all seen people who look like they might be homeless walking along roads or stepping into traffic,” she said. “There’s this general sense that it poses a risk, but it was just anecdotal evidence until [our study]. When we measured it, it turned out to be a very significant variable.” 

The stakes of these biases go beyond today's intersections. They threaten to be baked into tomorrow’s technologies. Artificial intelligence systems — like those powering autonomous vehicles — are trained on real-world data. “If all this video footage was used to train a driverless car, it would replicate what we’re seeing now,” Bhat said. “It would yield less to men. It would yield less to people with darker skin. It would yield less to people who look unhoused.” 

That concern — about real-world bias being encoded into future AI systems — is at the heart of what Bhat’s team is studying, and it aligns closely with one of Good Systems’ key research priorities: examining how emerging technologies might exacerbate or mitigate social inequities. 

To that end, the next phase of the project trades the familiar confines of the Forty Acres (privacy laws limited the team to intersections on UT property) for the notoriously hectic streets of Manhattan. Well, simulated ones, anyway. 

“Infrastructure countermeasures to reduce pedestrian fatalities have a certain crispness to them, and do help. But bias? That’s harder to address. It calls for education, awareness and a change in social norms.”

— Chandra Bhat

The team has developed a virtual reality (VR) platform that allows participants to navigate a four-way intersection wearing a headset. Traffic levels, signal patterns and the behavior of nearby virtual pedestrians are all adjustable. The idea is to measure when people choose to cross, what risks they take and how their own demographic backgrounds influence those choices. 
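As a rough illustration of what “adjustable” might mean in practice, a crossing scenario in such a platform could be parameterized as shown below. The platform’s real interface is not described in the article, so every name and default value here is an assumption.

```python
# Hypothetical parameterization of a VR crossing scenario; the platform's
# actual API is not public, so all names and defaults are assumptions.
from dataclasses import dataclass

@dataclass
class CrossingScenario:
    traffic_level: str = "moderate"   # "light", "moderate", or "heavy"
    signal_cycle_s: int = 90          # full signal cycle length, in seconds
    walk_phase_s: int = 20            # duration of the WALK phase
    virtual_peds: int = 4             # number of nearby virtual pedestrians
    virtual_peds_comply: bool = True  # whether virtual pedestrians wait for the signal

# Researchers could sweep conditions and log when each participant steps off the curb:
conditions = [
    CrossingScenario(traffic_level="heavy", virtual_peds_comply=False),
    CrossingScenario(traffic_level="light", virtual_peds=0),
]
```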

“Now we’re trying to get into the pedestrian’s head,” Haddad said. “In the video footage, we could only watch what people did. With VR, we can ask why.” 

Participants will also complete surveys about where they grew up, how often they walked and how safe they felt doing it. The team hopes to test about 100 people and compare their behavior across race, gender, socioeconomic status and environment. Their goal is to generate a richer behavioral dataset — one that can inform smarter, more equitable AI systems. 

Still, Bhat cautions that no amount of modeling will fix what he sees as a deeper cultural issue: a lack of mutual respect among road users. “Infrastructure countermeasures to reduce pedestrian fatalities have a certain crispness to them, and do help,” he said. “But bias? That’s harder to address. It calls for education, awareness and a change in social norms.” 

In northern Europe, Bhat noted, schoolchildren are taught from a young age about their responsibilities as both pedestrians and drivers. “There’s a mutual respect [there],” he said. “That’s the culture we need to aim for — not just better infrastructure countermeasures.”
