Designing AI technologies that benefit society is our grand challenge.
AI-based technologies are helping us solve complex problems in nearly every discipline and industry, but they have the capacity to be harmful to us in ways we might not predict or intend.
Coexisting with AI
Artificial intelligence refers to systems that can correctly interpret data, learn from it, and apply what they have learned to achieve specific goals autonomously. AI improves our everyday lives, but not without risk.
AI is changing the way we do everything because it’s everywhere — from dating apps to the most advanced military weapons systems. AI does many things faster and better than humans can alone, but there are ethical and societal implications to consider.
How can we ensure that AI is beneficial — not detrimental — to humanity? What unintended consequences are we overlooking by developing technology that can be manipulated and misused?
It is ethically irresponsible to focus only on what AI can do. We believe it is equally important to ask what it should (and should not) do.
Our goal is to better understand what changes new technologies will bring, predict how those changes will unfold, and mitigate the harms or unintended consequences they could cause while still leveraging the benefits AI provides.
Our interdisciplinary team comprises faculty and student researchers from more than two dozen schools and units at UT Austin. These experts in fields such as communications, community and regional planning, engineering, informatics, liberal arts, public affairs, and robotics work to tackle pressing social issues alongside partners in city government, nonprofits, industry, and community groups. Together, we investigate how to define, evaluate, and build ethical AI systems that will transform society for the better.
What is a “good” system?
Our team defines a good system as a socially beneficial human + AI partnership driven by values including agency, equity, trust, transparency, democracy, and justice.