Reducing Psychological Impacts of Content Moderation Work

Event Status
Scheduled
Matt Lease (Associate Professor, School of Information) will lead the first research presentation from the Future of Work Research Focus Area. Social media platforms must detect and block a variety of unacceptable user-generated content, such as adult or violent images. This detection task is difficult to automate due to high accuracy requirements, the cost of errors, and nuanced rules governing what is and is not acceptable. Consequently, platforms rely on a vast and largely invisible workforce of human moderators to filter such content. However, mounting evidence suggests that exposure to disturbing content can cause lasting psychological and emotional harm to some moderators. To mitigate such harm, we investigate a set of blur-based moderation interfaces that reduce exposure to disturbing content while preserving moderators' ability to flag it quickly and accurately. We find that interactive blurring designs can reduce emotional impact without sacrificing moderation accuracy or speed. See our online demo at: http://ir.ischool.utexas.edu/CM/demo/.

Register now!
Date and Time
Feb. 8, 2021, 12:01 to 1:01 p.m.
Event tags
Good Systems