Master of Design (MDes)
As machine learning grows more sophisticated every day, the societal impact of algorithmic models automating and dictating much of our online behavior becomes undeniable. This has led to ethical problems in technology-based media (filter bubbles on Facebook, search discrimination on Google Search, datafication and privacy concerns in general, etc.), and despite our growing dependency on machine learning systems, we have no clear ethical guidelines for either the computer scientists who build the algorithms or the designers who implement such systems. Through literature reviews, exploratory research, paper prototypes, and a survey, this thesis explores the friction points where design might intervene to effectively address these challenges, as well as better ways of designing users’ relationships with their filter bubbles. The goals are twofold: 1. to diversify news consumption practices, and 2. to encourage people to become more aware of their own behaviors on social media. Through the literature reviews, I identified the following friction points: a lack of transparency over what data is being pulled, and a general lack of user agency and control over both the kinds of data pulled from their engagement and the sorts of content they are shown. As a basis for inquiry, this study asks whether design can be leveraged to help people become more aware of their contributions to their own filter bubbles, rather than pushing people to engage with others who think differently. Findings from the survey suggest that people find value in the resulting prototype, but that the stakeholders would need to expand beyond social media users in order to provide a financial incentive for the business.
Kim, Min, "Interventions that Inform: Designing Communications that Highlight News Filter Bubbles and Provide Strategies for Combatting their Negative Effects" (2017). Theses. 123.