Algorithms of Oppression

Weapons of Math Destruction by Cathy O'Neil | Penguin Random House

Don't Build a Database of Ruin

Why big data is actually small, personal and very human | Aeon Essays

Our Metrics, Ourselves: A Hundred Years of Self-Tracking From The Weight Scale to The Wrist Wearable Device - Microsoft Research

How Data Can Be Used Against People: A Classification of Personal Data Misuses

The Ethics of Emotion in AI Systems (Research Summary) | Montreal AI Ethics Institute


<aside> 💡 The theme that struck me most from this week’s readings was the manipulation of emotions. Starting with the Crawford, Lingel, and Karppi reading “Our Metrics, Ourselves,” I found it truly enlightening to compare the history of the weight scale with the capitalist push behind today’s wearables. When it comes to diet culture, I really despise how it makes me and others feel anxious about not hitting certain numbers on the scale. I have many family members and friends who love talking about their wearables, and I wonder whether they feel a similar anxiety when they miss certain goals, and how often they feel that way.

What raises even more concern is how computer scientists are trying to define and model emotions within AI systems, as discussed by Stark and Hoey in “The Ethics of Emotion in AI Systems.” Specifically, they note a lack of awareness of “stereotypes and assumptions about emotions that materially affect a person or group,” and that “Not only may EAI systems be unwanted and overly invasive, but they also raise broader concerns around the use of data for tracking, profiling, and behavioural nudging.” Working through the classification schemes in “How Data Can Be Used Against People” (Kröger, Miceli, Müller), while certainly terrifying, was also reassuring. It provided a solid framework for learning the nuances of data privacy risks and helped me expand my vocabulary in defense of data privacy.

</aside>