Lecture 4 of 5 · ⏱️ 45 Minutes · 🗣️ Debate Format
💡 Inspired by: Rachel Thomas (fast.ai), pioneering AI ethics for non-technical audiences
AI in Society: Power, Bias & Ethics
AI isn't neutral: it reflects the biases in its training data and the power structures of those who build it. This lecture confronts the harder questions: How does AI discriminate? Who controls it? And what does it mean for Bangladesh?
📚
1. Concept
🛠️
2. Hands-On
🗺️
3. Takeaway
Concept Discussion – 20 Min
Algorithmic Bias: The COMPAS Case
Algorithms can operationalize historical bias through code. A model trained on statistically prejudiced historical data will systematically automate structural inequality at scale.
2x
Black defendants were roughly twice as likely to be falsely flagged as high-risk for recidivism by COMPAS (Angwin et al., 2016)
34%
Error rate for darker-skinned women in commercial facial recognition, versus under 1% for lighter-skinned men (Buolamwini & Gebru, 2018)
"The danger of AI is much greater than the danger of nuclear warheads... Mark my words. AI is far more dangerous than nukes."
– Elon Musk (a perspective on existential risk)
Labor Abstraction and "Data Colonialism"
The Global South (including Bangladesh, India, and Kenya) provides the invisible labor infrastructure behind advanced LLMs. The Reinforcement Learning from Human Feedback (RLHF) that makes AI "safe" and fluent relies on millions of gig workers manually labeling horrific, explicit, or toxic content for less than $2 an hour.
Digital Sweatshops: The automation of Western white-collar jobs is currently subsidized by outsourced, psychologically traumatic classification labor in developing nations.
Epistemological Extraction: AI companies aggressively scrape culturally specific data and indigenous knowledge without compensation, repackaging it into commercial APIs.
The Environmental Calculus of Compute
Artificial Intelligence is profoundly physical. The cloud requires water, land, and massive energy consumption.
Water Consumption: Generating an AI image or holding a roughly 20-prompt conversation with ChatGPT is estimated to evaporate about half a liter of fresh water to cool the server farms.
Carbon Footprint: Training a single foundational AI model emits hundreds of thousands of kilograms of CO2. For climate-vulnerable nations like Bangladesh, this invisible footprint directly undermines global sustainability goals.
The UNESCO Framework on AI Ethics
In November 2021, 193 member states (including Bangladesh) adopted the first global agreement on the Ethics of Artificial Intelligence.
Core Principles:
Proportionality: AI should not be used where a simpler, less intrusive method works.
Safety: Strict safeguards against biased datasets driving critical decisions.
Sustainability: Training one large LLM emits roughly 300,000 kg of CO2 equivalent. AI must be green.
Human Oversight: Critical decisions (legal, medical, hiring) must always have a human final validator.
🎥 More on AI Ethics
Hands-On Activity – 20 Min
✍️ Activity 1: The Bias Audit
Open Copilot Designer or DALL-E. Ask it to generate images using only these exact prompts:
"A CEO of a successful company."
"A nurse caring for a patient."
"A poor neighborhood in South Asia."
Analysis: What is the default gender, race, and aesthetic? Does the AI enforce stereotypes? Discuss with your group.
🎭 Activity 2: Deepfake Awareness Demo
Critical viewing exercise – no tools needed:
Watch a deepfake video clip shown in class
Discuss: "Does this look real? What clues suggest it's AI-generated?"
Bangladesh implications: elections, news, personal reputation
🗣️ Activity 3: Oxford-Style Debate
Motion: "Should DU ban, allow, or regulate AI in exams?"
Team 1: AI should be BANNED in all assessments
Team 2: AI should be ALLOWED with citation
Team 3: AI should be REGULATED (some tasks only)
5 min presentations, 3 min rebuttals, class vote before and after
✍️
Your Takeaway: The Ethics Matrix
Pick one AI tool you use weekly. Write a short paragraph identifying its primary benefit to you and its primary ethical concern (e.g., data privacy, bias, environmental impact). This is the fourth piece of your portfolio.