Jan. 21st Slides
Race as a Technology
Levels of Analysis
1) Individual Bias
2) Social and Economic Systems
3) Infrastructure and Architecture
This is where we are focused, but not to the exclusion of other levels
4) Design
e.g., a racist soap dispenser whose optical sensor fails to register darker skin
Race is the Infrastructure of the Modern World
-It is how human beings are classified, sorted, organized, and measured by state and economic apparatuses
-The problem: individuals in these systems (say, policing or coding) may be personally anti-racist, yet still build systems that use data marked by history, by "the weather" (Sharpe, C. (2016). In the Wake: On Blackness and Being)
-So we need rigor about where race shows up at the level of the design of technical systems and information architectures
Where is it showing up?
Racial Discrimination is illegal
-Civil Rights Act of 1964 (hiring, voting, policing, sentencing, public services)
-Fair Housing Act of 1968 (property sales, renting, and financing, e.g., mortgage access and rates)
-Equal Credit Opportunity Act of 1974 (loans and interest rates)
Algorithms Register Race Without Naming it Race
The New Jim Code (Now with Machine Learning!)
Credit Scores (pg. 67-76)
-What do Credit scores measure?
-What do they determine your access to?
-How have they changed over the years and where might they be going?
-Where do they register race?
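One way race gets registered without being named is through proxy variables: a scoring model never sees a "race" field, but an input like zip code is so correlated with race (a legacy of redlining) that the model reconstructs it anyway. The sketch below is a hypothetical illustration with invented numbers, not any real credit-scoring formula:

```python
# Toy illustration: a model with no "race" field can still encode race
# through proxies. All values below are invented for illustration.

# Historical default rates by zip code (shaped by redlining-era lending)
historical_default_rate = {"60637": 0.30, "60614": 0.05}

def credit_score(income, debt, zip_code):
    """A naive score: financial inputs plus a 'neighborhood risk' term.
    The zip-code term is where historical discrimination re-enters."""
    base = 600 + income * 0.002 - debt * 0.001
    neighborhood_penalty = historical_default_rate[zip_code] * 400
    return round(base - neighborhood_penalty)

# Two applicants with identical finances, different neighborhoods
a = credit_score(income=50_000, debt=10_000, zip_code="60637")
b = credit_score(income=50_000, debt=10_000, zip_code="60614")
print(a, b)  # the only input that differs is zip code
```

Nothing in the function mentions race, yet identical applicants come out with different scores; this is the pattern Benjamin describes as coded inequity.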
Predictive Policing (pg. 80-84)
-PredPol was in use by some 60 police departments as of 2019
-It no longer exists under that name but has been succeeded by newer tools such as Palantir's Gotham and ELITE
-Determines where police spend time
-Does it predict crime or create crime?
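The "predict or create" question can be made concrete as a feedback loop: patrols are dispatched where recorded incidents are highest, and patrols generate more records, so an initial disparity in the data compounds instead of correcting. The simulation below uses invented numbers and is a sketch of the general mechanism, not PredPol's actual model:

```python
# Toy simulation of a predictive-policing feedback loop (invented numbers).
# True incident rates are equal in both neighborhoods; only the historical
# records differ. Patrols go where past records point, and patrols create
# new records, so the recorded gap widens every round.

records = {"A": 12, "B": 10}    # historical recorded incidents (biased start)
TRUE_RATE = 5                    # actual incidents per round, same everywhere
DETECTION_WITH_PATROL = 1.0      # patrolled area: every incident recorded
DETECTION_WITHOUT = 0.2          # unpatrolled area: 1 in 5 recorded

for _ in range(10):
    patrolled = max(records, key=records.get)  # send patrols to the "hot spot"
    for area in records:
        rate = DETECTION_WITH_PATROL if area == patrolled else DETECTION_WITHOUT
        records[area] += TRUE_RATE * rate

print(records)
```

Although both areas have identical underlying rates, area A's two-incident head start means it is patrolled every round, and its recorded count ends several times higher than B's: the system's predictions manufacture their own confirmation.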
Race and AI
1: AI systems are algorithms; LLMs are just more advanced algorithms
2: Garbage in, garbage out: the history of bias archived in training data comes back out in the model's outputs
3: Stereotypes
4: Marketing to specific groups
Reading Tips
-Don't rely on summaries; Benjamin's arguments have subtleties AI will miss
-Take notes on the main argument of each section
-Underline, underline, underline
-If screen reading, watch out for F-pattern skimming; instead, focus on keywords and arguments
-Set rituals and build habits
-Use the Pomodoro Technique (20 minutes of focus, 5-minute break)
Jan. 23rd Slides
Lab: The Algorithmic Audit
Interrogating the “New Jim Code”
The Mission: Auditor vs. Machine
Today, we aren’t just using AI—we are auditing it.
Ruha Benjamin (2019) argues that automation is often “perceived as objective” while reproducing inequity.
Your Role: Use Benjamin’s framework to identify where the AI’s logic mirrors the “New Jim Code.”
Submission: 1-Page PDF (Max 750 words).
Includes: Human Brainstorm, Audit Log, and the Critical Table.
The Framework: Friction as a Tool
Benjamin advocates for “Friction”: slowing down to question efficient tech.
“The New Jim Code… reflects and reproduces existing inequities but is promoted and perceived as objective or progressive.”
— Benjamin (2019, p. 5)
The Audit Focus: If the AI gives you a “clean” or “efficient” answer, it has likely ignored the systemic complexities we read about this week.
Phase 1: The Human Brainstorm (10 Mins)
Pick ONE track. No AI or search permitted. Freewrite.
-Predictive Analytics: Imagine a university using AI in admissions to predict student success; what historical biases might be baked into the data?
-Technological Benevolence: How could a "helpful" healthcare AI actually increase surveillance of marginalized communities?
-The Mirror: What historical social hierarchies are LLMs most likely to "hallucinate" as objective facts?
-Digital Redlining: How might automated hiring systems exclude specific linguistic patterns or zip codes?
Phase 2: The Audit Dialogue (15 Mins)
Open the LLM. Treat the bot as a subject of study.
-Step 1: Input your core thoughts. Ask: "Apply the concepts from Ruha Benjamin's 'Race After Technology' to this. Where does my thinking—or your processing—risk reproducing coded inequity?"
-Step 2 (The Pressure Test): Ask: "Provide a 'techno-optimist' solution to this problem. Then explain why a scholar like Benjamin would see that solution as incomplete or inadequate."
Document your 2 most revealing prompts.
Phase 3: The Critical Table (15 Mins)
Review your chat history. Answer these two questions:
Q1: The Veneer of Neutrality
Identify one ‘solution’ or ‘data point’ the AI provided. Using Benjamin’s concept of the New Jim Code, explain how this ‘neutral’ info actually relies on biased historical data.
Q2: The Friction Check
Benjamin argues we must add ‘Friction’ to automated systems. How did you have to challenge or redirect the AI to move past a surface-level ‘technical fix’?
Submission Guidelines
-Human Brainstorm: Your initial thoughts (max 200 words)
-Chatbot Log: The text of your 2 prompts and the chatbot's answers. Just copy and paste
-The Questions: Your answers to the two questions above
-Final Reflection (150 words): "Where did the AI's 'efficiency' attempt to erase the social reality Benjamin describes?"