Day 27
Did You Know?
Sociology isn’t just relevant to AI—it might be one of the most important tools we have to understand it. AI is everywhere right now—your playlists, your TikTok feed, even your homework tools.
At its core, AI runs on data. Sounds neutral, right? It’s not. That data comes from real people living in a world shaped by inequality. So when AI learns from that data, it can end up learning bias too.
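Here is a toy sketch of that idea. Everything in it is invented for illustration: a made-up "hiring history" where past decisions favored one group, and a deliberately naive model that just predicts the majority past outcome for each group. The point is only to show the mechanism, that learning from skewed data reproduces the skew.

```python
from collections import defaultdict

# Hypothetical historical hiring data: (applicant_group, was_hired).
# Past decisions in this invented dataset favored group "A".
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", True),
]

# A naive "model": tally past outcomes per group...
counts = defaultdict(lambda: [0, 0])  # group -> [hired, not_hired]
for group, hired in history:
    counts[group][0 if hired else 1] += 1

# ...and predict whatever the majority past outcome was.
def predict(group):
    hired, not_hired = counts[group]
    return hired > not_hired

print(predict("A"))  # True  -- favored, because the past data favored A
print(predict("B"))  # False -- disadvantaged, for the same reason
```

No one told this model to discriminate; the bias lives entirely in the data it learned from, which is exactly the dynamic the sociologists are describing.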
One sociologist, Kelly Joyce, puts it this way: if we want AI that’s fair, we need to understand how inequality is baked into the data it learns from. Otherwise, we’re just automating the same problems.
Another sociologist, Mike Zajko, points out something even more uncomfortable: a lot of AI systems sort people into categories—like race, gender, even emotions—without really understanding what those categories actually mean. It’s like trying to define someone’s entire identity based on surface-level patterns. That’s… not great.
AI isn’t just “doing its own thing.” Humans are making decisions at every step—what data to use, how to label it, what the system should prioritize. So when AI produces biased outcomes, it’s not just a tech problem—it’s a social one.
AI doesn’t just reflect society—it can reinforce it.
“A sociological understanding of data is important, given that an uncritical use of human data in AI sociotechnical systems will tend to reproduce, and perhaps even exacerbate, preexisting social inequalities,” said sociologist Susan Bell, quoted by Emily Storz in Drexel News. “Although companies that produce AI systems hide behind the claim that algorithms or platform users create racist, sexist outcomes, sociological scholarship illustrates how human decision making occurs at every step of the coding process.”