Top headlines

Lead story

Bias in artificial intelligence algorithms has been in the news in recent years, particularly algorithms that are biased against women and people of color. Facial recognition algorithms have been especially unfair to Black women. But it’s not a new problem.

Computer scientist John MacCormick recalls creating a head-tracking algorithm 25 years ago when he was a Ph.D. student. When the time came to demo the algorithm, he had the shocking realization that it was racially biased.

As Big Tech rushes headlong into a new era of powerful AI systems, MacCormick sees the same mistakes cropping up again and again. It boils down to who is in the room, what gets prioritized and how hard it is to spot bias lurking in the numbers.

[Sign up here for our topic-specific weekly emails.]

Eric Smalley

Science + Technology Editor

Facial recognition software misidentifies Black women more than other people. JLco - Ana Suanes/iStock via Getty Images

I unintentionally created a biased AI algorithm 25 years ago – tech companies are still making the same mistake

John MacCormick, Dickinson College

One researcher’s experience from a quarter-century ago shows why bias in AI remains a problem – and why the solution isn’t a simple technical fix.

Science + Technology

  • Memories may be stored in the membranes of your neurons

    John Katsaras, University of Tennessee; Charles Patrick Collier, University of Tennessee; Dima Bolmatov, University of Tennessee

    Pinpointing where memories are stored in the brain and how they are transmitted could provide new targets to treat neurological diseases and serve as models for neuromorphic computing.


Arts + Culture

  • The unbearable allure of cringe

    Carly Drake, North Central College; Anuja Anil Pradhan, University of Southern Denmark

    What does secondhand embarrassment say about your own anxieties and biases?


Today's graphic 📈