Digital Colonialism

Medium | 23.12.2025 18:39

Murat Durmus (CEO @AISOMA_AG)

1 min read


When most AI is trained on Western data, it not only inherits Western perspectives but also mistakes them for universal truths.

The consequence is subtle but profound: a machine that confidently misinterprets the world outside its training set. An African patient becomes a statistical anomaly. A rural dialect is labeled “noise.” A local custom is flagged as a bug in the system.

In trying to teach machines to see, we have trained them to see like us, but only like some of us.

Bias in AI isn’t just a glitch. It’s a form of digital colonialism: power imposed not by armies but encoded through algorithms. And the danger isn’t that machines will discriminate maliciously; it’s that they’ll do it efficiently, invisibly, and at scale.

To correct this, we don’t just need more diverse data; we also need to understand how to use it effectively.

We need new epistemologies. Ways of knowing that reflect the full spectrum of human life, not just the narrow bandwidth of Silicon Valley.

Thanks for reading