Latest stories from our network
The world of digital privacy is changing.
Algorithms are everywhere, but they are trained on data shaped by their developers' beliefs. In episode two of our second season of Defenders of Digital, we learn about Homo Digitalis' work to expose algorithmic bias that impedes digital rights for millions. The first corporation they catch might surprise you.
Ethical algorithm moderation
Algorithms can improve our experience online. But one not-for-profit is going beyond the code for the greater good. Founded in 2018, Homo Digitalis has over 100 members. They promote transparency in algorithmic programming and safeguards against algorithmic discrimination.
Because programmers – as humans – have biases, algorithms inherit those biases. When we hand power over to the algorithm, it may erode digital rights and impinge on freedom of expression without us knowing.
Homo Digitalis has already called out one tech giant for its moderation process. It could have impacted millions. Who was it? Find out in Defenders of Digital season two, episode two.
Look into the future of pleasure, lust and connection
For years, scientists and developers have pushed to discover whether artificial intelligence can love humans, and vice versa. Welcome to the age of robot relationships.
AI loves me; AI loves me not

In Steven Spielberg's 2001 blockbuster science fiction film A.I. Artificial Intelligence, a highly advanced robot boy pursues the love of the human foster mother who abandoned him. At the time, it seemed far-fetched. Today, it looks more like reality.

Imagine Beyond: Build Me Somebody to Love looks at how AI is changing the way we think about love, lust, and human connection. Could you marry a robot? Will a hunk of metal look after you in your dying days? Let's see how human machines could become.
Eat turmeric, exercise regularly, sleep well – a few of many tips to increase your lifespan. But even if they work, they will probably only give you a handful of extra years. If you want to drastically prolong your time on earth, here's what you might do instead.