Latest stories from our network
Chicago's tiny not-for-profit taking on powerful institutions.
The history of surveillance is one of control. As monitoring technologies advance, one not-for-profit noticed a concerning rise in unethical police cell phone surveillance. Its objections led to new, stronger digital rights legislation.
Stingrays and cell phones: Is your pocket private?
Smartphones have improved our lives more than we could have imagined. We work on them, use them to take and store private photos, and they know where we are at any moment. But with advanced surveillance techniques, phones have also become a powerful way for law enforcement to observe and identify us, ethically or not.
One Chicago not-for-profit, Lucy Parsons Labs, is demanding that government agencies like the police and Immigration and Customs Enforcement (ICE) be more transparent about how and why they track people through their phones. Defenders of Digital episode three speaks with Lucy Parsons Labs' Executive Director Freddy Martinez about how law enforcement uses technology to covertly observe people, what it means for digital rights and how his team made US legal history.
The world of digital privacy is changing.
Algorithms are everywhere, but they are shaped by the beliefs of their developers. In episode two of the second season of Defenders of Digital, we learn about Homo Digitalis' work to expose algorithmic bias that impedes digital rights for millions. The first company they caught out might surprise you.
Ethical algorithm moderation
Algorithms can improve our experience online. But one not-for-profit is going beyond the code for the greater good. Founded in 2018, Homo Digitalis has over 100 members. They promote transparency in algorithmic programming and safeguards against discrimination by algorithm.
Because programmers – as humans – have biases, algorithms learn those biases from them. When we hand power over to the algorithm, it may erode digital rights and impinge on freedom of expression without our knowing.
Homo Digitalis has already called out one tech giant over its moderation process, a decision that could have impacted millions. Who was it? Find out in Defenders of Digital season two, episode two.