The Legal Black Box Conundrum
By Bart de Witte  |  Mar 29, 2022
Fears that intellectual property lawyers, policymakers, and lobbyists are determining the future of health are real. But HIPPO AI founder Bart de Witte believes artificial intelligence policymakers should focus more on open-source AI for healthcare, to create true transparency and trust.

BERLIN - In 2020, the Berlin-based non-governmental organization (NGO) AlgorithmWatch launched a project to monitor Meta's Instagram newsfeed algorithm. The rationale: only by understanding how platforms' algorithmic decisions affect society can we act to ensure they do not undermine our autonomy, freedom, and the common good.

Volunteers could install a browser add-on that scraped their Instagram newsfeeds. The data was sent to a database that the research team used to study how Instagram prioritizes content in a user's timeline - specifically, which images and videos the algorithms of Meta, formerly Facebook, favored.

In spring 2021, Meta threatened AlgorithmWatch with legal action if it continued its Instagram data-donation project. AlgorithmWatch reported that Meta had accused it of violating its terms of use, which prohibit the automated collection of data. Faced with Facebook's threat to take "more formal action," the NGO terminated the project.

Understanding the reasoning behind algorithmic decisions is a fundamental difficulty in the deployment of algorithms, and so-called Black Boxes make it harder still. Within the artificial intelligence (AI) industry, the Black Box problem mostly refers to the technical Black Box, and its prominence has driven research and development in explainable AI. That field is concerned with how AI models arrive at their results, with the goal of developing AI whose decisions can be traced back to their origins.
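To make the idea of explainable AI concrete, the sketch below illustrates permutation feature importance, one common post-hoc technique for probing a technical Black Box: shuffle one input feature at a time and measure how much the model's error grows. The toy model and data here are illustrative assumptions, not drawn from any system discussed in this article.

```python
import random

def model(features):
    # Stand-in for a black-box model: leans heavily on feature 0,
    # lightly on feature 1, and ignores feature 2 entirely.
    return 3.0 * features[0] + 1.0 * features[1] + 0.0 * features[2]

def mse(rows, targets):
    # Mean squared error of the model's predictions against targets.
    return sum((model(r) - t) ** 2 for r, t in zip(rows, targets)) / len(rows)

def permutation_importance(rows, targets, n_features, seed=0):
    # For each feature, shuffle its column and record how much the
    # error rises above the baseline; bigger rise = more important.
    rng = random.Random(seed)
    baseline = mse(rows, targets)
    importances = []
    for j in range(n_features):
        column = [r[j] for r in rows]
        rng.shuffle(column)
        permuted = [r[:j] + [v] + r[j + 1:] for r, v in zip(rows, column)]
        importances.append(mse(permuted, targets) - baseline)
    return importances

rng = random.Random(42)
rows = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(200)]
targets = [model(r) for r in rows]
imps = permutation_importance(rows, targets, 3)
# Expect feature 0 to rank highest and feature 2 to show no importance.
```

Note that this only explains model behavior from the outside; it does nothing against the legal Black Box discussed next, where the obstacle is access rather than interpretability.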

But as the case of AlgorithmWatch versus Meta demonstrates, the bigger obstacle to understanding algorithmic decision-making may not be the technical Black Box, but what the literature describes as the legal Black Box: the opacity that arises when an algorithm is proprietary software.
