Tracking Exposed is free and open-source software for independently analyzing the effects of the algorithmic diffusion of content online.
We developed a tool to enable ordinary users to collect evidence on their level of personalization.
But how can you understand your personalization if you cannot compare it? On your Personal page, you can also compare your experience on the platform with that of other users. In doing so, you will break out of your filter bubble by getting in touch with YouTube's recommendations made for someone else.
“Is YouTube suggesting those videos to me or to all of us?” This is a question you can answer quickly with our tool.
Our tool can be used for research purposes too.
There are many ways to make a systematic review or assessment of recommender systems such as YouTube's algorithm. It is better not to rely on official API data for that purpose: the lists of suggested videos you get there differ from the ones you get in real interaction with the platform. We found evidence of this in 2019 and 2020.
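A minimal sketch of this kind of comparison, assuming you have already collected two lists of video IDs (the IDs and the helper function below are hypothetical, not part of the Tracking Exposed codebase): one list declared as "related" by the official API, and one actually observed during real browsing.

```python
# Hedged sketch: measure the overlap between videos the official API declares
# as "related" and the videos actually observed in the browser.
# All video IDs here are hypothetical placeholders.

def overlap_stats(api_related, observed):
    """Return simple overlap statistics between two lists of video IDs."""
    api_set, obs_set = set(api_related), set(observed)
    shared = api_set & obs_set
    union = api_set | obs_set
    jaccard = len(shared) / len(union) if union else 0.0
    return {
        "shared": len(shared),              # declared AND observed
        "only_observed": len(obs_set - api_set),  # shown but never declared
        "jaccard": round(jaccard, 2),       # overall similarity in [0, 1]
    }

api_related = ["vid_a", "vid_b", "vid_c", "vid_d"]
observed = ["vid_b", "vid_e", "vid_f", "vid_g"]
print(overlap_stats(api_related, observed))
```

A low Jaccard score here would illustrate the gap between what the API declares and what watchers actually see.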
Red circles represent the videos declared by YouTube as related; in yellow and green, the videos actually suggested to watchers.
You can perform at least five types of algorithm auditing (Sandvig 2014). With Tracking Exposed you will be able to replicate at least three of them:
To perform the analysis you will have to apply some data-analysis methodology (step 5). We like to use general descriptive and statistical analysis (with Tableau and Python) and network analysis (with Gephi).
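As an illustration of the descriptive side of that analysis, here is a small sketch in plain Python. The records are hypothetical stand-ins for exported evidence; the field names are assumptions, not the actual Tracking Exposed export format.

```python
# Hedged sketch of a basic descriptive analysis: how often each video
# appears in the recommendations collected from different users.
# The records and field names below are hypothetical placeholders.
from collections import Counter

records = [
    {"user": "u1", "recommended": ["vid_a", "vid_b", "vid_c"]},
    {"user": "u2", "recommended": ["vid_a", "vid_d"]},
    {"user": "u3", "recommended": ["vid_a", "vid_b"]},
]

# Count how many times each video was recommended across all users.
counts = Counter(v for r in records for v in r["recommended"])

# Videos recommended to every user hint at non-personalized suggestions,
# i.e. an answer to "is YouTube suggesting this to me or to all of us?".
common_to_all = [v for v, c in counts.items() if c == len(records)]

print(counts.most_common(3))  # most frequently recommended videos
print(common_to_all)
```

The same per-video counts can also feed a network analysis, e.g. building a user-to-video graph for Gephi.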
The spread of online news consumption is increasing year after year, and algorithmic mediation is a crucial issue for our democracy.
Nevertheless, there is a tremendous gap between the public understanding of algorithms and their prevalence and importance in our lives. For instance, the majority of Facebook users in a recent study did not even know that Facebook uses an algorithm to filter the news stories they see.
Many anecdotal events have been reported about the effects of YouTube. Many scholars, for example, talk about YouTube's radicalization “rabbit hole”. Sometimes regular users have bad experiences with extremist videos while surfing the platform, especially at particular moments such as elections. But it remains very challenging to provide strong evidence for, and to quantify, phenomena like this.