Curate what you see

Youth Online Safety team

Why is it that you see some users’ posts in your Instagram feed and not others? Why does Netflix think you like romantic comedies when you really love sports dramas? These questions matter more and more as recommendation systems shape what we see on the online platforms we use every day.

While it may not be the end of the world if you have to sit through a Ryan Gosling film, it is far more concerning when violent or criminal content appears in a young person’s social feeds. Our eSafety Commission-funded research into Emerging Issues for Young People on Social Media made it clear that young people and their carers have little idea of how content curation works.

In fact, one of the recommendations was for humans to take over the moderation role on many platforms, rather than the bots that currently do a less than adequate job.

To help us better understand what curation on social media is, and how to improve what we see, we spoke with one of the emerging experts on the topic, Agata Stepnik, who has just completed her PhD at the University of Sydney.

In this interview, Agata not only outlines what curation is, but also how we can take a more active role in the process to make sure we see what we want to see. Agata notes that curation is not a bad thing in itself – it helps us make sense of our digital worlds – but it can become problematic when automated processes aren’t aware of the sorts of people they are presenting content to. She says simple actions, like using the like and dislike features or blocking content that isn’t OK, can have a huge impact on what we see.

To access the full interview, and for more information, please visit: https://youthonlinesafety.org/education/