Regulating consent by age and digital exclusion

In March 2019, the government made a commitment to strengthen the Privacy Act 1988 (Privacy Act) by introducing a binding code of practice for social media and other platforms that trade in personal information online, and increasing penalties and enforcement measures.

By Justine Humphry and Mahli-Ann Butt

The Online Privacy Code, to be established under the Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021 (henceforth referred to as the OP Code), seeks to address increasing privacy concerns and issues of online harm arising from social media and other online activities.

The OP Code has a particular focus on vulnerable groups such as older Australians, younger Australians, people with English as a second language, marginalised groups and people with disabilities. The proposed code aims to address the privacy practices of online platforms that can be detrimental to children and vulnerable persons, including sharing data for advertising purposes, or engaging in harmful tracking, profiling or targeted marketing, noting the ‘wide range and volume of personal information that social media platforms handle’ (Attorney General’s Department, 2021, p. 5).

To protect the online privacy of vulnerable young people, the OP Code will require a parent’s or guardian’s verifiable consent for young people under the age of 16 to use online services, platforms and social media that collect data from users. The OP Code will require a range of new protections in relation to children and other groups of people who are considered not capable of making their own privacy decisions, including:

  • stricter requirements for how social media companies handle children’s personal information;
  • taking reasonable steps to verify the age of individuals; and
  • obtaining parental or guardian consent before collecting, using or disclosing the personal information of a child who is under the age of 16, and taking all reasonable steps to verify that consent (Ibid., pp. 15-16).

However, despite the government’s intention to protect young people in Australia online, the proposed OP Code may exacerbate exclusion and create new barriers to accessing social media and other online services for children and young people who are already among the most socially and digitally marginalised in the nation.

Exceptions to consent

It is clear there are a number of scenarios where parental or guardian consent is not appropriate or is too difficult to obtain for young users of social and online media services, such that requiring it would deprive them of information and support online. Examples include girls seeking health or sexual information who are unable or feel uncomfortable having these conversations with parents, or LGBTIQA+ youth who want to connect with like-minded peers but who are living in an environment that is not conducive to the formation of their sexual identity (Holt, 2011).

There are other situations and environments in which it is not feasible or safe for children and young people to obtain consent from their parents or guardians: for example, children in institutional care, in detention or in a violent domestic environment, or young people experiencing homelessness, for whom access to online information and networked peer support is essential for health and safety (Humphry & Pihl, 2016; Rice & Barman-Adhikari, 2014).

In these instances, how will young people be able to obtain consent or seek a legitimate exemption from consent for their own health, safety and well-being? There need to be defined parameters for legitimate exemptions from consent, and an easy-to-understand, easy-to-use mechanism for young people under the age of 16 to exercise independent consent in these and other legitimate circumstances.

Gatekeeping through consent

Beyond these specific cases and circumstances, there is the wider concern that introducing an overly restrictive consent system will lead to exclusion from online services and other digital benefits. There is broad agreement that children and young people’s ability to participate in social media is essential for identity formation, building friendships and peer groups, developing skills and connections, and for leisure and entertainment (Livingstone, Mascheroni, & Staksrud, 2017; Milosevic, 2017).

The urge to protect children from real and perceived dangers can inadvertently lead to an over-reliance on ‘gatekeeping through consent’, shifting the onus away from cultivating children and young people’s independence and digital skills and from developing a less abusive social media environment. It may also contribute to an increase in parents’ use of ‘restrictive mediation’ techniques (Smahel et al., 2020) rather than fostering open dialogue about potential online harms.

While the research team strongly supports the development of a binding code of practice for social media and other online platforms that trade in personal information, there is a real risk that an overly restrictive consent and age verification system will reduce young people’s independence and introduce new kinds of harm in the form of exclusion from services they rely on.

Digital literacies around consent and age verification

At this stage it is unclear how the Code’s requirements in relation to consent, age verification or verification of consent will be technologically implemented and monitored to ensure children and young people are adequately supported in their social and online media experiences. What we do know is that any such systems will demand new kinds of digital skills of children and parents/carers alike.

The current systems and processes through which social media platforms obtain permission from users for data collection are complex and require specialist expertise to enable seamless integration. From initial account registration, through to enabling comments to appear from a Facebook signed-in state on a blog or website, or connecting a personal tracking device to a social media platform, there is a wide range of systems and processes that platforms use to connect and communicate with each other, and to obtain upfront consent and ongoing permission for changes in what user data is collected or how.

While the new digital skills required by these systems and processes apply to all social media users, there is a risk that they will create new barriers to accessing and using social media for some groups who already have lower levels of digital ability and literacy, as demonstrated in the scores of the annual Australian Digital Inclusion Index (Thomas et al., 2021). There are widely varying levels of maturity and cognitive capacity within different age groups – this also applies to digital skill level, which affects an individual’s ability to consent to and negotiate these new systems (see, for example, the EU Kids Online survey, which found that younger children tend to lack digital skills and confidence; Livingstone et al., 2011).

In many instances, the OP Code assumes the perspective of an imagined family and its use of social and online media services. This will not be the case in many applications of the Code, where users come from a variety of cultural, religious, ethnic and gendered backgrounds. While the consultation is open to a wide range of stakeholders interested in the increased safety and protection of Australian social and online media services, it has become obvious that the voice of users themselves (young Australians) is missing from the discussion. The voices of parents and carers of those young people are also not apparent. We would ask the Attorney General’s Department not only to engage in industry consultation in the first instance, but to consult extensively at all stages with those who will be most affected by these legislative changes.

Read our research team’s full response to the OP Bill.

References

Attorney General’s Department. (2021). Enhancing online privacy and other measures. Early Assessment – Regulation Impact Statement. https://consultations.ag.gov.au/rights-and-protections/online-privacy-bill-exposure-draft/user_uploads/online-privacy-bill-regulation-impact-statement.pdf

Holt, D. B. (2011). LGBTIQ teens-plugged in and unfiltered: how internet filtering impairs construction of online communities, identity formation, and access to health information. Serving LGBTIQ library and archives users: Essays on outreach, service, collections and access, 266-277.

Humphry, J., & Pihl, K. (2016). Making connections: Young people, homelessness and digital access in the city. Sydney: University of Western Sydney.

Livingstone, S., Haddon, L., Görzig, A., & Ólafsson, K. (2011). Risk and safety on the internet: The perspective of European children. Full findings from the EU Kids Online survey of 9-16 year olds and their parents. EU Kids Online, LSE. http://eprints.lse.ac.uk/33731/

Rice, E., & Barman-Adhikari, A. (2014). Internet and social media use as a resource among homeless youth. Journal of Computer-Mediated Communication, 19(2), 232-247.

Smahel, D., Machackova, H., Mascheroni, G., Dedkova, L., Staksrud, E., Ólafsson, K., Livingstone, S., & Hasebrink, U. (2020). EU Kids Online 2020: Survey results from 19 countries. EU Kids Online. doi:10.21953/lse.47fdeqj01ofo

Thomas, J., Barraket, J., Parkinson, S., Wilson, C., Holcombe-James, I., Kennedy, J., Mannell, K., & Brydon, A. (2021). Australian Digital Inclusion Index: 2021. Melbourne: RMIT, Swinburne University of Technology, and Telstra.