CDF webinar explores questions of parental consent, culturally specific factors
In the debate over how to regulate age-restricted content online, truths that hold across borders often bump up against regional challenges. In a webinar hosted by the Citizen Digital Foundation (CDF) titled “With Alice, Down the Rabbit Hole,” digital rights advocates, policy experts and parents explore different perspectives on online child safety and age assurance in India, with a particular eye on India’s Digital Personal Data Protection Act (DPDPA).
CDF says its research shows that “subversive and harmful content and threats from bad actors online pose a greater threat to children and vulnerable groups in India.” Many of the culprits, however, are familiar. Panel moderator Nidhi Sudhan says tech CEOs’ lip service on child safety is contradicted by the models and practices of big social media firms, and cites the case of Meta researching teenage brain function to increase youth engagement on its platforms.
YouTube is a rabbit hole of notable concern, where, according to CDF researcher Aditi Pillai, “engagement-driven algorithms remain a key issue.” But journalist and parent Dhanya Krishnakumar notes the difficulty of imposing age verification for digital content without causing additional harm to kids. Safety must be balanced against “the dilemma of too much protection from parents leading to children facing peer pressure and cyberbullying,” she says, and clearer, more open discussions are needed across the age spectrum to raise overall digital literacy.
Aparajita Bharti, co-founder of the Quantum Hub and Young Leaders for Active Citizenship (YLAC), points out that “the Indian ecosystem needs to be evaluated differently from the West. Putting the onus solely on parents in India won’t work – most parents lack the knowledge and resources to ensure child safety online.” Furthermore, she says, it’s “important to balance access to the Internet, which provides economic opportunities, with ensuring online safety for children.”
Consistent digital regulation is a tall order for a country with 28 states, 8 union territories and more than 1.4 billion people. Arnika Singh, co-founder of Social & Media Matters, says that because “India changes every 2 kilometers,” one-size-fits-all policies are inadequate for addressing the diverse needs of India’s rich socio-cultural and geographical makeup.
One key point is the need for “techno-legal solutions that allow parental control over algorithmic behaviors on platforms,” but also the need to consider where accountability lands when kids make mistakes. Nivedita Krishnan, director of law firm Pacta, says the DPDPA provision obliging all data fiduciaries to obtain verifiable parental consent before processing a child’s personal data could inadvertently hold parents accountable for any illegal activity a child carries out online. Expecting parents to know everything their kids are doing online, she says, is unrealistic and places a significant burden on them.
Parents, in effect, are being stretched across an accountability gap, a burden that should be shared by the platforms that deliver harmful content to kids and better policed by governments. “The platforms’ accountability is abysmally low at this point in time,” says Chitra Iyer, co-founder and CEO of consultancy Space2Grow.
Arnika Singh emphasizes the need to keep pressing tech platforms to prioritize user safety over profit, and for context-specific content moderation attuned to India’s diverse cultural landscape. But ultimately, the DPDPA needs more teeth. Having been down the rabbit hole, Alice concludes that the law lacks specific audit and impact measurement mechanisms, and that international models should serve as a benchmark for India’s laws.
Article Credit: biometricupdate