
Ugandan Student Shares Moving Testimony On Cyberbullying at UN Human Rights Council

Santa Rose Mary Addressing the Human Rights Council. (Snapshot from UN News Website)

Santa Rose Mary, a Ugandan student, delivered a moving testimony about her harrowing experience with cyberbullying during a panel discussion on the issue at the UN Human Rights Council. The fifteen-year-old child rights advocate and member of the Children’s Advisory Council was granted a five-minute platform to address the Council via Zoom on behalf of the children of the world.

She spoke passionately about the escalating concerns of the digital and social media era, emphasizing the devastating impact of personal information and intimate photos being shared online: “You can’t even face your own community or your own parents.” Rose Mary issued a solemn warning about the potential consequences, saying that such situations could push a child who feels unwanted in their community to contemplate taking their own life.

The student also highlighted the often-overlooked problem of cyberbullying directed at children with special needs, underscoring the importance of addressing this vulnerable group’s challenges in the digital age. She urged schools to include digital literacy in their syllabi to help children understand online safety, asked parents to teach their children about the dangers of cyberbullying, and encouraged her fellow children to report any case of cyberbullying they encounter or witness.

During Wednesday’s session, the other panelists echoed a grave concern: the dire consequences of cyberbullying, which include heightened anxiety, emotional turmoil and, tragically, even instances of child suicide. The speakers stressed the urgent need for more robust prevention measures, with a particular focus on the involvement and responsibility of major tech companies.

According to research by the UN Children’s Fund (UNICEF), a staggering 130 million students worldwide are affected by cyberbullying, a problem exacerbated by the widespread use of digital technologies. UNICEF estimates that one in three students aged 13 to 15 falls victim to cyberbullying. A research study conducted in 2021 offered significant insights into children’s online safety in Uganda; one of its pivotal findings was that 30 percent of the children interviewed had experienced abuse and threats online.

The predominant threats included exposure to inappropriate content, encounters with online predators, and cyberbullying, particularly on widely used social media platforms such as Facebook and WhatsApp. The report also shed light on a concerning issue: a pronounced lack of awareness and understanding among teachers, parents, and guardians about children’s safety in the digital landscape.

Deputy Human Rights Chief Nada Al-Nashif noted that, according to the Committee on the Elimination of Discrimination against Women, cyberbullying disproportionately affects girls, who are targeted twice as often as boys. She also cited World Health Organization data indicating that children subjected to bullying often skip school, perform poorly on tests, and suffer from sleep disturbances and psychosomatic pain.

Al-Nashif told the Council that the “complex” topic of cyberbullying lies at the intersection of human rights, digital and policy issues. “To get this right, we must adopt a holistic approach, and address root causes”, she said, underscoring that “central to this is the voice of children themselves”. She also stressed the “centrality and power of companies in the online space”, insisting on the responsibility of tech companies to provide adapted privacy tools and follow content moderation guidelines “in line with international human rights standards”.

A representative of Meta, Safety Policy Director Deepali Liberhan, took part in the discussion and spoke about the magnitude of the problem. In the third quarter of 2023 alone, she said, some 15 million pieces of content constituting bullying and harassment had been detected on Meta’s platforms Facebook and Instagram; most were proactively removed by Meta before even being reported.

Liberhan highlighted the company’s content moderation policies and the ways Meta enforces them on its platforms, partnering with experts to inform the action it takes and incorporating anti-bullying tools into the user experience.

At the conclusion of the session, panelist Philip Jaffé, a member of the Committee on the Rights of the Child, stressed the “collective” responsibility for the safety of children.

“We need to make children more aware of their rights and make States and other components of society more aware of their obligations to protect [them],” he insisted.

-URN
