Speaking Up About Online Harm
In June 2025, the Government launched an inquiry to better understand the harm young New Zealanders encounter online and identify potential solutions.
Dear Em gave both written and oral submissions for the inquiry to help uplift young people’s voices on this important issue that directly impacts them. Submissions are an important part of our democracy, and allow people to voice their experiences, opinions, and ideas to the Government. Anyone can make a submission, regardless of age or background, which means this can be a useful tool for young people to have their voices heard and represented.
We wanted to share our submissions with you to keep the conversation going about online harm, and also to remind every young person that your voice deserves to be heard, especially on issues that have a direct impact on you.
Check out what we have advocated for in our submissions below!
Dear Em’s Oral Submission
This submission was given to the Education and Workforce Committee on 8 September 2025. The recording has been sourced from the Education and Workforce Committee's publicly available Vimeo account; you can access the full recording through the link provided here.
Dear Em’s Written Submission
As a group who are passionate about uplifting young people’s voices, and creating a world where we can all be free from harm in its various forms, we want to share our insights into the harm young New Zealanders encounter online, and our thoughts on the roles that Government, business, and society could be playing to address these issues. We write this submission from our own lived experience of the harms young people face online and the failure of the current systems to adequately protect us.
What We Experience
Online harm is not an abstract or distant concept for us. It is a daily, personal, and often dismissed experience. Young people encounter bullying, exploitation, misogyny, racism, dangerously addictive content, misinformation, and more. Rather than being supported or educated, we find ourselves being told to ‘toughen up’ or ignore it, as though the harm is not real.
Society has been sending the message that many forms of online harm are normal. We hear phrases like “sticks and stones” or “just block them.” Our experiences are often downplayed or met with normalisation. Platforms meet reports with, “this doesn’t breach our standards,” even when content clearly promotes harmful behaviour, causes distress, or has the potential for other dangerous flow-on effects. As a result, young people are becoming desensitised. We are internalising the implication that our safety, identity, and well-being are not a priority.
Social media is often used as a mask to justify cruelty. We see catfishing used by predators, yes, but also by our own peers who fail to fully understand their own impact. The harms perpetrated online cannot be attributed to one particular group; rather, we see that harmful behaviour has been allowed to go on unchecked. This has created a digital landscape that many are able to take advantage of for their own gain or pleasure, with complete disregard for the victims. The culture of minimising experiences of harm and the apparent absence of any accountability for perpetrators is leaving us isolated and unprotected. Furthermore, we severely lack adequate resources and education to understand how to navigate these digital landscapes safely.
Limited education and regulation have enabled a harmful online culture that places greater responsibility on victims as opposed to perpetrators, and the harm that occurs is dismissed and normalised.
What Won’t Work
We have heard some suggest that to protect young people and reduce online harm, a ban on social media is a viable option. We strongly disagree.
Prohibition does not stop harm; it simply pushes it underground. Bans will increase vulnerability. Rangatahi are resourceful; if something is banned, we will find a workaround, often via riskier and less regulated platforms. Worse still, when harm inevitably occurs despite a ban, many young people will be afraid to seek help for fear of punishment and judgement.
This also doesn’t provide a sustainable solution, and we fear that this would provide grounds to reduce the already limited education we receive around online safety. Banning under-16s from social media would reduce opportunities to develop good digital citizenship skills and limit the spaces available for informed and supported discussion about how to navigate these spaces. In an increasingly digital world, even if young people are banned from social media, it is inevitable that it will still play a role in their lives at some stage. While we are still in school and have access to educators and support, this is the ideal time to foster a strong understanding of how to interact with digital spaces safely.
You cannot regulate safety through fear. Instead of cutting young people off from online spaces, we need to be equipped to navigate these spaces safely. Instead of placing an ill-fitting consequence on young people, platforms and perpetrators need to be held accountable.
What Will Have An Impact
1) Shifting Responsibility
Too often, responsibility is placed on those being harmed, rather than those doing the harm or allowing it to happen.
Telling youth that they “shouldn’t have been online” is like telling a survivor of sexual violence that they “shouldn’t have worn that skirt.” It is a victim-blaming mentality that has proved ineffective and damaging time and time again. Instead of punishing young people for existing online, we should be investing in holding those who cause harm accountable. This would include online users perpetrating harm, platforms that inadequately moderate and regulate online behaviour, and the systems and policies that allow harm to continue unchecked.
2) Education & Media Literacy
Young people need real, relevant education that reflects the online world we truly live in. Advice cannot be outdated; it must be current and informed by real experiences. When content is based on outdated experiences of the online world, it can perpetuate stigmas and unhelpful norms. An example of this is the concept of online 'stranger danger'. In previous generations, the idea of meeting new people online was considered dangerous and weird. But today, we see this completely normalised by dating apps and similar platforms, which encourage people to meet each other online. Approaching any education on this topic from an outdated perspective would be unhelpful and impose a sense of shame, reducing the likelihood of someone seeking help if one of these interactions turned into a harmful experience.
Some of the key topics we currently see a need for renewed education around include:
Pornography.
Misinformation & disinformation.
Recognising scams and exploitation.
The risks of artificial intelligence misuse.
Online addictions (e.g. porn, gambling, etc.).
How algorithms shape what content is shown.
Digital impersonation, catfishing, and identity theft.
Healthy and unhealthy online relationships and interactions.
How to create and maintain consent and boundaries in digital spaces.
How to be safe and respectful online (protecting both ourselves and others).
The reality is that many of these issues can impact us at any stage of our lives, and while we see young people being heavily impacted by online harm, we also see harm occurring at later stages in life. The online world also changes quickly, so we may not always be able to provide education on every specific risk or issue. This speaks to the need to use education at an early stage to foster digital literacy, build an understanding of how to navigate online issues, and ensure young people know how and when to reach out for help.
Education should be regularly updated and based on how youth are currently using the internet, not how adults assume they are. Educational tools and content should be youth-informed and co-created. As those directly experiencing these issues, it just makes sense to give us the space to contribute and help shape what needs to be taught, as this will ensure content is relevant and helpful.
3) Moderation & Accountability
Companies that own and run online platforms and the Government both play a key role in a safer online future.
Companies in this space benefit from the use of their platforms and make significant profits from our patronage. However, automated moderation, vague guidelines, and overseas-based staff do not reflect the values and realities of Aotearoa. Currently, there is little to no consequence for these companies when harm occurs through their platforms or tools.
There are key steps that the Government could be taking to place greater responsibility on these companies, and ensure that digital spaces are regulated in a way that prioritises user safety, and holds perpetrators accountable. Potential approaches we would recommend the Government explore include:
Requiring platforms to have NZ-based human moderators.
Mandating clearer and more culturally relevant guidelines and moderation standards.
Designing any new platforms with youth safety and wellbeing in mind (e.g. including opt-in features to support safer communication and block harmful content earlier).
Exploring platforms made specifically for rangatahi, where youth control the culture, not algorithms.
Exploring policy that regulates social media platforms in a similar way to how gambling establishments are regulated. This would acknowledge that these platforms inherently enable harm; if companies are to profit from that harm, they have a duty to return some of that profit to the community, either through taxes or through community grants targeted at the most impacted communities, like youth.
While some policy approaches, like regulation around moderation and platform guidelines, are fairly straightforward, other potential solutions may require a greater level of creativity. Different approaches that we have seen that inspire us include platforms with user-first safety features, like Bumble’s women-first messaging model, or youth-specific spaces like Yubo.
4) Empathy, Voice, & Connection
Above all else, young people must be heard.
We need safe spaces to speak, share experiences, and connect with others without judgement and shame. The current system needs to stop normalising damaging experiences and stop assuming youth have no understanding of the consequences of being online. Youth understand better than anyone else.
Meet young people where they are at. Bring empathy to the table. Let youth lead wherever possible.
On a practical level, this could look like:
Making submissions processes more accessible for young people.
Ensuring young people are aware that their input is valid, and doesn’t need to be presented in a formal manner to be heard.
Prioritising young people as experts in their own experiences and seeking out their insights as Government would any other key stakeholder.
Forming well-resourced and representative youth advisory panels where appropriate.
Exploring education frameworks which enable youth to introduce issues they see as relevant to learn about, while being supported by educators and supplied with digital literacy skills.
Facilitating safe spaces in the community for young people to engage in conversation about these topics, to both learn and provide insight.
Online harm cannot be solved without youth involvement. It is your job to ensure young people are supported to be involved.
In conclusion
Online harm is a systemic issue, not a youth problem. Solutions need to be evidence-based and led by those most affected: youth themselves.
Dear Em urges the Committee to:
Centre youth voice in all decisions.
Resist reactionary responses like bans and blanket restrictions.
Focus on accountability, education, and cultural shifts across society, government, and platform management and design.
We are not asking for a perfect digital landscape. We are asking for a safer one, and we are ready to help build it.
This submission was made during the open call for submissions, closing 30 July 2025. To find out what matters are currently open for submission, visit the Parliamentary Submissions Page here.
Keen to learn more about how your voice can be heard on the issues that matter to you? Check out these handy resources: