What teens see in closed online spaces like the Discord app
- Written by Brianna Dym, Ph.D. Candidate in Information Science, University of Colorado Boulder
Ever since the earliest days of the internet in the 1980s, getting online has meant getting involved in a community. Initially, there were dial-up chat servers[1], email lists and text-based discussion groups[2] focused on specific interests[3].
Since the early 2000s, mass-appeal social media platforms have collected these small spaces into bigger ones, letting people find their own little corners of the internet while keeping those corners interconnected. This allows social media sites to suggest new spaces users might join[4], whether it’s a local neighborhood discussion or a group with the same hobby, and to sell specifically targeted advertising[5]. But the small-group niche community is making a comeback among adults, and among kids and teens[6].
When Discord[7] was initially released in 2015[8], many video games did not allow players to talk to each other using live voice chat while playing the game – or required them to pay premium prices[9] to do so. Discord was an app that enabled real-time voice and text chatting[10], so friends could team up to conquer an obstacle or just chat while exploring a game world. People still use Discord for that, but these days most of the activity on the service happens in communities wider than a couple of friends meeting up to play.
Examining Discord is part of my research[11] into how scholars, developers and policymakers might design and maintain healthy online spaces.
A little bit old school
Discord first came onto my radar in 2017 when an acquaintance asked me to join a writers’ support group. Discord users can set up their own communities, called servers, with shareable links to join and choices about whether the server is public or private.
The writers’ group server felt like an old-school chat room, but with multiple channels segmenting out the different conversations folks were having. It reminded me of descriptions of early online chat[12] and forum-based communities that hosted lengthy conversations between people all over the world.
The people in the writers’ server quickly realized that a few of our community members were teenagers under the age of 18. While the server owner had kept the space invite-only, he avoided saying “no” to anyone who requested access. It was supposed to be a supportive community for people working on writing projects, after all. Why would he want to exclude anyone?
He didn’t want to kick the teens out, but was able to make some adjustments using Discord’s server moderation system. Community members had to disclose their age, and anyone under 18 was given a special “role” that tagged them as a minor. That role prevented them from accessing channels that we marked as “not safe for work,” or “NSFW.” Some of the writers were working on explicit romance novels and didn’t want to solicit feedback from teenagers. And sometimes, adults just wanted to have their own space.
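To give a sense of the mechanics, the sketch below shows roughly how that kind of rule can be automated with discord.py, a widely used community library for scripting Discord bots. This is a minimal illustration under assumptions of my own: the “Minor” role name is hypothetical, and a server owner can set up the same restrictions by hand through Discord’s settings without writing any code.

```python
import discord

intents = discord.Intents.default()
client = discord.Client(intents=intents)

async def restrict_minors(guild: discord.Guild):
    """Hide every NSFW-marked channel from members holding a 'Minor' role."""
    # Reuse the role if it already exists; otherwise create it.
    minor_role = discord.utils.get(guild.roles, name="Minor")  # hypothetical name
    if minor_role is None:
        minor_role = await guild.create_role(name="Minor")

    # Add a permission overwrite so the role cannot view NSFW channels.
    for channel in guild.text_channels:
        if channel.is_nsfw():
            await channel.set_permissions(minor_role, view_channel=False)

@client.event
async def on_ready():
    # Apply the restriction to every server the bot has joined.
    for guild in client.guilds:
        await restrict_minors(guild)

# client.run("BOT_TOKEN")  # a real bot token would go here
```

Members given the role simply never see the restricted channels; everyone else’s view of the server is unchanged.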
While we took care to construct an online space that was safe for teens, there are still dangers present in an app like Discord. The platform has been criticized for lacking parental controls[13]. The terms of service state that no one under 13 should sign up for Discord, but many young people[14] use the platform regardless.
Additionally, there are people who have used Discord to organize and encourage hateful rhetoric, including neo-Nazi ideologies[15]. Others have used the platform to traffic child pornography[16].
However, Discord does maintain that these sorts of activities are illegal and unwelcome on its platform, and the company regularly bans servers[17] and users[18] it says perpetuate harm.
Options for safety
Every Discord server I’ve joined since then has had some safeguard in place to keep young people away from inappropriate content. Whether it’s age-restricted channels or simply refusing to allow minors to join certain servers, the Discord communities I’m in share a heightened concern for keeping young people on the internet safe.
This does not mean that every Discord server will be safe at all times for its members, however. Parents should still take the time to talk with their kids about what they’re doing in their online spaces. Even something as innocuous as the popular children’s gaming environment Roblox[19] can turn bad in the wrong setting.
And while the servers I’ve been involved in have been managed with care, not all Discord servers are regulated this way. In addition to servers lacking uniform regulation, account owners are able to lie about their age and identity when signing up for an account. And there are new ways for users to misbehave or annoy others on Discord, like spamming loud and inappropriate audio[20].
But, as with other modern social media platforms, there are safeguards to help administrators keep online communities safe for young people, if they choose to use them. Server administrators can label an entire server “NSFW,” going beyond single-channel labels and locking minors’ accounts out of the entire community. And if they don’t, company officials can do it themselves[21]. NSFW-labeled servers are not visible to anyone accessing Discord on an iOS device, even adults. Additionally, Discord runs a Moderator Academy[22] to train volunteer moderators who can appropriately handle a wide range of situations.
Stronger controls
Unlike many other currently popular social media platforms, Discord often functions as a set of closed communities: servers that require invitations to join. There are also large open servers with millions of users, but Discord’s[23] design integrates content moderation tools to maintain order.
For example, a server creator has tight control[24] over who has access to what, and over which permissions each server member has to send, delete or manage messages. In addition, Discord allows community members to add automations[25] to a server that continuously monitor activity to enforce moderation standards.
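As a concrete illustration, here is a hedged sketch of what one of those automations might look like as a custom bot, again written with the discord.py library. The banned-word list is a hypothetical placeholder; in practice, communities often rely on Discord’s built-in moderation features or established third-party bots rather than rolling their own.

```python
import discord

intents = discord.Intents.default()
intents.message_content = True  # required to read message text in discord.py 2.x
client = discord.Client(intents=intents)

BANNED_WORDS = {"spamlink.example", "slur-of-some-kind"}  # hypothetical placeholders

@client.event
async def on_message(message: discord.Message):
    # Ignore messages from bots, including this one, to avoid feedback loops.
    if message.author.bot:
        return
    # Delete any message containing a banned term and notify its author.
    if any(word in message.content.lower() for word in BANNED_WORDS):
        await message.delete()
        await message.channel.send(
            f"{message.author.mention}, your message was removed for breaking server rules."
        )

# client.run("BOT_TOKEN")  # token for a bot account invited to the server
```

A bot like this runs continuously alongside the conversation, which is what lets a handful of volunteer moderators keep up with a server of thousands.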
With these protections, people use servers to form tight-knit, closed spaces that are safe from chaotic public squares like Twitter and less visible to the wider online world. This can be positive, keeping spaces safer from bullies, trolls and disinformation spreaders. In my own research, young people have mentioned their Discord servers as the safe, private space[26] they have online, in contrast to messy public platforms.
However, moving online activity to more private spaces also means that those well-regulated, healthy communities are less discoverable by vulnerable groups[28] that might need them. For example, new fathers looking for social support[29] are sometimes more inclined to access it through open subreddits than through Facebook groups.
Discord’s servers are not the first closed communities on the internet[30]. They are, essentially, the same as old-school chat rooms, private blogs and curated mailing lists. They will have the same problems and opportunities as previous online communities.
Discussion about self-protection
In my view, the solution to this particular problem is not necessarily banning particular practices or regulating internet companies. Research into youth safety online[31] finds that government regulation aimed at protecting minors on social media rarely has the desired outcome[32], and more often disempowers and isolates young people instead.
Just as parents and other caring adults teach the kids in their lives to recognize dangerous situations in the physical world, talking about healthy online interactions can help young people protect themselves in the online world. Many youth-focused organizations, and many internet companies, offer internet safety information[33] aimed at kids of all ages.
Whenever young people hop onto the next technology fad, there will inevitably be panic[34] over whether adults, companies and society are keeping young people safe. What is most important in these situations is to remember that talking to young people about how they use those technologies, and about what to do in difficult situations, can be an effective way[35] to help them avoid serious harm online.
References
- ^ dial-up chat servers (www.theatlantic.com)
- ^ text-based discussion groups (usenetreviewz.com)
- ^ specific interests (cfiesler.medium.com)
- ^ suggest new spaces users might join (www.colorado.edu)
- ^ sell specifically targeted advertising (theconversation.com)
- ^ with kids and teens (www.nytimes.com)
- ^ Discord (discord.com)
- ^ in 2015 (discord.fandom.com)
- ^ pay premium prices (techcrunch.com)
- ^ enabled real-time voice and text chatting (techcrunch.com)
- ^ my research (scholar.google.com)
- ^ early online chat (www.theatlantic.com)
- ^ parental controls (www.wsj.com)
- ^ many young people (www.nytimes.com)
- ^ neo-Nazi ideologies (www.theverge.com)
- ^ traffic child pornography (www.justice.gov)
- ^ regularly bans servers (www.theverge.com)
- ^ and users (www.engadget.com)
- ^ Roblox (www.wired.com)
- ^ spamming loud and inappropriate audio (doi.org)
- ^ company officials can do it themselves (support.discord.com)
- ^ Moderator Academy (discord.com)
- ^ Discord (support.discord.com)
- ^ a server creator has tight control (discord.com)
- ^ automations (discord.com)
- ^ safe, private space (dx.doi.org)
- ^ Sign up for The Conversation’s daily newsletter (memberservices.theconversation.com)
- ^ less discoverable for vulnerable groups (medium.com)
- ^ new fathers looking for social support (doi.org)
- ^ closed communities on the internet (slate.com)
- ^ youth safety online (mitpress.mit.edu)
- ^ rarely has the desired outcome (blogs.lse.ac.uk)
- ^ internet safety information (beinternetawesome.withgoogle.com)
- ^ inevitably be panic (theconversation.com)
- ^ effective way (doi.org)
Authors: Brianna Dym, Ph.D. Candidate in Information Science, University of Colorado Boulder
Read more https://theconversation.com/what-teens-see-in-closed-online-spaces-like-the-discord-app-178741