"As the platform has scaled, Discord's trust and safety technologies have scaled and evolved with it," says a company representative. "As a result, violations have remained very low as a percentage of total messages."

Discord isn't entirely reactive, however. As Slate reported in 2018, the relative privacy afforded to Discord groups meant that hate speech and violent rhetoric could go unnoticed. It was somewhat big news at the time, and the servers pointed out by Slate and other publications in 20 have since been shut down. Discord also worked with the Southern Poverty Law Center to shut down other white nationalist servers and ban users.

By letting private groups manage themselves, Discord is able to keep something close to server culture alive, where the only mods are those chosen by the server owner.
In a blog post, the company said it received 52,400 user reports during the first three months of 2019. That comes to an average of 582 reports per day. According to Discord, if you don't count spam bots, only 0.0003 percent of Discord's userbase was banned after being reported in that three-month period, a statistic it presents as evidence that abuse is not widespread.
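Those figures are easy to sanity-check with some quick arithmetic. A back-of-the-envelope sketch in Python, assuming a roughly 90-day quarter (the exact day count is an assumption, not something Discord's post specifies):

```python
# Sanity-check the reported moderation figures.
reports_q1_2019 = 52_400   # user reports, first three months of 2019
days_in_quarter = 90       # assumption: ~90 days in January-March

reports_per_day = reports_q1_2019 / days_in_quarter
print(round(reports_per_day))  # prints 582, matching the per-day average above
```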
Broadly speaking, Discord's moderation policy is reactive rather than proactive. Every chat message has a unique ID which users can report. If someone reports, say, a threatening DM, a member of the trust and safety team may view the reported message and decide what action to take. When a pattern of behavior is reported, they may view more than a single message. All access to message logs is itself logged, so abuse of the system would be discoverable. The ability to view messages rests solely with the trust and safety team, and all of Discord's trust and safety team members are employees. None of the work is outsourced.

Discord also notes on its support page that deleted messages are truly deleted. The team can't view them once they're gone, which on one hand allows abusers to remove evidence, but on the other gives normal users the ability to erase benign chat messages that they simply don't want sitting on a server.

Li says the team views itself as analogous to law enforcement: cops don't hang out in your house waiting for you to commit a crime, but will come to your house if someone reports one, the 'crime' in this case being a TOS violation or, sometimes, an actual crime. Discord won't intervene in petty personal disputes, which can be solved by one user blocking another or leaving a server where they aren't getting along. It's only when something serious has happened that it intervenes: targeted harassment, hate speech, threats to others or of self-harm (Discord may contact law enforcement in these cases), raiding, and so on.
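The workflow described above (unique message IDs, user-filed reports, reviewer access that is itself logged, and deletions that are final) can be sketched as a minimal data model. Everything here, including the class and method names, is a hypothetical illustration of that design, not Discord's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    message_id: int   # every chat message has a unique ID
    author: str
    content: str

@dataclass
class Report:
    message_id: int   # users report a message by its unique ID
    reporter: str
    reason: str

@dataclass
class ModerationSystem:
    """Hypothetical sketch of a reactive moderation pipeline."""
    messages: dict = field(default_factory=dict)      # message_id -> Message
    reports: list = field(default_factory=list)       # queue of user reports
    access_log: list = field(default_factory=list)    # viewing logs is itself logged

    def report(self, message_id: int, reporter: str, reason: str) -> None:
        # Nothing happens proactively; a report is what triggers review.
        self.reports.append(Report(message_id, reporter, reason))

    def review(self, message_id: int, reviewer: str) -> Message:
        # A trust-and-safety reviewer views a reported message; the access
        # is recorded, so misuse of the privilege would be discoverable.
        self.access_log.append(f"{reviewer} viewed message {message_id}")
        return self.messages[message_id]

    def delete(self, message_id: int) -> None:
        # Deleted messages are truly deleted: no one can view them afterward.
        self.messages.pop(message_id, None)
```

In this sketch, calling `review()` on a deleted message raises a `KeyError`, mirroring the claim that once a message is gone, even the trust and safety team can't see it.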