
AZJonnie

(1,537 posts)
1. Not sure ATM whether companies that host communications platforms can be held liable for the content
Sat Sep 13, 2025, 04:24 AM

However, it could well be worthwhile to put some public pressure on the company regardless. They have access to all conversations, since Discord is cloud-based and every message passes through their infrastructure, and the company does perform moderation.

Discord performs moderation both automatically and through user reports and human review to keep the platform safe. Violations that can lead to bans include:

Harassment, bullying, or threats: Any form of severe, repeated, or targeted harassment toward individuals or groups.

Hate speech: Language or symbols that degrade or dehumanize based on protected characteristics like race, ethnicity, gender, sexual orientation, religion, disability, etc.

Violent extremism: Promoting or glorifying violence, violent acts, or extremist groups.

Sexual content and exploitation: Posting, soliciting, or sharing sexually explicit content involving minors, non-consensual intimate media, or adult content that ignores required age restrictions.

Illegal actions: Activities like sharing child sexual abuse material (CSAM), doxing (sharing private information without consent), organizing or encouraging illegal acts.

Self-harm and suicide: Encouraging or glorifying self-harm or suicide.

Spam and scams: Repeated unwanted messages, scams, or misleading content.

Ban evasion: Circumventing bans using new accounts.

How moderation works:
Discord uses automated tools (such as keyword filters and moderation bots) and manual review of suspicious content; a rough sketch of what such a filter can look like follows this list.
Users and server moderators can report violations.
For severe issues (threats, CSAM, terrorism), Discord will ban users or servers and may report them to authorities.
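To make the "automated tools" point concrete, here is a minimal sketch of the kind of keyword-filter bot an individual server might run, written with the discord.py library. The blocked-term list, bot token placeholder, and warning message are illustrative assumptions on my part; this is not Discord's actual trust-and-safety system, which is far more sophisticated and runs on their side.

```python
import discord

# Hypothetical term list -- a real filter would be far more nuanced
BLOCKED_TERMS = {"badword1", "badword2"}

intents = discord.Intents.default()
intents.message_content = True  # privileged intent needed to read message text

client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message):
    # Ignore messages from bots, including this one
    if message.author.bot:
        return
    # Crude substring match; real systems layer on classifiers, regexes, etc.
    if any(term in message.content.lower() for term in BLOCKED_TERMS):
        await message.delete()
        await message.channel.send(
            f"{message.author.mention}, that message broke the server rules."
        )

client.run("YOUR_BOT_TOKEN")  # placeholder -- supply a real bot token
```

Note the obvious weakness, which is exactly the evasion problem discussed below: a static word list is trivially defeated by misspellings, code words, and screenshots, which is part of why even heavily moderated platforms miss so much.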

Summary:
The “ban hammer” falls for serious violations such as hate speech, harassment, exploitation, threats, or breaking laws. Discord’s Community Guidelines detail what’s not allowed and set the standard for bans. Server-specific rules can also trigger bans within a particular community.


One problem, though, is that members of these groups figure out how to avoid bans: they learn what not to say. Also, if Discord were known to moderate too heavy-handedly, radicals would just decamp to other platforms that don't.

All in all, Discord may seem emblematic, but it's more a symptom of a broader problem: we allow exclusive social groups to exist and converse with one another largely unfettered. From a 1A perspective, regulating it tightly enough to make a real difference in safety may be a tall order. Like I say, they already moderate, and still this horrible shit happens.

Not sure what the answer here is, short of banning all such software outright. But that would be monopolistic unless Facebook Groups were banned too, and it's not like Facebook has solved the same problems either.
